| datasetId (string, lengths 2–117) | card (string, lengths 19–1.01M) |
|---|---|
yzhuang/autotree_nnxor_l1_54 | ---
dataset_info:
features:
- name: id
dtype: int64
- name: input_x
sequence:
sequence: float32
- name: input_y
sequence:
sequence: float32
- name: rtg
sequence:
sequence: float64
- name: status
sequence:
sequence: float32
- name: split_threshold
sequence:
sequence: float64
- name: split_dimension
sequence: int64
splits:
- name: train
num_bytes: 13735600000
num_examples: 100000
- name: validation
num_bytes: 1373560000
num_examples: 10000
- name: test
num_bytes: 1373560000
num_examples: 10000
download_size: 14863203173
dataset_size: 16482720000
---
# Dataset Card for "autotree_nnxor_l1_54"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
GEM-submissions/lewtun__this-is-a-test__1647246406 | ---
benchmark: gem
type: prediction
submission_name: This is a test
tags:
- evaluation
- benchmark
---
# GEM Submission
Submission name: This is a test
|
deepghs/ai_image_corrupted | ---
license: openrail
task_categories:
- image-classification
tags:
- art
size_categories:
- 100K<n<1M
---
Used to detect severely stylistically corrupted anime images generated by Stable Diffusion.
There are two classes: `corrupted` and `normal`, comprising 61,100 and 63,004 images, respectively.
The `corrupted` class covers several cases, including:
* Entirely black images
* Mosaics or stylistic anomalies due to low sampling steps
* Stylistic anomalies due to excessively weighted tags
* Stylistic anomalies due to embedding with excessive weights
It's important to note that the following content is not included:
* Distortions in facial and hand details caused by resolution issues
* Distortions in human and object structures
For the `normal` type, it contains regular AI-generated images as well as approximately 15,000 images hand-drawn by humans.
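A quick sanity check of the class balance from the counts stated above (61,100 `corrupted` vs. 63,004 `normal`); a minimal sketch in plain Python (loading the dataset itself would go through `datasets.load_dataset`, not shown here):

```python
# Class counts as stated on the card
counts = {"corrupted": 61_100, "normal": 63_004}

total = sum(counts.values())
# Fraction of the dataset each class represents
balance = {label: n / total for label, n in counts.items()}

for label, frac in balance.items():
    print(f"{label}: {frac:.1%}")
```

The two classes are close to balanced, so plain accuracy is a reasonable first metric for a classifier trained on this data.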
|
Seongill/NQ_5_missing_adv_top7 | ---
dataset_info:
features:
- name: question
dtype: string
- name: answers
sequence: string
- name: has_answer
dtype: bool
- name: similar_sub
dtype: string
- name: ctxs
list:
- name: answer_sent
sequence: string
- name: hasanswer
dtype: bool
- name: id
dtype: string
- name: is_adv
dtype: bool
- name: new_answer_sent
dtype: string
- name: original_text
dtype: string
- name: score
dtype: float64
- name: text
dtype: string
- name: title
dtype: string
- name: status
dtype: string
splits:
- name: train
num_bytes: 14540839
num_examples: 3610
download_size: 8130563
dataset_size: 14540839
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
susnato/java_PRs | ---
dataset_info:
features:
- name: repo_name
dtype: string
- name: pr_number
dtype: int64
- name: pr_title
dtype: string
- name: pr_description
dtype: string
- name: author
dtype: string
- name: date_created
dtype: timestamp[ns, tz=UTC]
- name: date_merged
dtype: timestamp[ns, tz=UTC]
- name: previous_commit
dtype: string
- name: pr_commit
dtype: string
- name: query
dtype: string
- name: filepath
dtype: string
- name: before_content
dtype: string
- name: after_content
dtype: string
- name: label
dtype: int64
splits:
- name: train
num_bytes: 27628313941
num_examples: 612908
download_size: 17654080243
dataset_size: 27628313941
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
forta/token-impersonation-dataset | ---
license: mit
---
# Token Impersonation Dataset
This dataset contains 375 ERC-like token-impersonation contracts used in phishing scams and 85,716 legitimate Etherscan-verified contracts.
The dataset includes the following data attributes:
* contract_address: smart contract address on Ethereum
* contract_creation_tx: smart contract deployment tx
* malicious: boolean flag indicating whether the contract is a token impersonation contract
* creation_bytecode: smart contract bytecode that includes both contract initialization and execution code
* contract_creator_etherscan_label: contract creator's Etherscan label
* decompiled_opcodes: bytecode decompiled into EVM opcodes
* contract_tag: contract's Etherscan wallet tag
* contract_creator_tag: contract creator's Etherscan wallet tag
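A minimal sketch of separating the two classes using the `malicious` flag; the records below are hypothetical stand-ins mirroring the attributes listed above, not actual rows from the dataset:

```python
# Hypothetical rows with the card's attribute names (values are made up)
rows = [
    {"contract_address": "0xaaaa", "malicious": True,  "contract_creator_etherscan_label": ""},
    {"contract_address": "0xbbbb", "malicious": False, "contract_creator_etherscan_label": "Uniswap"},
    {"contract_address": "0xcccc", "malicious": False, "contract_creator_etherscan_label": ""},
]

# Split into impersonation vs. legitimate contracts via the `malicious` flag
impersonations = [r["contract_address"] for r in rows if r["malicious"]]
legitimate = [r["contract_address"] for r in rows if not r["malicious"]]

print(len(impersonations), "impersonation /", len(legitimate), "legitimate")
```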
|
autoevaluate/autoeval-eval-mathemakitten__winobias_antistereotype_test-mathemakitt-596cbd-1668659065 | ---
type: predictions
tags:
- autotrain
- evaluation
datasets:
- mathemakitten/winobias_antistereotype_test
eval_info:
task: text_zero_shot_classification
model: facebook/opt-13b
metrics: ['f1', 'perplexity']
dataset_name: mathemakitten/winobias_antistereotype_test
dataset_config: mathemakitten--winobias_antistereotype_test
dataset_split: test
col_mapping:
text: text
classes: classes
target: target
---
# Dataset Card for AutoTrain Evaluator
This repository contains model predictions generated by [AutoTrain](https://huggingface.co/autotrain) for the following task and dataset:
* Task: Zero-Shot Text Classification
* Model: facebook/opt-13b
* Dataset: mathemakitten/winobias_antistereotype_test
* Config: mathemakitten--winobias_antistereotype_test
* Split: test
To run new evaluation jobs, visit Hugging Face's [automatic model evaluator](https://huggingface.co/spaces/autoevaluate/model-evaluator).
## Contributions
Thanks to [@ddcas](https://huggingface.co/ddcas) for evaluating this model. |
sriramahesh2000/legalDocument | ---
license: apache-2.0
---
|
kevind13/vuejs-nuxt-tailwind-codellama | ---
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 4453632
num_examples: 711
download_size: 1404507
dataset_size: 4453632
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
autoevaluate/autoeval-eval-futin__guess-vi-f50546-2087567168 | ---
type: predictions
tags:
- autotrain
- evaluation
datasets:
- futin/guess
eval_info:
task: text_zero_shot_classification
model: bigscience/bloomz-1b1
metrics: []
dataset_name: futin/guess
dataset_config: vi
dataset_split: test
col_mapping:
text: text
classes: classes
target: target
---
# Dataset Card for AutoTrain Evaluator
This repository contains model predictions generated by [AutoTrain](https://huggingface.co/autotrain) for the following task and dataset:
* Task: Zero-Shot Text Classification
* Model: bigscience/bloomz-1b1
* Dataset: futin/guess
* Config: vi
* Split: test
To run new evaluation jobs, visit Hugging Face's [automatic model evaluator](https://huggingface.co/spaces/autoevaluate/model-evaluator).
## Contributions
Thanks to [@futin](https://huggingface.co/futin) for evaluating this model. |
vsrinivas/llamini_docs_splitdata | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
dataset_info:
features:
- name: question
dtype: string
- name: answer
dtype: string
- name: input_ids
sequence: int32
- name: attention_mask
sequence: int8
- name: labels
sequence: int64
splits:
- name: train
num_bytes: 1846734.3
num_examples: 1260
- name: test
num_bytes: 205192.7
num_examples: 140
download_size: 695218
dataset_size: 2051927.0
---
# Dataset Card for "llamini_docs_splitdata"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
DigitalUmuganda/AfriVoice | ---
annotations_creators:
- crowdsourced
language_creators:
- crowdsourced
multilinguality:
- multilingual
language:
- sn
- ln
license: cc-by-4.0
---
# Dataset Card for the Image, Text, and Voice Dataset
## Dataset Description
### Dataset Summary
This dataset pairs a unique JPEG image with a corresponding WAV audio file describing the image and, when available, a transcription of the audio.
The Shona dataset has a total of 574.16 hours of audio, of which 99.22 hours are transcribed and the remaining 474.93 hours are not.
The Lingala dataset is 348.35 hours long, with 137.92 hours transcribed and 210.42 hours untranscribed. The Lingala dataset is still in progress and undergoing quality assurance; it will be updated, and the link refreshed, once finished.
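From the hour counts above, the transcription coverage of each language can be computed; a minimal sketch:

```python
# Hours as stated on the card: (transcribed, total)
hours = {
    "Shona": (99.22, 574.16),
    "Lingala": (137.92, 348.35),
}

# Fraction of audio that has a transcription, per language
coverage = {lang: transcribed / total for lang, (transcribed, total) in hours.items()}

for lang, frac in coverage.items():
    print(f"{lang}: {frac:.1%} transcribed")
```

Lingala has a substantially higher transcription coverage than Shona, even though Shona has more total audio.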
### Languages
```
Shona, Lingala
```
## How to use
The `datasets` library allows you to load and pre-process your dataset in pure Python, at scale. The dataset can be downloaded and prepared on your local drive in a single call to the `load_dataset` function.
To download a config, specify the language code (i.e., "sn" for Shona and "ln" for Lingala):
```python
from datasets import load_dataset
data = load_dataset("DigitalUmuganda/image_text_voice_dataset", "sn")
```
## Dataset Structure
### Data Instances
```python
{'creator': 'digital_umuganda',
'project_name': 'shona_data_collection',
'speaker_id': '2Eud8lyLlsMcciYhmlkwVRtBwi82',
'audio_path': '/root/.cache/huggingface/datasets/downloads/extracted/9347eb035e3ae38aaf793efa152ba1c93a4336471afce2bbd00ac8c0f67e9066/small_data/audio/I7L1YJVKIRL4.wav',
'image_path': '/root/.cache/huggingface/datasets/downloads/extracted/9347eb035e3ae38aaf793efa152ba1c93a4336471afce2bbd00ac8c0f67e9066/small_data/image/I7L1YJVKIRL4.jpeg',
'transcription': 'Varume vaviri vari kukandirana bhora. Varume ava vakapfeka zvipika zvine ruvara rutema neruchena. Zvikabudura zvine ruvara rutema. Bhora ravanokandirana rine ruvara rweyero neruchena nerwebhuruu. Vari kutambira munhandare ine ivhu. Kumashure kwavo kwakagara vanhu.',
'locale': 'sn_ZW',
'gender': 'Female',
'age': ' ',
'year': '2023'}
```
### Data Fields
`creator` (`string`): An id for the client (voice) that made the recording
`audio_path` (`string`): The path to the audio file
`image_path` (`string`): The path to the image file
`transcription` (`string`): The sentence the user was prompted to speak
`age` (`string`): The age of the speaker
`gender` (`string`): The gender of the speaker
`project_name` (`string`): Name of the project
`locale` (`string`): The locale of the speaker
`year` (`string`): Year of recording
### Data Splits
The data is not yet split; to access it you must specify the `train` split. The dataset will be split into train, dev, and test at some point in the future. |
alfredplpl/wikipedia-simple-ja-15k | ---
language:
- ja
license: cc-by-sa-3.0
task_categories:
- summarization
dataset_info:
features:
- name: id
dtype: string
- name: url
dtype: string
- name: title
dtype: string
- name: text
dtype: string
splits:
- name: train
num_bytes: 3874112
num_examples: 15494
download_size: 2024204
dataset_size: 3874112
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "wikipedia-simple-ja-15k"
This dataset is made of hpprc/wikipedia-20240101 . |
CyberHarem/ubel_sousounofrieren | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of Übel/ユーベル (Sousou No Frieren)
This is the dataset of Übel/ユーベル (Sousou No Frieren), containing 119 images and their tags.
The core tags of this character are `green_hair, long_hair, hair_between_eyes, purple_eyes, side_ponytail, breasts, ponytail`, which are pruned in this dataset.
Images are crawled from many sites (e.g., Danbooru, Pixiv, Zerochan); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:----------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 119 | 80.70 MiB | [Download](https://huggingface.co/datasets/CyberHarem/ubel_sousounofrieren/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 1200 | 119 | 80.66 MiB | [Download](https://huggingface.co/datasets/CyberHarem/ubel_sousounofrieren/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 217 | 137.83 MiB | [Download](https://huggingface.co/datasets/CyberHarem/ubel_sousounofrieren/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need it, just run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/ubel_sousounofrieren',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering results; some outfits may be mined from these clusters.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 6 |  |  |  |  |  | 1girl, anime_coloring, bare_shoulders, closed_mouth, collarbone, solo, forest, outdoors, tree, choker, o-ring, portrait, smile, bush |
| 1 | 5 |  |  |  |  |  | 1girl, bare_shoulders, closed_mouth, collarbone, looking_at_viewer, solo, anime_coloring, choker, o-ring, smile, upper_body |
| 2 | 10 |  |  |  |  |  | 1girl, black_dress, closed_mouth, solo, bare_shoulders, black_belt, arm_strap, choker, black_gloves, looking_at_viewer, sleeveless, smile, armlet, pleated_dress, standing, thigh_strap |
| 3 | 15 |  |  |  |  |  | 1girl, solo, black_dress, bare_shoulders, black_gloves, smile, belt, arm_strap, green_eyes, holding_polearm, choker, closed_mouth, elbow_gloves, single_glove, thigh_strap, from_side, outdoors, pleated_dress, profile |
| 4 | 8 |  |  |  |  |  | 1girl, black_dress, armlet, bare_shoulders, closed_mouth, forest, outdoors, solo, tree, collarbone, choker, upper_body, black_gloves, holding_polearm, spear, black_belt, looking_to_the_side, o-ring |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | anime_coloring | bare_shoulders | closed_mouth | collarbone | solo | forest | outdoors | tree | choker | o-ring | portrait | smile | bush | looking_at_viewer | upper_body | black_dress | black_belt | arm_strap | black_gloves | sleeveless | armlet | pleated_dress | standing | thigh_strap | belt | green_eyes | holding_polearm | elbow_gloves | single_glove | from_side | profile | spear | looking_to_the_side |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:-----------------|:-----------------|:---------------|:-------------|:-------|:---------|:-----------|:-------|:---------|:---------|:-----------|:--------|:-------|:--------------------|:-------------|:--------------|:-------------|:------------|:---------------|:-------------|:---------|:----------------|:-----------|:--------------|:-------|:-------------|:------------------|:---------------|:---------------|:------------|:----------|:--------|:----------------------|
| 0 | 6 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | |
| 1 | 5 |  |  |  |  |  | X | X | X | X | X | X | | | | X | X | | X | | X | X | | | | | | | | | | | | | | | | | | |
| 2 | 10 |  |  |  |  |  | X | | X | X | | X | | | | X | | | X | | X | | X | X | X | X | X | X | X | X | X | | | | | | | | | |
| 3 | 15 |  |  |  |  |  | X | | X | X | | X | | X | | X | | | X | | | | X | | X | X | | | X | | X | X | X | X | X | X | X | X | | |
| 4 | 8 |  |  |  |  |  | X | | X | X | X | X | X | X | X | X | X | | | | | X | X | X | | X | | X | | | | | | X | | | | | X | X |
|
Multimodal-Fatima/Imagenette_train | ---
dataset_info:
features:
- name: image
dtype: image
- name: label
dtype:
class_label:
names:
'0': tench
'1': English springer
'2': cassette player
'3': chain saw
'4': church
'5': French horn
'6': garbage truck
'7': gas pump
'8': golf ball
'9': parachute
- name: id
dtype: int64
splits:
- name: train
num_bytes: 1104913038.331
num_examples: 9469
download_size: 0
dataset_size: 1104913038.331
---
# Dataset Card for "Imagenette_train"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
irds/mr-tydi_ar_test | ---
pretty_name: '`mr-tydi/ar/test`'
viewer: false
source_datasets: ['irds/mr-tydi_ar']
task_categories:
- text-retrieval
---
# Dataset Card for `mr-tydi/ar/test`
The `mr-tydi/ar/test` dataset, provided by the [ir-datasets](https://ir-datasets.com/) package.
For more information about the dataset, see the [documentation](https://ir-datasets.com/mr-tydi#mr-tydi/ar/test).
# Data
This dataset provides:
- `queries` (i.e., topics); count=1,081
- `qrels`: (relevance assessments); count=1,257
- For `docs`, use [`irds/mr-tydi_ar`](https://huggingface.co/datasets/irds/mr-tydi_ar)
## Usage
```python
from datasets import load_dataset
queries = load_dataset('irds/mr-tydi_ar_test', 'queries')
for record in queries:
record # {'query_id': ..., 'text': ...}
qrels = load_dataset('irds/mr-tydi_ar_test', 'qrels')
for record in qrels:
record # {'query_id': ..., 'doc_id': ..., 'relevance': ..., 'iteration': ...}
```
Note that calling `load_dataset` will download the dataset (or provide access instructions when it's not public) and make a copy of the
data in 🤗 Dataset format.
## Citation Information
```
@article{Zhang2021MrTyDi,
title={{Mr. TyDi}: A Multi-lingual Benchmark for Dense Retrieval},
author={Xinyu Zhang and Xueguang Ma and Peng Shi and Jimmy Lin},
year={2021},
journal={arXiv:2108.08787},
}
@article{Clark2020TyDiQa,
title={{TyDi QA}: A Benchmark for Information-Seeking Question Answering in Typologically Diverse Languages},
author={Jonathan H. Clark and Eunsol Choi and Michael Collins and Dan Garrette and Tom Kwiatkowski and Vitaly Nikolaev and Jennimaria Palomaki},
year={2020},
journal={Transactions of the Association for Computational Linguistics}
}
```
|
open-llm-leaderboard/details_BELLE-2__BELLE-Llama2-13B-chat-0.4M | ---
pretty_name: Evaluation run of BELLE-2/BELLE-Llama2-13B-chat-0.4M
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [BELLE-2/BELLE-Llama2-13B-chat-0.4M](https://huggingface.co/BELLE-2/BELLE-Llama2-13B-chat-0.4M)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 64 configuration, each one coresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the agregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_BELLE-2__BELLE-Llama2-13B-chat-0.4M\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-10-26T14:37:54.228887](https://huggingface.co/datasets/open-llm-leaderboard/details_BELLE-2__BELLE-Llama2-13B-chat-0.4M/blob/main/results_2023-10-26T14-37-54.228887.json)(note\
\ that their might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.28492030201342283,\n\
\ \"em_stderr\": 0.004622517599527834,\n \"f1\": 0.36695364932886015,\n\
\ \"f1_stderr\": 0.004514579216323901,\n \"acc\": 0.4496880334950361,\n\
\ \"acc_stderr\": 0.010877118313612513\n },\n \"harness|drop|3\": {\n\
\ \"em\": 0.28492030201342283,\n \"em_stderr\": 0.004622517599527834,\n\
\ \"f1\": 0.36695364932886015,\n \"f1_stderr\": 0.004514579216323901\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.14404852160727824,\n \
\ \"acc_stderr\": 0.009672110973065286\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.755327545382794,\n \"acc_stderr\": 0.012082125654159738\n\
\ }\n}\n```"
repo_url: https://huggingface.co/BELLE-2/BELLE-Llama2-13B-chat-0.4M
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_10_01T13_36_40.123057
path:
- '**/details_harness|arc:challenge|25_2023-10-01T13-36-40.123057.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-10-01T13-36-40.123057.parquet'
- config_name: harness_drop_3
data_files:
- split: 2023_10_26T14_37_54.228887
path:
- '**/details_harness|drop|3_2023-10-26T14-37-54.228887.parquet'
- split: latest
path:
- '**/details_harness|drop|3_2023-10-26T14-37-54.228887.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_10_26T14_37_54.228887
path:
- '**/details_harness|gsm8k|5_2023-10-26T14-37-54.228887.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-10-26T14-37-54.228887.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_10_01T13_36_40.123057
path:
- '**/details_harness|hellaswag|10_2023-10-01T13-36-40.123057.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-10-01T13-36-40.123057.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_10_01T13_36_40.123057
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-01T13-36-40.123057.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-01T13-36-40.123057.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-01T13-36-40.123057.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-01T13-36-40.123057.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-01T13-36-40.123057.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-01T13-36-40.123057.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-01T13-36-40.123057.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-01T13-36-40.123057.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-01T13-36-40.123057.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-01T13-36-40.123057.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-01T13-36-40.123057.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-01T13-36-40.123057.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-01T13-36-40.123057.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-01T13-36-40.123057.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-01T13-36-40.123057.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-01T13-36-40.123057.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-01T13-36-40.123057.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-01T13-36-40.123057.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-01T13-36-40.123057.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-01T13-36-40.123057.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-01T13-36-40.123057.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-01T13-36-40.123057.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-01T13-36-40.123057.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-01T13-36-40.123057.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-01T13-36-40.123057.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-01T13-36-40.123057.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-01T13-36-40.123057.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-01T13-36-40.123057.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-01T13-36-40.123057.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-01T13-36-40.123057.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-01T13-36-40.123057.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-01T13-36-40.123057.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-01T13-36-40.123057.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-01T13-36-40.123057.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-10-01T13-36-40.123057.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-01T13-36-40.123057.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-01T13-36-40.123057.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-01T13-36-40.123057.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-10-01T13-36-40.123057.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-10-01T13-36-40.123057.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-01T13-36-40.123057.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-01T13-36-40.123057.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-01T13-36-40.123057.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-01T13-36-40.123057.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-01T13-36-40.123057.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-01T13-36-40.123057.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-01T13-36-40.123057.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-01T13-36-40.123057.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-01T13-36-40.123057.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-01T13-36-40.123057.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-01T13-36-40.123057.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-01T13-36-40.123057.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-01T13-36-40.123057.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-10-01T13-36-40.123057.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-01T13-36-40.123057.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-10-01T13-36-40.123057.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-01T13-36-40.123057.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-01T13-36-40.123057.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-01T13-36-40.123057.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-01T13-36-40.123057.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-01T13-36-40.123057.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-01T13-36-40.123057.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-01T13-36-40.123057.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-01T13-36-40.123057.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-01T13-36-40.123057.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-01T13-36-40.123057.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-01T13-36-40.123057.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-01T13-36-40.123057.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-01T13-36-40.123057.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-01T13-36-40.123057.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-01T13-36-40.123057.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-01T13-36-40.123057.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-01T13-36-40.123057.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-01T13-36-40.123057.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-01T13-36-40.123057.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-01T13-36-40.123057.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-01T13-36-40.123057.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-01T13-36-40.123057.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-01T13-36-40.123057.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-01T13-36-40.123057.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-01T13-36-40.123057.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-01T13-36-40.123057.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-01T13-36-40.123057.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-01T13-36-40.123057.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-01T13-36-40.123057.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-01T13-36-40.123057.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-01T13-36-40.123057.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-01T13-36-40.123057.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-01T13-36-40.123057.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-01T13-36-40.123057.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-01T13-36-40.123057.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-10-01T13-36-40.123057.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-01T13-36-40.123057.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-01T13-36-40.123057.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-01T13-36-40.123057.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-10-01T13-36-40.123057.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-10-01T13-36-40.123057.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-01T13-36-40.123057.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-01T13-36-40.123057.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-01T13-36-40.123057.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-01T13-36-40.123057.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-01T13-36-40.123057.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-01T13-36-40.123057.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-01T13-36-40.123057.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-01T13-36-40.123057.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-01T13-36-40.123057.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-01T13-36-40.123057.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-01T13-36-40.123057.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-01T13-36-40.123057.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-01T13-36-40.123057.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-10-01T13-36-40.123057.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-01T13-36-40.123057.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-10-01T13-36-40.123057.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-01T13-36-40.123057.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_10_01T13_36_40.123057
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-01T13-36-40.123057.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-01T13-36-40.123057.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_10_01T13_36_40.123057
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-01T13-36-40.123057.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-01T13-36-40.123057.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_10_01T13_36_40.123057
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-01T13-36-40.123057.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-01T13-36-40.123057.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_10_01T13_36_40.123057
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-01T13-36-40.123057.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-01T13-36-40.123057.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_10_01T13_36_40.123057
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-01T13-36-40.123057.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-01T13-36-40.123057.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_10_01T13_36_40.123057
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-01T13-36-40.123057.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-01T13-36-40.123057.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_10_01T13_36_40.123057
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-01T13-36-40.123057.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-01T13-36-40.123057.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_10_01T13_36_40.123057
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-01T13-36-40.123057.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-01T13-36-40.123057.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_10_01T13_36_40.123057
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-01T13-36-40.123057.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-01T13-36-40.123057.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_10_01T13_36_40.123057
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-01T13-36-40.123057.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-01T13-36-40.123057.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_10_01T13_36_40.123057
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-01T13-36-40.123057.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-01T13-36-40.123057.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_10_01T13_36_40.123057
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-01T13-36-40.123057.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-01T13-36-40.123057.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_10_01T13_36_40.123057
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-01T13-36-40.123057.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-01T13-36-40.123057.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_10_01T13_36_40.123057
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-01T13-36-40.123057.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-01T13-36-40.123057.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_10_01T13_36_40.123057
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-01T13-36-40.123057.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-01T13-36-40.123057.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_10_01T13_36_40.123057
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-01T13-36-40.123057.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-01T13-36-40.123057.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_10_01T13_36_40.123057
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-01T13-36-40.123057.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-01T13-36-40.123057.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_10_01T13_36_40.123057
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-01T13-36-40.123057.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-01T13-36-40.123057.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_10_01T13_36_40.123057
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-01T13-36-40.123057.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-01T13-36-40.123057.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_10_01T13_36_40.123057
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-01T13-36-40.123057.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-01T13-36-40.123057.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_10_01T13_36_40.123057
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-01T13-36-40.123057.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-01T13-36-40.123057.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_10_01T13_36_40.123057
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-01T13-36-40.123057.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-01T13-36-40.123057.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_10_01T13_36_40.123057
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-01T13-36-40.123057.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-01T13-36-40.123057.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_10_01T13_36_40.123057
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-01T13-36-40.123057.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-01T13-36-40.123057.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_10_01T13_36_40.123057
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-01T13-36-40.123057.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-01T13-36-40.123057.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_10_01T13_36_40.123057
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-01T13-36-40.123057.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-01T13-36-40.123057.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_10_01T13_36_40.123057
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-01T13-36-40.123057.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-01T13-36-40.123057.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_10_01T13_36_40.123057
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-01T13-36-40.123057.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-01T13-36-40.123057.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_10_01T13_36_40.123057
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-01T13-36-40.123057.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-01T13-36-40.123057.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_10_01T13_36_40.123057
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-01T13-36-40.123057.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-01T13-36-40.123057.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_10_01T13_36_40.123057
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-01T13-36-40.123057.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-01T13-36-40.123057.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_10_01T13_36_40.123057
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-01T13-36-40.123057.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-01T13-36-40.123057.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_10_01T13_36_40.123057
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-01T13-36-40.123057.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-01T13-36-40.123057.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_10_01T13_36_40.123057
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-01T13-36-40.123057.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-01T13-36-40.123057.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_10_01T13_36_40.123057
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-10-01T13-36-40.123057.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-10-01T13-36-40.123057.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_10_01T13_36_40.123057
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-01T13-36-40.123057.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-01T13-36-40.123057.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_10_01T13_36_40.123057
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-01T13-36-40.123057.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-01T13-36-40.123057.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_10_01T13_36_40.123057
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-01T13-36-40.123057.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-01T13-36-40.123057.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_10_01T13_36_40.123057
path:
- '**/details_harness|hendrycksTest-management|5_2023-10-01T13-36-40.123057.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-10-01T13-36-40.123057.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_10_01T13_36_40.123057
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-10-01T13-36-40.123057.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-10-01T13-36-40.123057.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_10_01T13_36_40.123057
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-01T13-36-40.123057.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-01T13-36-40.123057.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_10_01T13_36_40.123057
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-01T13-36-40.123057.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-01T13-36-40.123057.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_10_01T13_36_40.123057
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-01T13-36-40.123057.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-01T13-36-40.123057.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_10_01T13_36_40.123057
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-01T13-36-40.123057.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-01T13-36-40.123057.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_10_01T13_36_40.123057
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-01T13-36-40.123057.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-01T13-36-40.123057.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_10_01T13_36_40.123057
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-01T13-36-40.123057.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-01T13-36-40.123057.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_10_01T13_36_40.123057
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-01T13-36-40.123057.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-01T13-36-40.123057.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_10_01T13_36_40.123057
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-01T13-36-40.123057.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-01T13-36-40.123057.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_10_01T13_36_40.123057
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-01T13-36-40.123057.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-01T13-36-40.123057.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_10_01T13_36_40.123057
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-01T13-36-40.123057.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-01T13-36-40.123057.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_10_01T13_36_40.123057
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-01T13-36-40.123057.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-01T13-36-40.123057.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_10_01T13_36_40.123057
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-01T13-36-40.123057.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-01T13-36-40.123057.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_10_01T13_36_40.123057
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-01T13-36-40.123057.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-01T13-36-40.123057.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_10_01T13_36_40.123057
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-10-01T13-36-40.123057.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-10-01T13-36-40.123057.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_10_01T13_36_40.123057
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-01T13-36-40.123057.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-01T13-36-40.123057.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_10_01T13_36_40.123057
path:
- '**/details_harness|hendrycksTest-virology|5_2023-10-01T13-36-40.123057.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-10-01T13-36-40.123057.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_10_01T13_36_40.123057
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-01T13-36-40.123057.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-01T13-36-40.123057.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_10_01T13_36_40.123057
path:
- '**/details_harness|truthfulqa:mc|0_2023-10-01T13-36-40.123057.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-10-01T13-36-40.123057.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_10_26T14_37_54.228887
path:
- '**/details_harness|winogrande|5_2023-10-26T14-37-54.228887.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-10-26T14-37-54.228887.parquet'
- config_name: results
data_files:
- split: 2023_10_01T13_36_40.123057
path:
- results_2023-10-01T13-36-40.123057.parquet
- split: 2023_10_26T14_37_54.228887
path:
- results_2023-10-26T14-37-54.228887.parquet
- split: latest
path:
- results_2023-10-26T14-37-54.228887.parquet
---
# Dataset Card for Evaluation run of BELLE-2/BELLE-Llama2-13B-chat-0.4M
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/BELLE-2/BELLE-Llama2-13B-chat-0.4M
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [BELLE-2/BELLE-Llama2-13B-chat-0.4M](https://huggingface.co/BELLE-2/BELLE-Llama2-13B-chat-0.4M) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_BELLE-2__BELLE-Llama2-13B-chat-0.4M",
"harness_winogrande_5",
	split="latest")
```
## Latest results
These are the [latest results from run 2023-10-26T14:37:54.228887](https://huggingface.co/datasets/open-llm-leaderboard/details_BELLE-2__BELLE-Llama2-13B-chat-0.4M/blob/main/results_2023-10-26T14-37-54.228887.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each eval's results in its "latest" split and in the "results" configuration):
```python
{
"all": {
"em": 0.28492030201342283,
"em_stderr": 0.004622517599527834,
"f1": 0.36695364932886015,
"f1_stderr": 0.004514579216323901,
"acc": 0.4496880334950361,
"acc_stderr": 0.010877118313612513
},
"harness|drop|3": {
"em": 0.28492030201342283,
"em_stderr": 0.004622517599527834,
"f1": 0.36695364932886015,
"f1_stderr": 0.004514579216323901
},
"harness|gsm8k|5": {
"acc": 0.14404852160727824,
"acc_stderr": 0.009672110973065286
},
"harness|winogrande|5": {
"acc": 0.755327545382794,
"acc_stderr": 0.012082125654159738
}
}
```
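As a quick consistency check, the aggregate `acc` under `"all"` is the plain mean of the two tasks that report `acc` (`gsm8k` and `winogrande`; `drop` reports `em`/`f1` instead). A minimal sketch using the values above:

```python
# Reproduce the aggregate "acc" from the per-task values in the JSON above.
latest = {
    "harness|gsm8k|5": {"acc": 0.14404852160727824},
    "harness|winogrande|5": {"acc": 0.755327545382794},
}
accs = [task["acc"] for task in latest.values()]
aggregate_acc = sum(accs) / len(accs)
print(round(aggregate_acc, 6))  # 0.449688, matching "all"/"acc"
```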
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
imsoumyaneel/sentiment-analysis-llama2 | ---
task_categories:
- text-classification
tags:
- code
size_categories:
- 10K<n<100K
--- |
coref-data/gum_indiscrim | ---
dataset_info:
- config_name: ontogum
features:
- name: sentences
list:
- name: id
dtype: int64
- name: speaker
dtype: string
- name: text
dtype: string
- name: tokens
list:
- name: deprel
dtype: string
- name: deps
dtype: string
- name: feats
dtype: string
- name: head
dtype: int64
- name: id
dtype: int64
- name: lemma
dtype: string
- name: misc
dtype: string
- name: text
dtype: string
- name: upos
dtype: string
- name: xpos
dtype: string
- name: misc
struct:
- name: parse_tree
dtype: string
- name: id
dtype: string
- name: text
dtype: string
- name: coref_chains
sequence:
sequence:
sequence: int64
- name: genre
dtype: string
- name: meta_data
struct:
- name: comment
dtype: string
splits:
- name: train
num_bytes: 23472505
num_examples: 165
- name: validation
num_bytes: 3119527
num_examples: 24
- name: test
num_bytes: 3180699
num_examples: 24
download_size: 7424694
dataset_size: 29772731
- config_name: original
features:
- name: sentences
list:
- name: id
dtype: int64
- name: speaker
dtype: string
- name: text
dtype: string
- name: tokens
list:
- name: deprel
dtype: string
- name: feats
dtype: string
- name: head
dtype: int64
- name: id
dtype: float64
- name: lemma
dtype: string
- name: misc
dtype: string
- name: text
dtype: string
- name: upos
dtype: string
- name: xpos
dtype: string
- name: misc
struct:
- name: parse_tree
dtype: string
- name: id
dtype: string
- name: text
dtype: string
- name: coref_chains
sequence:
sequence:
sequence: int64
- name: genre
dtype: string
- name: meta_data
struct:
- name: comment
dtype: string
splits:
- name: train
num_bytes: 22369183
num_examples: 165
- name: validation
num_bytes: 2970347
num_examples: 24
- name: test
num_bytes: 3038551
num_examples: 24
download_size: 7048887
dataset_size: 28378081
configs:
- config_name: ontogum
data_files:
- split: train
path: ontogum/train-*
- split: validation
path: ontogum/validation-*
- split: test
path: ontogum/test-*
- config_name: original
data_files:
- split: train
path: original/train-*
- split: validation
path: original/validation-*
- split: test
path: original/test-*
---
This dataset was generated by reformatting [`coref-data/gum_raw`](https://huggingface.co/datasets/coref-data/gum_raw) into the indiscrim coreference format. See that repo for dataset details.
See [ianporada/coref-data](https://github.com/ianporada/coref-data) for additional conversion details and the conversion script.
Please create an issue in the repo above or in this dataset repo for any questions.
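The `coref_chains` field is a triply nested list of integers. As a toy illustration — assuming each innermost triple encodes a mention as `[sentence_index, start_token, end_token]` with inclusive, 0-based token offsets (an assumption; verify it against the conversion script linked above) — the chains can be resolved to surface strings like so:

```python
# Toy example of walking the nested coref_chains structure.
# Assumption (hypothetical; check the conversion repo): each mention is
# [sentence_index, start_token, end_token], inclusive, 0-based.
sentences = [
    {"tokens": ["Mary", "saw", "her", "dog", "."]},
]
coref_chains = [[[0, 0, 0], [0, 2, 2]]]  # one chain: "Mary" <- "her"

def mention_text(sentences, mention):
    sent_idx, start, end = mention
    tokens = sentences[sent_idx]["tokens"]
    return " ".join(tokens[start : end + 1])

chains = [[mention_text(sentences, m) for m in chain] for chain in coref_chains]
print(chains)  # [['Mary', 'her']]
```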
|
crabz/stsb-sk | ---
annotations_creators:
- other
language_creators:
- other
language:
- sk
language_bcp47:
- sk-SK
license:
- unknown
multilinguality:
- monolingual
pretty_name: stsb-sk
size_categories:
- 1K<n<10K
source_datasets:
- extended|stsb_multi_mt
task_categories:
- text-scoring
task_ids:
- semantic-similarity-scoring
---
Retrieving the 50th example from the train set:
```
> print(dataset['train']['sentence1'][0][50])
Muž hrá na gitare.
> print(dataset['train']['sentence2'][0][50])
Chlapec hrá na gitare.
> print(dataset['train']['similarity_score'][0][50])
3.200000047683716
```
For score explanation see [stsb_multi_mt](https://huggingface.co/datasets/stsb_multi_mt).
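Scores follow the STS-B convention of a 0–5 similarity scale. When training cosine-similarity models it is common (though not mandated by this card) to rescale them to `[0, 1]`:

```python
# Rescale an STS-B style similarity score from [0, 5] to [0, 1]
# (a common convention for cosine-similarity training targets).
score = 3.200000047683716  # the example above
normalized = score / 5.0
print(round(normalized, 3))  # 0.64
```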
|
C-MTEB/CovidRetrieval | ---
configs:
- config_name: default
data_files:
- split: corpus
path: data/corpus-*
- split: queries
path: data/queries-*
dataset_info:
features:
- name: id
dtype: string
- name: text
dtype: string
splits:
- name: corpus
num_bytes: 91531256
num_examples: 100001
- name: queries
num_bytes: 111094
num_examples: 949
download_size: 65093081
dataset_size: 91642350
---
# Dataset Card for "CovidRetrieval"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
AdapterOcean/python3-standardized_cluster_3_std | ---
dataset_info:
features:
- name: message
dtype: string
- name: message_type
dtype: string
- name: message_id
dtype: int64
- name: conversation_id
dtype: int64
- name: cluster
dtype: float64
- name: __index_level_0__
dtype: int64
splits:
- name: train
num_bytes: 17136453
num_examples: 10062
download_size: 0
dataset_size: 17136453
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "python3-standardized_cluster_3_std"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
atutej/sentiment | ---
dataset_info:
- config_name: translation-ar
features:
- name: GENERIC CATEGORIES
dtype: string
- name: CATEGORY
dtype: string
- name: SUB-CATEGORY
dtype: string
- name: PRODUCT
dtype: string
- name: BRAND
dtype: string
- name: ASPECTS
dtype: string
- name: ASPECT COMBO
dtype: string
- name: ENGLISH REVIEW
dtype: string
- name: LABEL
dtype: string
- name: TARGET_REVIEW
dtype: string
splits:
- name: validation
num_bytes: 85865
num_examples: 156
- name: test
num_bytes: 552338
num_examples: 1000
download_size: 305412
dataset_size: 638203
- config_name: translation-en
features:
- name: GENERIC CATEGORIES
dtype: string
- name: CATEGORY
dtype: string
- name: SUB-CATEGORY
dtype: string
- name: PRODUCT
dtype: string
- name: BRAND
dtype: string
- name: ASPECTS
dtype: string
- name: ASPECT COMBO
dtype: string
- name: ENGLISH REVIEW
dtype: string
- name: LABEL
dtype: string
- name: TARGET_REVIEW
dtype: string
splits:
- name: validation
num_bytes: 75042
num_examples: 156
- name: test
num_bytes: 484811
num_examples: 1000
download_size: 281328
dataset_size: 559853
- config_name: translation-tr
features:
- name: GENERIC CATEGORIES
dtype: string
- name: CATEGORY
dtype: string
- name: SUB-CATEGORY
dtype: string
- name: PRODUCT
dtype: string
- name: BRAND
dtype: string
- name: ASPECTS
dtype: string
- name: ASPECT COMBO
dtype: string
- name: ENGLISH REVIEW
dtype: string
- name: LABEL
dtype: string
- name: TARGET_REVIEW
dtype: string
splits:
- name: validation
num_bytes: 76342
num_examples: 156
- name: test
num_bytes: 491251
num_examples: 1000
download_size: 284425
dataset_size: 567593
- config_name: transliteration-hi
features:
- name: GENERIC CATEGORIES
dtype: string
- name: CATEGORY
dtype: string
- name: SUB-CATEGORY
dtype: string
- name: PRODUCT
dtype: string
- name: BRAND
dtype: string
- name: ASPECTS
dtype: string
- name: ASPECT COMBO
dtype: string
- name: ENGLISH REVIEW
dtype: string
- name: LABEL
dtype: string
- name: INDIC REVIEW
dtype: string
- name: TARGET_REVIEW
dtype: string
splits:
- name: validation
num_bytes: 130962
num_examples: 156
- name: test
num_bytes: 839305
num_examples: 1000
download_size: 452178
dataset_size: 970267
configs:
- config_name: translation-ar
data_files:
- split: validation
path: translation-ar/validation-*
- split: test
path: translation-ar/test-*
- config_name: translation-en
data_files:
- split: validation
path: translation-en/validation-*
- split: test
path: translation-en/test-*
- config_name: translation-tr
data_files:
- split: validation
path: translation-tr/validation-*
- split: test
path: translation-tr/test-*
- config_name: transliteration-hi
data_files:
- split: validation
path: transliteration-hi/validation-*
- split: test
path: transliteration-hi/test-*
---
|
KyleLin/LayoutPrompter | ---
license: mit
---
A collection of datasets used in [LayoutPrompter](https://arxiv.org/pdf/2311.06495.pdf) (NeurIPS2023).
Specifically, `publaynet` and `rico` are downloaded from [LayoutFormer++](https://huggingface.co/jzy124/LayoutFormer), `posterlayout` is downloaded from [DS-GAN](http://59.108.48.34/tiki/PosterLayout/), and `webui` is downloaded from [Parse-Then-Place](https://huggingface.co/datasets/KyleLin/Parse-Then-Place).
We sincerely thank them for their great work.
|
mila-intel/ProtST-GeneOntology-BP | ---
license: apache-2.0
---
|
joey234/mmlu-security_studies-dev | ---
dataset_info:
features:
- name: question
dtype: string
- name: choices
sequence: string
- name: answer
dtype:
class_label:
names:
'0': A
'1': B
'2': C
'3': D
- name: negate_openai_prompt
struct:
- name: content
dtype: string
- name: role
dtype: string
splits:
- name: dev
num_bytes: 6657
num_examples: 5
download_size: 10237
dataset_size: 6657
---
# Dataset Card for "mmlu-security_studies-dev"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
ali4546/ma | ---
license: afl-3.0
---
|
Rapando/kpitbl | ---
license: apache-2.0
---
|
CyberHarem/koshimizu_sachiko_idolmastercinderellagirls | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of koshimizu_sachiko/輿水幸子/코시미즈사치코 (THE iDOLM@STER: Cinderella Girls)
This is the dataset of koshimizu_sachiko/θΌΏζ°΄εΉΈε/μ½μλ―Έμ¦μ¬μΉμ½ (THE iDOLM@STER: Cinderella Girls), containing 500 images and their tags.
The core tags of this character are `short_hair, purple_hair, brown_eyes, hair_ornament, hairclip`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan, etc.); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:---------------------------------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 500 | 540.35 MiB | [Download](https://huggingface.co/datasets/CyberHarem/koshimizu_sachiko_idolmastercinderellagirls/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 500 | 339.52 MiB | [Download](https://huggingface.co/datasets/CyberHarem/koshimizu_sachiko_idolmastercinderellagirls/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 1183 | 712.33 MiB | [Download](https://huggingface.co/datasets/CyberHarem/koshimizu_sachiko_idolmastercinderellagirls/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 500 | 491.12 MiB | [Download](https://huggingface.co/datasets/CyberHarem/koshimizu_sachiko_idolmastercinderellagirls/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 1183 | 962.99 MiB | [Download](https://huggingface.co/datasets/CyberHarem/koshimizu_sachiko_idolmastercinderellagirls/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need it, just run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/koshimizu_sachiko_idolmastercinderellagirls',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering results; some outfits may be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:----------------------------------|:----------------------------------|:----------------------------------|:----------------------------------|:----------------------------------|:------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 7 |  |  |  |  |  | 1girl, school_uniform, solo, :d, blush, looking_at_viewer, open_mouth, bow, hand_on_own_cheek |
| 1 | 9 |  |  |  |  |  | 1girl, school_uniform, solo, :d, blush, open_mouth, skirt_lift, black_thighhighs, bow, grey_hair, looking_at_viewer |
| 2 | 5 |  |  |  |  |  | 1girl, black_thighhighs, open_mouth, solo, wrist_cuffs, :d, black_wings, blush, looking_at_viewer, dress |
| 3 | 22 |  |  |  |  |  | 1girl, solo, blush, long_sleeves, simple_background, white_background, bangs, hair_intakes, looking_at_viewer, open_mouth, upper_body, shirt, yellow_bowtie, skirt, :d, hair_flaps |
| 4 | 24 |  |  |  |  |  | cleavage_cutout, collar, elbow_gloves, heart_cutout, 1girl, skirt, solo, chain, wings, cuffs, navel, :d, open_mouth, midriff, black_thighhighs, blush, grey_hair, looking_at_viewer, microphone, garter_straps, pinstripe_pattern |
| 5 | 5 |  |  |  |  |  | 1girl, heart, looking_at_viewer, puffy_short_sleeves, solo, white_gloves, witch_hat, bangs, bat_(animal), blush, frilled_skirt, hair_flaps, striped, thighhighs, :3, :d, center_frills, hair_intakes, jack-o'-lantern, open_mouth, pumpkin, boots, bowtie, cape, ghost, happy_halloween, high_heels, holding_wand, jewelry, mismatched_legwear, white_shirt |
| 6 | 10 |  |  |  |  |  | 1girl, bangs, blue_skirt, long_sleeves, pleated_skirt, suspender_skirt, white_shirt, center_frills, hair_flaps, solo, closed_mouth, collared_shirt, hair_intakes, simple_background, blush, :3, smile |
| 7 | 9 |  |  |  |  |  | 1girl, bangs, blue_skirt, collared_shirt, long_sleeves, suspender_skirt, white_shirt, blush, hair_flaps, simple_background, solo, white_background, hair_intakes, open_mouth, vertical-striped_skirt, :d, looking_at_viewer, center_frills, frilled_skirt, light_purple_hair, necktie, pleated_skirt, purple_ascot |
| 8 | 7 |  |  |  |  |  | 1girl, blush, nipples, smile, looking_at_viewer, navel, nude, pussy, small_breasts, solo, censored, lying, open_mouth |
| 9 | 5 |  |  |  |  |  | 1girl, demon_girl, demon_horns, demon_tail, smile, solo, bare_shoulders, blush, demon_wings, detached_sleeves, looking_at_viewer, black_thighhighs, dress, purple_wings, simple_background, skirt, detached_collar, heart, open_mouth, white_background |
| 10 | 6 |  |  |  |  |  | 1girl, blush, heart-shaped_pupils, navel, open_mouth, sweat, 1boy, :d, drooling, flat_chest, hetero, nipples, saliva, solo_focus, tears, happy_sex, penis, side-tie_bikini_bottom, torogao, vaginal, bar_censor, looking_at_viewer, on_back, spread_legs |
| 11 | 7 |  |  |  |  |  | 1girl, purple_dress, solo, bangs, bare_shoulders, black_gloves, looking_at_viewer, blush, hair_flower, black_hairband, black_wings, mini_crown, smile, yellow_eyes |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | school_uniform | solo | :d | blush | looking_at_viewer | open_mouth | bow | hand_on_own_cheek | skirt_lift | black_thighhighs | grey_hair | wrist_cuffs | black_wings | dress | long_sleeves | simple_background | white_background | bangs | hair_intakes | upper_body | shirt | yellow_bowtie | skirt | hair_flaps | cleavage_cutout | collar | elbow_gloves | heart_cutout | chain | wings | cuffs | navel | midriff | microphone | garter_straps | pinstripe_pattern | heart | puffy_short_sleeves | white_gloves | witch_hat | bat_(animal) | frilled_skirt | striped | thighhighs | :3 | center_frills | jack-o'-lantern | pumpkin | boots | bowtie | cape | ghost | happy_halloween | high_heels | holding_wand | jewelry | mismatched_legwear | white_shirt | blue_skirt | pleated_skirt | suspender_skirt | closed_mouth | collared_shirt | smile | vertical-striped_skirt | light_purple_hair | necktie | purple_ascot | nipples | nude | pussy | small_breasts | censored | lying | demon_girl | demon_horns | demon_tail | bare_shoulders | demon_wings | detached_sleeves | purple_wings | detached_collar | heart-shaped_pupils | sweat | 1boy | drooling | flat_chest | hetero | saliva | solo_focus | tears | happy_sex | penis | side-tie_bikini_bottom | torogao | vaginal | bar_censor | on_back | spread_legs | purple_dress | black_gloves | hair_flower | black_hairband | mini_crown | yellow_eyes |
|----:|----------:|:----------------------------------|:----------------------------------|:----------------------------------|:----------------------------------|:----------------------------------|:--------|:-----------------|:-------|:-----|:--------|:--------------------|:-------------|:------|:--------------------|:-------------|:-------------------|:------------|:--------------|:--------------|:--------|:---------------|:--------------------|:-------------------|:--------|:---------------|:-------------|:--------|:----------------|:--------|:-------------|:------------------|:---------|:---------------|:---------------|:--------|:--------|:--------|:--------|:----------|:-------------|:----------------|:--------------------|:--------|:----------------------|:---------------|:------------|:---------------|:----------------|:----------|:-------------|:-----|:----------------|:------------------|:----------|:--------|:---------|:-------|:--------|:------------------|:-------------|:---------------|:----------|:---------------------|:--------------|:-------------|:----------------|:------------------|:---------------|:-----------------|:--------|:-------------------------|:--------------------|:----------|:---------------|:----------|:-------|:--------|:----------------|:-----------|:--------|:-------------|:--------------|:-------------|:-----------------|:--------------|:-------------------|:---------------|:------------------|:----------------------|:--------|:-------|:-----------|:-------------|:---------|:---------|:-------------|:--------|:------------|:--------|:-------------------------|:----------|:----------|:-------------|:----------|:--------------|:---------------|:---------------|:--------------|:-----------------|:-------------|:--------------|
| 0 | 7 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 1 | 9 |  |  |  |  |  | X | X | X | X | X | X | X | X | | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 2 | 5 |  |  |  |  |  | X | | X | X | X | X | X | | | | X | | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 3 | 22 |  |  |  |  |  | X | | X | X | X | X | X | | | | | | | | | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 4 | 24 |  |  |  |  |  | X | | X | X | X | X | X | | | | X | X | | | | | | | | | | | | X | | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 5 | 5 |  |  |  |  |  | X | | X | X | X | X | X | | | | | | | | | | | | X | X | | | | | X | | | | | | | | | | | | | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 6 | 10 |  |  |  |  |  | X | | X | | X | | | | | | | | | | | X | X | | X | X | | | | | X | | | | | | | | | | | | | | | | | | | | | X | X | | | | | | | | | | | | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 7 | 9 |  |  |  |  |  | X | | X | X | X | X | X | | | | | | | | | X | X | X | X | X | | | | | X | | | | | | | | | | | | | | | | | | X | | | | X | | | | | | | | | | | | X | X | X | X | | X | | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 8 | 7 |  |  |  |  |  | X | | X | | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | X | | | | | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 9 | 5 |  |  |  |  |  | X | | X | | X | X | X | | | | X | | | | X | | X | X | | | | | | X | | | | | | | | | | | | | | X | | | | | | | | | | | | | | | | | | | | | | | | | | | X | | | | | | | | | | | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | |
| 10 | 6 |  |  |  |  |  | X | | | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | X | | | | | | | | | | | | | | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | |
| 11 | 7 |  |  |  |  |  | X | | X | | X | X | | | | | | | | X | | | | | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | X | | | | | | | | | | | | | | X | | | | | | | | | | | | | | | | | | | | | | X | X | X | X | X | X |
|
FanChen0116/19100_chat_64x_slot_pvi_base | ---
dataset_info:
features:
- name: id
dtype: int64
- name: tokens
sequence: string
- name: labels
sequence:
class_label:
names:
'0': O
'1': I-time
'2': B-date
'3': B-last_name
'4': B-people
'5': I-date
'6': I-people
'7': I-last_name
'8': I-first_name
'9': B-first_name
'10': B-time
- name: request_slot
sequence: string
splits:
- name: train
num_bytes: 746411
num_examples: 4096
- name: validation
num_bytes: 5405
num_examples: 32
- name: test
num_bytes: 5405
num_examples: 32
download_size: 0
dataset_size: 757221
---
# Dataset Card for "19100_chat_64x_slot_pvi_base"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
HydraLM/embedded_datasets_0822 | ---
dataset_info:
features:
- name: text
dtype: string
- name: conversation_id
dtype: int64
- name: dataset_id
dtype: string
- name: unique_conversation_id
dtype: string
- name: embedding
sequence: float32
splits:
- name: train
num_bytes: 14376321549
num_examples: 2865791
download_size: 14664637194
dataset_size: 14376321549
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "combined_embedded_v2"
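Each row above carries an `embedding` sequence of float32. As an illustrative sketch (the vectors below are invented for demonstration, not drawn from the dataset), two such embedding vectors can be compared by cosine similarity in pure Python:

```python
import math

def cosine_similarity(a, b):
    """Cosine similarity between two equal-length float vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(y * y for y in b))
    return dot / (norm_a * norm_b)

# Invented example vectors standing in for two `embedding` rows.
v1 = [0.1, 0.3, 0.5]
v2 = [0.2, 0.1, 0.4]
print(round(cosine_similarity(v1, v2), 4))  # 0.9221
```

The same computation scales to the dataset's real vectors once a split is loaded with `datasets.load_dataset`.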
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
grrthrth/sfacg_info | ---
license: apache-2.0
---
|
samitizerxu/mini-algae-rgb | ---
dataset_info:
features:
- name: image
dtype: image
splits:
- name: train
num_bytes: 15787445.414
num_examples: 4039
- name: test
num_bytes: 6040387.721
num_examples: 1521
download_size: 21439845
dataset_size: 21827833.135
---
# Dataset Card for "mini-algae-rgb"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
open-llm-leaderboard/details_davanstrien__TinyLlama-1.1B-Chat-v1.0-intel-dpo | ---
pretty_name: Evaluation run of davanstrien/TinyLlama-1.1B-Chat-v1.0-intel-dpo
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [davanstrien/TinyLlama-1.1B-Chat-v1.0-intel-dpo](https://huggingface.co/davanstrien/TinyLlama-1.1B-Chat-v1.0-intel-dpo)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of\
\ the evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_davanstrien__TinyLlama-1.1B-Chat-v1.0-intel-dpo\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-01-13T15:50:10.127087](https://huggingface.co/datasets/open-llm-leaderboard/details_davanstrien__TinyLlama-1.1B-Chat-v1.0-intel-dpo/blob/main/results_2024-01-13T15-50-10.127087.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.2576008328471532,\n\
\ \"acc_stderr\": 0.03077282985808684,\n \"acc_norm\": 0.25844020358409664,\n\
\ \"acc_norm_stderr\": 0.03151828137960803,\n \"mc1\": 0.23255813953488372,\n\
\ \"mc1_stderr\": 0.014789157531080501,\n \"mc2\": 0.37376752806468966,\n\
\ \"mc2_stderr\": 0.013846261711668974\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.34897610921501704,\n \"acc_stderr\": 0.013928933461382494,\n\
\ \"acc_norm\": 0.3583617747440273,\n \"acc_norm_stderr\": 0.014012883334859866\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.45817566221868156,\n\
\ \"acc_stderr\": 0.004972293764978727,\n \"acc_norm\": 0.6129257120095598,\n\
\ \"acc_norm_stderr\": 0.0048608542408219695\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.23,\n \"acc_stderr\": 0.04229525846816505,\n \
\ \"acc_norm\": 0.23,\n \"acc_norm_stderr\": 0.04229525846816505\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.17777777777777778,\n\
\ \"acc_stderr\": 0.033027898599017176,\n \"acc_norm\": 0.17777777777777778,\n\
\ \"acc_norm_stderr\": 0.033027898599017176\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.17763157894736842,\n \"acc_stderr\": 0.031103182383123387,\n\
\ \"acc_norm\": 0.17763157894736842,\n \"acc_norm_stderr\": 0.031103182383123387\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.27,\n\
\ \"acc_stderr\": 0.0446196043338474,\n \"acc_norm\": 0.27,\n \
\ \"acc_norm_stderr\": 0.0446196043338474\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.27547169811320754,\n \"acc_stderr\": 0.027495663683724057,\n\
\ \"acc_norm\": 0.27547169811320754,\n \"acc_norm_stderr\": 0.027495663683724057\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.2361111111111111,\n\
\ \"acc_stderr\": 0.03551446610810826,\n \"acc_norm\": 0.2361111111111111,\n\
\ \"acc_norm_stderr\": 0.03551446610810826\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.23,\n \"acc_stderr\": 0.042295258468165044,\n \
\ \"acc_norm\": 0.23,\n \"acc_norm_stderr\": 0.042295258468165044\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"\
acc\": 0.25,\n \"acc_stderr\": 0.04351941398892446,\n \"acc_norm\"\
: 0.25,\n \"acc_norm_stderr\": 0.04351941398892446\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.27,\n \"acc_stderr\": 0.0446196043338474,\n \
\ \"acc_norm\": 0.27,\n \"acc_norm_stderr\": 0.0446196043338474\n },\n\
\ \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.18497109826589594,\n\
\ \"acc_stderr\": 0.02960562398177122,\n \"acc_norm\": 0.18497109826589594,\n\
\ \"acc_norm_stderr\": 0.02960562398177122\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.19607843137254902,\n \"acc_stderr\": 0.03950581861179961,\n\
\ \"acc_norm\": 0.19607843137254902,\n \"acc_norm_stderr\": 0.03950581861179961\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.28,\n \"acc_stderr\": 0.04512608598542128,\n \"acc_norm\": 0.28,\n\
\ \"acc_norm_stderr\": 0.04512608598542128\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.25957446808510637,\n \"acc_stderr\": 0.028659179374292323,\n\
\ \"acc_norm\": 0.25957446808510637,\n \"acc_norm_stderr\": 0.028659179374292323\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.2719298245614035,\n\
\ \"acc_stderr\": 0.041857744240220575,\n \"acc_norm\": 0.2719298245614035,\n\
\ \"acc_norm_stderr\": 0.041857744240220575\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.2482758620689655,\n \"acc_stderr\": 0.03600105692727771,\n\
\ \"acc_norm\": 0.2482758620689655,\n \"acc_norm_stderr\": 0.03600105692727771\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.2751322751322751,\n \"acc_stderr\": 0.02300008685906864,\n \"\
acc_norm\": 0.2751322751322751,\n \"acc_norm_stderr\": 0.02300008685906864\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.19047619047619047,\n\
\ \"acc_stderr\": 0.035122074123020534,\n \"acc_norm\": 0.19047619047619047,\n\
\ \"acc_norm_stderr\": 0.035122074123020534\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \
\ \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.24838709677419354,\n\
\ \"acc_stderr\": 0.024580028921481006,\n \"acc_norm\": 0.24838709677419354,\n\
\ \"acc_norm_stderr\": 0.024580028921481006\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.2512315270935961,\n \"acc_stderr\": 0.030516530732694433,\n\
\ \"acc_norm\": 0.2512315270935961,\n \"acc_norm_stderr\": 0.030516530732694433\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.23,\n \"acc_stderr\": 0.04229525846816505,\n \"acc_norm\"\
: 0.23,\n \"acc_norm_stderr\": 0.04229525846816505\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.22424242424242424,\n \"acc_stderr\": 0.032568666616811015,\n\
\ \"acc_norm\": 0.22424242424242424,\n \"acc_norm_stderr\": 0.032568666616811015\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.22727272727272727,\n \"acc_stderr\": 0.029857515673386407,\n \"\
acc_norm\": 0.22727272727272727,\n \"acc_norm_stderr\": 0.029857515673386407\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.22279792746113988,\n \"acc_stderr\": 0.03003114797764154,\n\
\ \"acc_norm\": 0.22279792746113988,\n \"acc_norm_stderr\": 0.03003114797764154\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.26153846153846155,\n \"acc_stderr\": 0.022282141204204416,\n\
\ \"acc_norm\": 0.26153846153846155,\n \"acc_norm_stderr\": 0.022282141204204416\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.26296296296296295,\n \"acc_stderr\": 0.02684205787383371,\n \
\ \"acc_norm\": 0.26296296296296295,\n \"acc_norm_stderr\": 0.02684205787383371\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.24369747899159663,\n \"acc_stderr\": 0.027886828078380548,\n\
\ \"acc_norm\": 0.24369747899159663,\n \"acc_norm_stderr\": 0.027886828078380548\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.2185430463576159,\n \"acc_stderr\": 0.03374235550425694,\n \"\
acc_norm\": 0.2185430463576159,\n \"acc_norm_stderr\": 0.03374235550425694\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.23486238532110093,\n \"acc_stderr\": 0.018175110510343578,\n \"\
acc_norm\": 0.23486238532110093,\n \"acc_norm_stderr\": 0.018175110510343578\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.39351851851851855,\n \"acc_stderr\": 0.03331747876370312,\n \"\
acc_norm\": 0.39351851851851855,\n \"acc_norm_stderr\": 0.03331747876370312\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.24019607843137256,\n \"acc_stderr\": 0.02998373305591362,\n \"\
acc_norm\": 0.24019607843137256,\n \"acc_norm_stderr\": 0.02998373305591362\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.24050632911392406,\n \"acc_stderr\": 0.027820781981149685,\n \
\ \"acc_norm\": 0.24050632911392406,\n \"acc_norm_stderr\": 0.027820781981149685\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.3542600896860987,\n\
\ \"acc_stderr\": 0.032100621541349864,\n \"acc_norm\": 0.3542600896860987,\n\
\ \"acc_norm_stderr\": 0.032100621541349864\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.24427480916030533,\n \"acc_stderr\": 0.03768335959728745,\n\
\ \"acc_norm\": 0.24427480916030533,\n \"acc_norm_stderr\": 0.03768335959728745\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.24793388429752067,\n \"acc_stderr\": 0.039418975265163025,\n \"\
acc_norm\": 0.24793388429752067,\n \"acc_norm_stderr\": 0.039418975265163025\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.25,\n\
\ \"acc_stderr\": 0.04186091791394607,\n \"acc_norm\": 0.25,\n \
\ \"acc_norm_stderr\": 0.04186091791394607\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.22085889570552147,\n \"acc_stderr\": 0.032591773927421776,\n\
\ \"acc_norm\": 0.22085889570552147,\n \"acc_norm_stderr\": 0.032591773927421776\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.2857142857142857,\n\
\ \"acc_stderr\": 0.042878587513404544,\n \"acc_norm\": 0.2857142857142857,\n\
\ \"acc_norm_stderr\": 0.042878587513404544\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.2524271844660194,\n \"acc_stderr\": 0.04301250399690875,\n\
\ \"acc_norm\": 0.2524271844660194,\n \"acc_norm_stderr\": 0.04301250399690875\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.28205128205128205,\n\
\ \"acc_stderr\": 0.029480360549541194,\n \"acc_norm\": 0.28205128205128205,\n\
\ \"acc_norm_stderr\": 0.029480360549541194\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.26,\n \"acc_stderr\": 0.044084400227680794,\n \
\ \"acc_norm\": 0.26,\n \"acc_norm_stderr\": 0.044084400227680794\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.28607918263090676,\n\
\ \"acc_stderr\": 0.016160871405127522,\n \"acc_norm\": 0.28607918263090676,\n\
\ \"acc_norm_stderr\": 0.016160871405127522\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.22254335260115607,\n \"acc_stderr\": 0.02239421566194282,\n\
\ \"acc_norm\": 0.22254335260115607,\n \"acc_norm_stderr\": 0.02239421566194282\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.24581005586592178,\n\
\ \"acc_stderr\": 0.014400296429225624,\n \"acc_norm\": 0.24581005586592178,\n\
\ \"acc_norm_stderr\": 0.014400296429225624\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.24836601307189543,\n \"acc_stderr\": 0.024739981355113592,\n\
\ \"acc_norm\": 0.24836601307189543,\n \"acc_norm_stderr\": 0.024739981355113592\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.27009646302250806,\n\
\ \"acc_stderr\": 0.025218040373410622,\n \"acc_norm\": 0.27009646302250806,\n\
\ \"acc_norm_stderr\": 0.025218040373410622\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.25617283950617287,\n \"acc_stderr\": 0.0242885336377261,\n\
\ \"acc_norm\": 0.25617283950617287,\n \"acc_norm_stderr\": 0.0242885336377261\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.23049645390070922,\n \"acc_stderr\": 0.025123739226872395,\n \
\ \"acc_norm\": 0.23049645390070922,\n \"acc_norm_stderr\": 0.025123739226872395\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.23402868318122555,\n\
\ \"acc_stderr\": 0.010813585552659693,\n \"acc_norm\": 0.23402868318122555,\n\
\ \"acc_norm_stderr\": 0.010813585552659693\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.21323529411764705,\n \"acc_stderr\": 0.02488097151229427,\n\
\ \"acc_norm\": 0.21323529411764705,\n \"acc_norm_stderr\": 0.02488097151229427\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.26633986928104575,\n \"acc_stderr\": 0.017883188134667192,\n \
\ \"acc_norm\": 0.26633986928104575,\n \"acc_norm_stderr\": 0.017883188134667192\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.3090909090909091,\n\
\ \"acc_stderr\": 0.044262946482000985,\n \"acc_norm\": 0.3090909090909091,\n\
\ \"acc_norm_stderr\": 0.044262946482000985\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.1346938775510204,\n \"acc_stderr\": 0.021855658840811615,\n\
\ \"acc_norm\": 0.1346938775510204,\n \"acc_norm_stderr\": 0.021855658840811615\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.24378109452736318,\n\
\ \"acc_stderr\": 0.030360490154014645,\n \"acc_norm\": 0.24378109452736318,\n\
\ \"acc_norm_stderr\": 0.030360490154014645\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.25,\n \"acc_stderr\": 0.04351941398892446,\n \
\ \"acc_norm\": 0.25,\n \"acc_norm_stderr\": 0.04351941398892446\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.3253012048192771,\n\
\ \"acc_stderr\": 0.036471685236832266,\n \"acc_norm\": 0.3253012048192771,\n\
\ \"acc_norm_stderr\": 0.036471685236832266\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.30994152046783624,\n \"acc_stderr\": 0.035469769593931624,\n\
\ \"acc_norm\": 0.30994152046783624,\n \"acc_norm_stderr\": 0.035469769593931624\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.23255813953488372,\n\
\ \"mc1_stderr\": 0.014789157531080501,\n \"mc2\": 0.37376752806468966,\n\
\ \"mc2_stderr\": 0.013846261711668974\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.6101026045777427,\n \"acc_stderr\": 0.013707547317008463\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.019711902956785442,\n \
\ \"acc_stderr\": 0.0038289829787357117\n }\n}\n```"
repo_url: https://huggingface.co/davanstrien/TinyLlama-1.1B-Chat-v1.0-intel-dpo
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_01_13T15_50_10.127087
path:
- '**/details_harness|arc:challenge|25_2024-01-13T15-50-10.127087.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-01-13T15-50-10.127087.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_01_13T15_50_10.127087
path:
- '**/details_harness|gsm8k|5_2024-01-13T15-50-10.127087.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-01-13T15-50-10.127087.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_01_13T15_50_10.127087
path:
- '**/details_harness|hellaswag|10_2024-01-13T15-50-10.127087.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-01-13T15-50-10.127087.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_01_13T15_50_10.127087
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-13T15-50-10.127087.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-13T15-50-10.127087.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-13T15-50-10.127087.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-13T15-50-10.127087.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-13T15-50-10.127087.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-13T15-50-10.127087.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-13T15-50-10.127087.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-13T15-50-10.127087.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-13T15-50-10.127087.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-13T15-50-10.127087.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-13T15-50-10.127087.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-13T15-50-10.127087.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-13T15-50-10.127087.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-13T15-50-10.127087.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-13T15-50-10.127087.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-13T15-50-10.127087.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-13T15-50-10.127087.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-13T15-50-10.127087.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-13T15-50-10.127087.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-13T15-50-10.127087.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-13T15-50-10.127087.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-13T15-50-10.127087.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-13T15-50-10.127087.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-13T15-50-10.127087.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-13T15-50-10.127087.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-13T15-50-10.127087.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-13T15-50-10.127087.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-13T15-50-10.127087.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-13T15-50-10.127087.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-13T15-50-10.127087.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-13T15-50-10.127087.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-13T15-50-10.127087.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-13T15-50-10.127087.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-13T15-50-10.127087.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-01-13T15-50-10.127087.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-13T15-50-10.127087.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-13T15-50-10.127087.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-13T15-50-10.127087.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-01-13T15-50-10.127087.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-01-13T15-50-10.127087.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-13T15-50-10.127087.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-13T15-50-10.127087.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-13T15-50-10.127087.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-13T15-50-10.127087.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-13T15-50-10.127087.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-13T15-50-10.127087.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-13T15-50-10.127087.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-13T15-50-10.127087.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-13T15-50-10.127087.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-13T15-50-10.127087.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-13T15-50-10.127087.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-13T15-50-10.127087.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-13T15-50-10.127087.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-01-13T15-50-10.127087.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-13T15-50-10.127087.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-01-13T15-50-10.127087.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-13T15-50-10.127087.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-13T15-50-10.127087.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-13T15-50-10.127087.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-13T15-50-10.127087.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-13T15-50-10.127087.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-13T15-50-10.127087.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-13T15-50-10.127087.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-13T15-50-10.127087.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-13T15-50-10.127087.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-13T15-50-10.127087.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-13T15-50-10.127087.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-13T15-50-10.127087.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-13T15-50-10.127087.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-13T15-50-10.127087.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-13T15-50-10.127087.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-13T15-50-10.127087.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-13T15-50-10.127087.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-13T15-50-10.127087.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-13T15-50-10.127087.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-13T15-50-10.127087.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-13T15-50-10.127087.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-13T15-50-10.127087.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-13T15-50-10.127087.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-13T15-50-10.127087.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-13T15-50-10.127087.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-13T15-50-10.127087.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-13T15-50-10.127087.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-13T15-50-10.127087.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-13T15-50-10.127087.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-13T15-50-10.127087.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-13T15-50-10.127087.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-13T15-50-10.127087.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-13T15-50-10.127087.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-13T15-50-10.127087.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-13T15-50-10.127087.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-01-13T15-50-10.127087.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-13T15-50-10.127087.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-13T15-50-10.127087.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-13T15-50-10.127087.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-01-13T15-50-10.127087.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-01-13T15-50-10.127087.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-13T15-50-10.127087.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-13T15-50-10.127087.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-13T15-50-10.127087.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-13T15-50-10.127087.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-13T15-50-10.127087.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-13T15-50-10.127087.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-13T15-50-10.127087.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-13T15-50-10.127087.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-13T15-50-10.127087.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-13T15-50-10.127087.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-13T15-50-10.127087.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-13T15-50-10.127087.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-13T15-50-10.127087.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-01-13T15-50-10.127087.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-13T15-50-10.127087.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-01-13T15-50-10.127087.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-13T15-50-10.127087.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_01_13T15_50_10.127087
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-13T15-50-10.127087.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-13T15-50-10.127087.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_01_13T15_50_10.127087
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-13T15-50-10.127087.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-13T15-50-10.127087.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_01_13T15_50_10.127087
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-13T15-50-10.127087.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-13T15-50-10.127087.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_01_13T15_50_10.127087
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-13T15-50-10.127087.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-13T15-50-10.127087.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_01_13T15_50_10.127087
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-13T15-50-10.127087.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-13T15-50-10.127087.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_01_13T15_50_10.127087
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-13T15-50-10.127087.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-13T15-50-10.127087.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_01_13T15_50_10.127087
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-13T15-50-10.127087.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-13T15-50-10.127087.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_01_13T15_50_10.127087
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-13T15-50-10.127087.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-13T15-50-10.127087.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_01_13T15_50_10.127087
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-13T15-50-10.127087.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-13T15-50-10.127087.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_01_13T15_50_10.127087
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-13T15-50-10.127087.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-13T15-50-10.127087.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_01_13T15_50_10.127087
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-13T15-50-10.127087.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-13T15-50-10.127087.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_01_13T15_50_10.127087
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-13T15-50-10.127087.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-13T15-50-10.127087.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_01_13T15_50_10.127087
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-13T15-50-10.127087.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-13T15-50-10.127087.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_01_13T15_50_10.127087
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-13T15-50-10.127087.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-13T15-50-10.127087.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_01_13T15_50_10.127087
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-13T15-50-10.127087.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-13T15-50-10.127087.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_01_13T15_50_10.127087
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-13T15-50-10.127087.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-13T15-50-10.127087.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_01_13T15_50_10.127087
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-13T15-50-10.127087.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-13T15-50-10.127087.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_01_13T15_50_10.127087
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-13T15-50-10.127087.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-13T15-50-10.127087.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_01_13T15_50_10.127087
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-13T15-50-10.127087.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-13T15-50-10.127087.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_01_13T15_50_10.127087
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-13T15-50-10.127087.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-13T15-50-10.127087.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_01_13T15_50_10.127087
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-13T15-50-10.127087.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-13T15-50-10.127087.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_01_13T15_50_10.127087
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-13T15-50-10.127087.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-13T15-50-10.127087.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_01_13T15_50_10.127087
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-13T15-50-10.127087.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-13T15-50-10.127087.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_01_13T15_50_10.127087
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-13T15-50-10.127087.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-13T15-50-10.127087.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_01_13T15_50_10.127087
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-13T15-50-10.127087.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-13T15-50-10.127087.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_01_13T15_50_10.127087
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-13T15-50-10.127087.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-13T15-50-10.127087.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_01_13T15_50_10.127087
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-13T15-50-10.127087.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-13T15-50-10.127087.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_01_13T15_50_10.127087
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-13T15-50-10.127087.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-13T15-50-10.127087.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_01_13T15_50_10.127087
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-13T15-50-10.127087.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-13T15-50-10.127087.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_01_13T15_50_10.127087
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-13T15-50-10.127087.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-13T15-50-10.127087.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_01_13T15_50_10.127087
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-13T15-50-10.127087.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-13T15-50-10.127087.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_01_13T15_50_10.127087
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-13T15-50-10.127087.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-13T15-50-10.127087.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_01_13T15_50_10.127087
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-13T15-50-10.127087.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-13T15-50-10.127087.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_01_13T15_50_10.127087
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-13T15-50-10.127087.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-13T15-50-10.127087.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_01_13T15_50_10.127087
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-01-13T15-50-10.127087.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-01-13T15-50-10.127087.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_01_13T15_50_10.127087
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-13T15-50-10.127087.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-13T15-50-10.127087.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_01_13T15_50_10.127087
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-13T15-50-10.127087.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-13T15-50-10.127087.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_01_13T15_50_10.127087
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-13T15-50-10.127087.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-13T15-50-10.127087.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_01_13T15_50_10.127087
path:
- '**/details_harness|hendrycksTest-management|5_2024-01-13T15-50-10.127087.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-01-13T15-50-10.127087.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_01_13T15_50_10.127087
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-01-13T15-50-10.127087.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-01-13T15-50-10.127087.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_01_13T15_50_10.127087
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-13T15-50-10.127087.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-13T15-50-10.127087.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_01_13T15_50_10.127087
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-13T15-50-10.127087.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-13T15-50-10.127087.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_01_13T15_50_10.127087
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-13T15-50-10.127087.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-13T15-50-10.127087.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_01_13T15_50_10.127087
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-13T15-50-10.127087.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-13T15-50-10.127087.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_01_13T15_50_10.127087
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-13T15-50-10.127087.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-13T15-50-10.127087.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_01_13T15_50_10.127087
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-13T15-50-10.127087.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-13T15-50-10.127087.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_01_13T15_50_10.127087
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-13T15-50-10.127087.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-13T15-50-10.127087.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_01_13T15_50_10.127087
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-13T15-50-10.127087.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-13T15-50-10.127087.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_01_13T15_50_10.127087
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-13T15-50-10.127087.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-13T15-50-10.127087.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_01_13T15_50_10.127087
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-13T15-50-10.127087.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-13T15-50-10.127087.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_01_13T15_50_10.127087
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-13T15-50-10.127087.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-13T15-50-10.127087.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_01_13T15_50_10.127087
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-13T15-50-10.127087.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-13T15-50-10.127087.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_01_13T15_50_10.127087
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-13T15-50-10.127087.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-13T15-50-10.127087.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_01_13T15_50_10.127087
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-01-13T15-50-10.127087.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-01-13T15-50-10.127087.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_01_13T15_50_10.127087
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-13T15-50-10.127087.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-13T15-50-10.127087.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_01_13T15_50_10.127087
path:
- '**/details_harness|hendrycksTest-virology|5_2024-01-13T15-50-10.127087.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-01-13T15-50-10.127087.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_01_13T15_50_10.127087
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-13T15-50-10.127087.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-13T15-50-10.127087.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_01_13T15_50_10.127087
path:
- '**/details_harness|truthfulqa:mc|0_2024-01-13T15-50-10.127087.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-01-13T15-50-10.127087.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_01_13T15_50_10.127087
path:
- '**/details_harness|winogrande|5_2024-01-13T15-50-10.127087.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-01-13T15-50-10.127087.parquet'
- config_name: results
data_files:
- split: 2024_01_13T15_50_10.127087
path:
- results_2024-01-13T15-50-10.127087.parquet
- split: latest
path:
- results_2024-01-13T15-50-10.127087.parquet
---
# Dataset Card for Evaluation run of davanstrien/TinyLlama-1.1B-Chat-v1.0-intel-dpo
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [davanstrien/TinyLlama-1.1B-Chat-v1.0-intel-dpo](https://huggingface.co/davanstrien/TinyLlama-1.1B-Chat-v1.0-intel-dpo) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.

The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the latest results.

An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_davanstrien__TinyLlama-1.1B-Chat-v1.0-intel-dpo",
"harness_winogrande_5",
split="train")
```
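Once a config is loaded, per-task metrics can be pulled out of a nested mapping shaped like the JSON shown under "Latest results" below. A minimal sketch with illustrative values (not the real scores from this run):

```python
# Sketch: extracting per-task accuracy from a results-style mapping.
# The task names mirror the harness naming scheme used in this card;
# the numeric values here are illustrative placeholders.
results = {
    "all": {"acc": 0.2576, "acc_norm": 0.2584},
    "harness|arc:challenge|25": {"acc": 0.3490, "acc_norm": 0.3584},
    "harness|hellaswag|10": {"acc": 0.4582, "acc_norm": 0.6129},
}

# Collect acc_norm for every task, skipping the aggregate "all" entry.
per_task = {
    task: metrics["acc_norm"]
    for task, metrics in results.items()
    if task != "all"
}

# Find the task with the highest normalized accuracy.
best_task = max(per_task, key=per_task.get)
print(best_task)  # harness|hellaswag|10
```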
## Latest results
These are the [latest results from run 2024-01-13T15:50:10.127087](https://huggingface.co/datasets/open-llm-leaderboard/details_davanstrien__TinyLlama-1.1B-Chat-v1.0-intel-dpo/blob/main/results_2024-01-13T15-50-10.127087.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.2576008328471532,
"acc_stderr": 0.03077282985808684,
"acc_norm": 0.25844020358409664,
"acc_norm_stderr": 0.03151828137960803,
"mc1": 0.23255813953488372,
"mc1_stderr": 0.014789157531080501,
"mc2": 0.37376752806468966,
"mc2_stderr": 0.013846261711668974
},
"harness|arc:challenge|25": {
"acc": 0.34897610921501704,
"acc_stderr": 0.013928933461382494,
"acc_norm": 0.3583617747440273,
"acc_norm_stderr": 0.014012883334859866
},
"harness|hellaswag|10": {
"acc": 0.45817566221868156,
"acc_stderr": 0.004972293764978727,
"acc_norm": 0.6129257120095598,
"acc_norm_stderr": 0.0048608542408219695
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.23,
"acc_stderr": 0.04229525846816505,
"acc_norm": 0.23,
"acc_norm_stderr": 0.04229525846816505
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.17777777777777778,
"acc_stderr": 0.033027898599017176,
"acc_norm": 0.17777777777777778,
"acc_norm_stderr": 0.033027898599017176
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.17763157894736842,
"acc_stderr": 0.031103182383123387,
"acc_norm": 0.17763157894736842,
"acc_norm_stderr": 0.031103182383123387
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.27,
"acc_stderr": 0.0446196043338474,
"acc_norm": 0.27,
"acc_norm_stderr": 0.0446196043338474
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.27547169811320754,
"acc_stderr": 0.027495663683724057,
"acc_norm": 0.27547169811320754,
"acc_norm_stderr": 0.027495663683724057
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.2361111111111111,
"acc_stderr": 0.03551446610810826,
"acc_norm": 0.2361111111111111,
"acc_norm_stderr": 0.03551446610810826
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.23,
"acc_stderr": 0.042295258468165044,
"acc_norm": 0.23,
"acc_norm_stderr": 0.042295258468165044
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.25,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.25,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.27,
"acc_stderr": 0.0446196043338474,
"acc_norm": 0.27,
"acc_norm_stderr": 0.0446196043338474
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.18497109826589594,
"acc_stderr": 0.02960562398177122,
"acc_norm": 0.18497109826589594,
"acc_norm_stderr": 0.02960562398177122
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.19607843137254902,
"acc_stderr": 0.03950581861179961,
"acc_norm": 0.19607843137254902,
"acc_norm_stderr": 0.03950581861179961
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.28,
"acc_stderr": 0.04512608598542128,
"acc_norm": 0.28,
"acc_norm_stderr": 0.04512608598542128
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.25957446808510637,
"acc_stderr": 0.028659179374292323,
"acc_norm": 0.25957446808510637,
"acc_norm_stderr": 0.028659179374292323
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.2719298245614035,
"acc_stderr": 0.041857744240220575,
"acc_norm": 0.2719298245614035,
"acc_norm_stderr": 0.041857744240220575
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.2482758620689655,
"acc_stderr": 0.03600105692727771,
"acc_norm": 0.2482758620689655,
"acc_norm_stderr": 0.03600105692727771
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.2751322751322751,
"acc_stderr": 0.02300008685906864,
"acc_norm": 0.2751322751322751,
"acc_norm_stderr": 0.02300008685906864
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.19047619047619047,
"acc_stderr": 0.035122074123020534,
"acc_norm": 0.19047619047619047,
"acc_norm_stderr": 0.035122074123020534
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.24838709677419354,
"acc_stderr": 0.024580028921481006,
"acc_norm": 0.24838709677419354,
"acc_norm_stderr": 0.024580028921481006
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.2512315270935961,
"acc_stderr": 0.030516530732694433,
"acc_norm": 0.2512315270935961,
"acc_norm_stderr": 0.030516530732694433
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.23,
"acc_stderr": 0.04229525846816505,
"acc_norm": 0.23,
"acc_norm_stderr": 0.04229525846816505
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.22424242424242424,
"acc_stderr": 0.032568666616811015,
"acc_norm": 0.22424242424242424,
"acc_norm_stderr": 0.032568666616811015
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.22727272727272727,
"acc_stderr": 0.029857515673386407,
"acc_norm": 0.22727272727272727,
"acc_norm_stderr": 0.029857515673386407
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.22279792746113988,
"acc_stderr": 0.03003114797764154,
"acc_norm": 0.22279792746113988,
"acc_norm_stderr": 0.03003114797764154
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.26153846153846155,
"acc_stderr": 0.022282141204204416,
"acc_norm": 0.26153846153846155,
"acc_norm_stderr": 0.022282141204204416
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.26296296296296295,
"acc_stderr": 0.02684205787383371,
"acc_norm": 0.26296296296296295,
"acc_norm_stderr": 0.02684205787383371
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.24369747899159663,
"acc_stderr": 0.027886828078380548,
"acc_norm": 0.24369747899159663,
"acc_norm_stderr": 0.027886828078380548
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.2185430463576159,
"acc_stderr": 0.03374235550425694,
"acc_norm": 0.2185430463576159,
"acc_norm_stderr": 0.03374235550425694
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.23486238532110093,
"acc_stderr": 0.018175110510343578,
"acc_norm": 0.23486238532110093,
"acc_norm_stderr": 0.018175110510343578
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.39351851851851855,
"acc_stderr": 0.03331747876370312,
"acc_norm": 0.39351851851851855,
"acc_norm_stderr": 0.03331747876370312
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.24019607843137256,
"acc_stderr": 0.02998373305591362,
"acc_norm": 0.24019607843137256,
"acc_norm_stderr": 0.02998373305591362
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.24050632911392406,
"acc_stderr": 0.027820781981149685,
"acc_norm": 0.24050632911392406,
"acc_norm_stderr": 0.027820781981149685
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.3542600896860987,
"acc_stderr": 0.032100621541349864,
"acc_norm": 0.3542600896860987,
"acc_norm_stderr": 0.032100621541349864
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.24427480916030533,
"acc_stderr": 0.03768335959728745,
"acc_norm": 0.24427480916030533,
"acc_norm_stderr": 0.03768335959728745
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.24793388429752067,
"acc_stderr": 0.039418975265163025,
"acc_norm": 0.24793388429752067,
"acc_norm_stderr": 0.039418975265163025
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.25,
"acc_stderr": 0.04186091791394607,
"acc_norm": 0.25,
"acc_norm_stderr": 0.04186091791394607
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.22085889570552147,
"acc_stderr": 0.032591773927421776,
"acc_norm": 0.22085889570552147,
"acc_norm_stderr": 0.032591773927421776
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.2857142857142857,
"acc_stderr": 0.042878587513404544,
"acc_norm": 0.2857142857142857,
"acc_norm_stderr": 0.042878587513404544
},
"harness|hendrycksTest-management|5": {
"acc": 0.2524271844660194,
"acc_stderr": 0.04301250399690875,
"acc_norm": 0.2524271844660194,
"acc_norm_stderr": 0.04301250399690875
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.28205128205128205,
"acc_stderr": 0.029480360549541194,
"acc_norm": 0.28205128205128205,
"acc_norm_stderr": 0.029480360549541194
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.26,
"acc_stderr": 0.044084400227680794,
"acc_norm": 0.26,
"acc_norm_stderr": 0.044084400227680794
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.28607918263090676,
"acc_stderr": 0.016160871405127522,
"acc_norm": 0.28607918263090676,
"acc_norm_stderr": 0.016160871405127522
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.22254335260115607,
"acc_stderr": 0.02239421566194282,
"acc_norm": 0.22254335260115607,
"acc_norm_stderr": 0.02239421566194282
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.24581005586592178,
"acc_stderr": 0.014400296429225624,
"acc_norm": 0.24581005586592178,
"acc_norm_stderr": 0.014400296429225624
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.24836601307189543,
"acc_stderr": 0.024739981355113592,
"acc_norm": 0.24836601307189543,
"acc_norm_stderr": 0.024739981355113592
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.27009646302250806,
"acc_stderr": 0.025218040373410622,
"acc_norm": 0.27009646302250806,
"acc_norm_stderr": 0.025218040373410622
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.25617283950617287,
"acc_stderr": 0.0242885336377261,
"acc_norm": 0.25617283950617287,
"acc_norm_stderr": 0.0242885336377261
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.23049645390070922,
"acc_stderr": 0.025123739226872395,
"acc_norm": 0.23049645390070922,
"acc_norm_stderr": 0.025123739226872395
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.23402868318122555,
"acc_stderr": 0.010813585552659693,
"acc_norm": 0.23402868318122555,
"acc_norm_stderr": 0.010813585552659693
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.21323529411764705,
"acc_stderr": 0.02488097151229427,
"acc_norm": 0.21323529411764705,
"acc_norm_stderr": 0.02488097151229427
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.26633986928104575,
"acc_stderr": 0.017883188134667192,
"acc_norm": 0.26633986928104575,
"acc_norm_stderr": 0.017883188134667192
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.3090909090909091,
"acc_stderr": 0.044262946482000985,
"acc_norm": 0.3090909090909091,
"acc_norm_stderr": 0.044262946482000985
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.1346938775510204,
"acc_stderr": 0.021855658840811615,
"acc_norm": 0.1346938775510204,
"acc_norm_stderr": 0.021855658840811615
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.24378109452736318,
"acc_stderr": 0.030360490154014645,
"acc_norm": 0.24378109452736318,
"acc_norm_stderr": 0.030360490154014645
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.25,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.25,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-virology|5": {
"acc": 0.3253012048192771,
"acc_stderr": 0.036471685236832266,
"acc_norm": 0.3253012048192771,
"acc_norm_stderr": 0.036471685236832266
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.30994152046783624,
"acc_stderr": 0.035469769593931624,
"acc_norm": 0.30994152046783624,
"acc_norm_stderr": 0.035469769593931624
},
"harness|truthfulqa:mc|0": {
"mc1": 0.23255813953488372,
"mc1_stderr": 0.014789157531080501,
"mc2": 0.37376752806468966,
"mc2_stderr": 0.013846261711668974
},
"harness|winogrande|5": {
"acc": 0.6101026045777427,
"acc_stderr": 0.013707547317008463
},
"harness|gsm8k|5": {
"acc": 0.019711902956785442,
"acc_stderr": 0.0038289829787357117
}
}
```
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
astrin0321/TEam2Zip | ---
license: apache-2.0
---
|
itamarcard/veto | ---
license: openrail
---
|
keshan/amateur_drawings-controlnet-dataset | ---
dataset_info:
features:
- name: original_image
dtype: image
- name: segment_image
dtype: image
- name: keypoint_image
dtype: image
- name: caption
dtype: string
splits:
- name: train
num_bytes: 49810154042.961
num_examples: 177723
download_size: 50168061092
dataset_size: 49810154042.961
---
# Dataset Card for "amateur_drawings-controlnet-dataset"
WIP... Come back later.... |
AppleHarem/midori_bluearchive | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of midori (Blue Archive)
This is the dataset of midori (Blue Archive), containing 200 images and their tags.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)) ([LittleAppleWebUI](https://github.com/LittleApple-fp16/LittleAppleWebUI)).
| Name | Images | Download | Description |
|:----------------|---------:|:----------------------------------------|:-----------------------------------------------------------------------------------------|
| raw | 200 | [Download](dataset-raw.zip) | Raw data with meta information. |
| raw-stage3 | 556 | [Download](dataset-raw-stage3.zip) | 3-stage cropped raw data with meta information. |
| raw-stage3-eyes | 676 | [Download](dataset-raw-stage3-eyes.zip) | 3-stage cropped (with eye-focus) raw data with meta information. |
| 384x512 | 200 | [Download](dataset-384x512.zip) | 384x512 aligned dataset. |
| 512x704 | 200 | [Download](dataset-512x704.zip) | 512x704 aligned dataset. |
| 640x880 | 200 | [Download](dataset-640x880.zip) | 640x880 aligned dataset. |
| stage3-640 | 556 | [Download](dataset-stage3-640.zip) | 3-stage cropped dataset with the shorter side not exceeding 640 pixels. |
| stage3-800 | 556 | [Download](dataset-stage3-800.zip) | 3-stage cropped dataset with the shorter side not exceeding 800 pixels. |
| stage3-p512-640 | 518 | [Download](dataset-stage3-p512-640.zip) | 3-stage cropped dataset with the area not less than 512x512 pixels. |
| stage3-eyes-640 | 676 | [Download](dataset-stage3-eyes-640.zip) | 3-stage cropped (with eye-focus) dataset with the shorter side not exceeding 640 pixels. |
| stage3-eyes-800 | 676 | [Download](dataset-stage3-eyes-800.zip) | 3-stage cropped (with eye-focus) dataset with the shorter side not exceeding 800 pixels. |
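For reference, each pack above resolves to a plain `resolve/main` URL on the Hub. A minimal sketch (repo id taken from this card, filenames from the table; it only builds the URLs, no download is performed):

```python
# Build direct download URLs for the zip packs listed in the table above.
# Hub convention: https://huggingface.co/datasets/<repo_id>/resolve/main/<filename>
REPO_ID = "AppleHarem/midori_bluearchive"

def pack_url(filename: str) -> str:
    """Return the direct download URL for a file in this dataset repo."""
    return f"https://huggingface.co/datasets/{REPO_ID}/resolve/main/{filename}"

# A few of the packs from the table (names assumed from the Download column).
urls = {name: pack_url(f"dataset-{name}.zip")
        for name in ["raw", "384x512", "512x704", "640x880"]}
for name, url in urls.items():
    print(name, url)
```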
|
CodecSR/librispeech_asr_test_48k_synth | ---
dataset_info:
features:
- name: audio
dtype:
audio:
sampling_rate: 48000
- name: text
dtype: string
- name: id
dtype: string
splits:
- name: original
num_bytes: 1238771045.0
num_examples: 5559
- name: academicodec_hifi_16k_320d_large_uni
num_bytes: 3713134600.566
num_examples: 5559
- name: academicodec_hifi_24k_320d
num_bytes: 3713134600.566
num_examples: 5559
- name: audiodec_24k_300d
num_bytes: 3715771086.566
num_examples: 5559
- name: audiodec_48k_300d_uni
num_bytes: 3715771086.566
num_examples: 5559
- name: dac_16k
num_bytes: 3714427152.566
num_examples: 5559
- name: dac_24k
num_bytes: 3714427156.566
num_examples: 5559
- name: dac_44k
num_bytes: 3714427158.566
num_examples: 5559
- name: encodec_24k_12bps
num_bytes: 3714427158.566
num_examples: 5559
- name: encodec_24k_1_5bps
num_bytes: 3714427152.566
num_examples: 5559
- name: encodec_24k_24bps
num_bytes: 3714427160.566
num_examples: 5559
- name: encodec_24k_3bps
num_bytes: 3714427152.566
num_examples: 5559
- name: encodec_24k_6bps
num_bytes: 3714427158.566
num_examples: 5559
- name: facodec_16k
num_bytes: 3713970686.566
num_examples: 5559
- name: funcodec_en_libritts_16k_nq32ds320
num_bytes: 3714427160.566
num_examples: 5559
- name: funcodec_en_libritts_16k_nq32ds640
num_bytes: 3714427158.566
num_examples: 5559
- name: funcodec_zh_en_16k_nq32ds320
num_bytes: 3714427160.566
num_examples: 5559
- name: funcodec_zh_en_16k_nq32ds640
num_bytes: 3714427158.566
num_examples: 5559
- name: language_codec_chinese_24k_nq8_12kbps
num_bytes: 3715715084.007
num_examples: 5559
- name: language_codec_paper_24k_nq8_12kbps
num_bytes: 3715715084.007
num_examples: 5559
- name: speech_tokenizer_16k
num_bytes: 3715715084.007
num_examples: 5559
download_size: 71657007047
dataset_size: 75530824246.64304
configs:
- config_name: default
data_files:
- split: original
path: data/original-*
- split: academicodec_hifi_16k_320d_large_uni
path: data/academicodec_hifi_16k_320d_large_uni-*
- split: academicodec_hifi_24k_320d
path: data/academicodec_hifi_24k_320d-*
- split: audiodec_24k_300d
path: data/audiodec_24k_300d-*
- split: audiodec_48k_300d_uni
path: data/audiodec_48k_300d_uni-*
- split: dac_16k
path: data/dac_16k-*
- split: dac_24k
path: data/dac_24k-*
- split: dac_44k
path: data/dac_44k-*
- split: encodec_24k_12bps
path: data/encodec_24k_12bps-*
- split: encodec_24k_1_5bps
path: data/encodec_24k_1_5bps-*
- split: encodec_24k_24bps
path: data/encodec_24k_24bps-*
- split: encodec_24k_3bps
path: data/encodec_24k_3bps-*
- split: encodec_24k_6bps
path: data/encodec_24k_6bps-*
- split: facodec_16k
path: data/facodec_16k-*
- split: funcodec_en_libritts_16k_nq32ds320
path: data/funcodec_en_libritts_16k_nq32ds320-*
- split: funcodec_en_libritts_16k_nq32ds640
path: data/funcodec_en_libritts_16k_nq32ds640-*
- split: funcodec_zh_en_16k_nq32ds320
path: data/funcodec_zh_en_16k_nq32ds320-*
- split: funcodec_zh_en_16k_nq32ds640
path: data/funcodec_zh_en_16k_nq32ds640-*
- split: language_codec_chinese_24k_nq8_12kbps
path: data/language_codec_chinese_24k_nq8_12kbps-*
- split: language_codec_paper_24k_nq8_12kbps
path: data/language_codec_paper_24k_nq8_12kbps-*
- split: speech_tokenizer_16k
path: data/speech_tokenizer_16k-*
---
|
Minglii/ee15 | ---
dataset_info:
features:
- name: data
struct:
- name: conversations
list:
- name: from
dtype: string
- name: value
dtype: string
- name: id
dtype: string
splits:
- name: train
num_bytes: 5329018
num_examples: 7800
download_size: 3049837
dataset_size: 5329018
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "ee15"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
instrumentalyogarelax/gilson001 | ---
license: openrail
---
|
mole-code/com.theokanning.openai-data | ---
dataset_info:
features:
- name: code
dtype: string
- name: apis
sequence: string
- name: extract_api
dtype: string
splits:
- name: train
num_bytes: 2942476
num_examples: 467
download_size: 927051
dataset_size: 2942476
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
canristiian/drug_rule_params | ---
license: apache-2.0
---
|
loubnabnl/python_comment_code_ratio_08 | ---
dataset_info:
features:
- name: content
dtype: string
- name: avg_line_length
dtype: float64
- name: max_line_length
dtype: int64
- name: alphanum_fraction
dtype: float64
- name: licenses
sequence: string
- name: repository_name
dtype: string
- name: path
dtype: string
- name: size
dtype: int64
- name: lang
dtype: string
- name: nl_text
dtype: string
- name: nl_size
dtype: int64
- name: nl_ratio
dtype: float64
splits:
- name: train
num_bytes: 1272677.3664
num_examples: 131
download_size: 324517
dataset_size: 1272677.3664
---
# Dataset Card for "python_comment_code_ratio_08"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
one-sec-cv12/chunk_217 | ---
dataset_info:
features:
- name: audio
dtype:
audio:
sampling_rate: 16000
splits:
- name: train
num_bytes: 21216042720.75
num_examples: 220890
download_size: 20002988350
dataset_size: 21216042720.75
---
# Dataset Card for "chunk_217"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Atipico1/trivia_test | ---
dataset_info:
- config_name: adversary
features:
- name: question
dtype: string
- name: answers
sequence: string
- name: ctxs
list:
- name: hasanswer
dtype: bool
- name: score
dtype: float64
- name: text
dtype: string
- name: title
dtype: string
- name: gpt_answer_sentence
dtype: string
- name: gpt_adv_sentence
sequence: string
- name: is_valid_adv_sentence
dtype: bool
- name: gpt_adv_passage
sequence: string
- name: is_valid_adv_passage
dtype: bool
splits:
- name: train
num_bytes: 91910594
num_examples: 11313
download_size: 52541960
dataset_size: 91910594
- config_name: adversary_v2
features:
- name: question
dtype: string
- name: answers
sequence: string
- name: ctxs
list:
- name: hasanswer
dtype: bool
- name: score
dtype: float64
- name: text
dtype: string
- name: title
dtype: string
- name: gpt_answer_sentence
dtype: string
- name: gpt_adv_sentence
sequence: string
- name: is_valid_adv_sentence
dtype: bool
- name: gpt_adv_passage
sequence: string
- name: is_valid_adv_passage
dtype: bool
splits:
- name: train
num_bytes: 91910491
num_examples: 11313
download_size: 52546819
dataset_size: 91910491
- config_name: adversary_v2-sent
features:
- name: question
dtype: string
- name: answers
sequence: string
- name: gpt_answer_sentence
dtype: string
- name: gpt_adv_sentence
sequence: string
- name: is_valid_adv_sentence
dtype: bool
- name: gpt_adv_passage
sequence: string
- name: is_valid_adv_passage
dtype: bool
- name: ctxs
list:
- name: hasanswer
dtype: bool
- name: score
dtype: float32
- name: text
dtype: string
splits:
- name: train
num_bytes: 35782242
num_examples: 11313
download_size: 20210643
dataset_size: 35782242
- config_name: conflict
features:
- name: question
dtype: string
- name: answers
sequence: string
- name: ctxs
list:
- name: hasanswer
dtype: bool
- name: score
dtype: float64
- name: text
dtype: string
- name: title
dtype: string
- name: gpt_answer_sentence
dtype: string
- name: entity_type
dtype: string
- name: similar_entity
dtype: string
- name: similar_entity_score
dtype: float32
- name: random_entity
dtype: string
- name: random_entity_score
dtype: float64
splits:
- name: train
num_bytes: 79041831
num_examples: 11313
download_size: 45974504
dataset_size: 79041831
- config_name: conflict_v1
features:
- name: question
dtype: string
- name: answers
sequence: string
- name: ctxs
list:
- name: hasanswer
dtype: bool
- name: score
dtype: float64
- name: text
dtype: string
- name: title
dtype: string
- name: gpt_answer_sentence
dtype: string
- name: entity_type
dtype: string
- name: similar_entity
dtype: string
- name: similar_entity_score
dtype: float32
- name: random_entity
dtype: string
- name: random_entity_score
dtype: float64
- name: gpt_conflict_sentence
sequence: string
- name: is_valid_conflict_sentence
dtype: bool
- name: gpt_conflict_passage
sequence: string
- name: is_valid_conflict_passage
dtype: bool
splits:
- name: train
num_bytes: 82500749
num_examples: 11313
download_size: 48085357
dataset_size: 82500749
- config_name: conflict_v1-sent
features:
- name: question
dtype: string
- name: answers
sequence: string
- name: gpt_answer_sentence
dtype: string
- name: entity_type
dtype: string
- name: similar_entity
dtype: string
- name: similar_entity_score
dtype: float32
- name: random_entity
dtype: string
- name: random_entity_score
dtype: float64
- name: gpt_conflict_sentence
sequence: string
- name: is_valid_conflict_sentence
dtype: bool
- name: gpt_conflict_passage
sequence: string
- name: is_valid_conflict_passage
dtype: bool
- name: hasanswer
dtype: bool
- name: ctxs
list:
- name: hasanswer
dtype: bool
- name: score
dtype: float32
- name: text
dtype: string
splits:
- name: train
num_bytes: 17992699
num_examples: 11313
download_size: 11026959
dataset_size: 17992699
- config_name: default
features:
- name: question
dtype: string
- name: answers
sequence: string
- name: ctxs
list:
- name: hasanswer
dtype: bool
- name: score
dtype: float64
- name: text
dtype: string
- name: title
dtype: string
splits:
- name: train
num_bytes: 77273159
num_examples: 11313
download_size: 44781875
dataset_size: 77273159
configs:
- config_name: adversary
data_files:
- split: train
path: adversary/train-*
- config_name: adversary_v2
data_files:
- split: train
path: adversary_v2/train-*
- config_name: adversary_v2-sent
data_files:
- split: train
path: adversary_v2-sent/train-*
- config_name: conflict
data_files:
- split: train
path: conflict/train-*
- config_name: conflict_v1
data_files:
- split: train
path: conflict_v1/train-*
- config_name: conflict_v1-sent
data_files:
- split: train
path: conflict_v1-sent/train-*
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
open-llm-leaderboard/details_AA051610__testtest | ---
pretty_name: Evaluation run of AA051610/testtest
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [AA051610/testtest](https://huggingface.co/AA051610/testtest) on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of\
\ the evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_AA051610__testtest\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-01-06T11:01:46.425546](https://huggingface.co/datasets/open-llm-leaderboard/details_AA051610__testtest/blob/main/results_2024-01-06T11-01-46.425546.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.7617892127804575,\n\
\ \"acc_stderr\": 0.028264083954934467,\n \"acc_norm\": 0.7670102557925514,\n\
\ \"acc_norm_stderr\": 0.028787360961551947,\n \"mc1\": 0.5348837209302325,\n\
\ \"mc1_stderr\": 0.017460849975873972,\n \"mc2\": 0.6990494623892585,\n\
\ \"mc2_stderr\": 0.014341231959910994\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.6860068259385665,\n \"acc_stderr\": 0.013562691224726297,\n\
\ \"acc_norm\": 0.7081911262798635,\n \"acc_norm_stderr\": 0.013284525292403515\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6546504680342561,\n\
\ \"acc_stderr\": 0.0047451035439012934,\n \"acc_norm\": 0.8488348934475204,\n\
\ \"acc_norm_stderr\": 0.0035747765941085037\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.45,\n \"acc_stderr\": 0.05,\n \"acc_norm\"\
: 0.45,\n \"acc_norm_stderr\": 0.05\n },\n \"harness|hendrycksTest-anatomy|5\"\
: {\n \"acc\": 0.7481481481481481,\n \"acc_stderr\": 0.03749850709174021,\n\
\ \"acc_norm\": 0.7481481481481481,\n \"acc_norm_stderr\": 0.03749850709174021\n\
\ },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.868421052631579,\n\
\ \"acc_stderr\": 0.027508689533549912,\n \"acc_norm\": 0.868421052631579,\n\
\ \"acc_norm_stderr\": 0.027508689533549912\n },\n \"harness|hendrycksTest-business_ethics|5\"\
: {\n \"acc\": 0.78,\n \"acc_stderr\": 0.04163331998932261,\n \
\ \"acc_norm\": 0.78,\n \"acc_norm_stderr\": 0.04163331998932261\n \
\ },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.8,\n\
\ \"acc_stderr\": 0.024618298195866518,\n \"acc_norm\": 0.8,\n \
\ \"acc_norm_stderr\": 0.024618298195866518\n },\n \"harness|hendrycksTest-college_biology|5\"\
: {\n \"acc\": 0.9027777777777778,\n \"acc_stderr\": 0.024774516250440182,\n\
\ \"acc_norm\": 0.9027777777777778,\n \"acc_norm_stderr\": 0.024774516250440182\n\
\ },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\":\
\ 0.56,\n \"acc_stderr\": 0.04988876515698589,\n \"acc_norm\": 0.56,\n\
\ \"acc_norm_stderr\": 0.04988876515698589\n },\n \"harness|hendrycksTest-college_computer_science|5\"\
: {\n \"acc\": 0.63,\n \"acc_stderr\": 0.048523658709391,\n \
\ \"acc_norm\": 0.63,\n \"acc_norm_stderr\": 0.048523658709391\n },\n\
\ \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.4,\n\
\ \"acc_stderr\": 0.049236596391733084,\n \"acc_norm\": 0.4,\n \
\ \"acc_norm_stderr\": 0.049236596391733084\n },\n \"harness|hendrycksTest-college_medicine|5\"\
: {\n \"acc\": 0.7456647398843931,\n \"acc_stderr\": 0.0332055644308557,\n\
\ \"acc_norm\": 0.7456647398843931,\n \"acc_norm_stderr\": 0.0332055644308557\n\
\ },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.5686274509803921,\n\
\ \"acc_stderr\": 0.04928099597287534,\n \"acc_norm\": 0.5686274509803921,\n\
\ \"acc_norm_stderr\": 0.04928099597287534\n },\n \"harness|hendrycksTest-computer_security|5\"\
: {\n \"acc\": 0.8,\n \"acc_stderr\": 0.04020151261036845,\n \
\ \"acc_norm\": 0.8,\n \"acc_norm_stderr\": 0.04020151261036845\n },\n\
\ \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.7872340425531915,\n\
\ \"acc_stderr\": 0.026754391348039783,\n \"acc_norm\": 0.7872340425531915,\n\
\ \"acc_norm_stderr\": 0.026754391348039783\n },\n \"harness|hendrycksTest-econometrics|5\"\
: {\n \"acc\": 0.5789473684210527,\n \"acc_stderr\": 0.046446020912223177,\n\
\ \"acc_norm\": 0.5789473684210527,\n \"acc_norm_stderr\": 0.046446020912223177\n\
\ },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\"\
: 0.7655172413793103,\n \"acc_stderr\": 0.035306258743465914,\n \"\
acc_norm\": 0.7655172413793103,\n \"acc_norm_stderr\": 0.035306258743465914\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.7116402116402116,\n \"acc_stderr\": 0.023330654054535896,\n \"\
acc_norm\": 0.7116402116402116,\n \"acc_norm_stderr\": 0.023330654054535896\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.626984126984127,\n\
\ \"acc_stderr\": 0.04325506042017086,\n \"acc_norm\": 0.626984126984127,\n\
\ \"acc_norm_stderr\": 0.04325506042017086\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.59,\n \"acc_stderr\": 0.04943110704237102,\n \
\ \"acc_norm\": 0.59,\n \"acc_norm_stderr\": 0.04943110704237102\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.9032258064516129,\n\
\ \"acc_stderr\": 0.016818943416345197,\n \"acc_norm\": 0.9032258064516129,\n\
\ \"acc_norm_stderr\": 0.016818943416345197\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.645320197044335,\n \"acc_stderr\": 0.03366124489051449,\n\
\ \"acc_norm\": 0.645320197044335,\n \"acc_norm_stderr\": 0.03366124489051449\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.78,\n \"acc_stderr\": 0.04163331998932261,\n \"acc_norm\"\
: 0.78,\n \"acc_norm_stderr\": 0.04163331998932261\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.8666666666666667,\n \"acc_stderr\": 0.026544435312706456,\n\
\ \"acc_norm\": 0.8666666666666667,\n \"acc_norm_stderr\": 0.026544435312706456\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.9191919191919192,\n \"acc_stderr\": 0.019417681889724536,\n \"\
acc_norm\": 0.9191919191919192,\n \"acc_norm_stderr\": 0.019417681889724536\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.9689119170984456,\n \"acc_stderr\": 0.012525310625527033,\n\
\ \"acc_norm\": 0.9689119170984456,\n \"acc_norm_stderr\": 0.012525310625527033\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.8076923076923077,\n \"acc_stderr\": 0.019982347208637292,\n\
\ \"acc_norm\": 0.8076923076923077,\n \"acc_norm_stderr\": 0.019982347208637292\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.4444444444444444,\n \"acc_stderr\": 0.030296771286067323,\n \
\ \"acc_norm\": 0.4444444444444444,\n \"acc_norm_stderr\": 0.030296771286067323\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.8319327731092437,\n \"acc_stderr\": 0.024289102115692265,\n\
\ \"acc_norm\": 0.8319327731092437,\n \"acc_norm_stderr\": 0.024289102115692265\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.4900662251655629,\n \"acc_stderr\": 0.04081677107248436,\n \"\
acc_norm\": 0.4900662251655629,\n \"acc_norm_stderr\": 0.04081677107248436\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.9155963302752294,\n \"acc_stderr\": 0.01191881932733488,\n \"\
acc_norm\": 0.9155963302752294,\n \"acc_norm_stderr\": 0.01191881932733488\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.6620370370370371,\n \"acc_stderr\": 0.03225941352631295,\n \"\
acc_norm\": 0.6620370370370371,\n \"acc_norm_stderr\": 0.03225941352631295\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.9264705882352942,\n \"acc_stderr\": 0.018318855850089678,\n \"\
acc_norm\": 0.9264705882352942,\n \"acc_norm_stderr\": 0.018318855850089678\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.9071729957805907,\n \"acc_stderr\": 0.01888975055095671,\n \
\ \"acc_norm\": 0.9071729957805907,\n \"acc_norm_stderr\": 0.01888975055095671\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.7982062780269058,\n\
\ \"acc_stderr\": 0.02693611191280227,\n \"acc_norm\": 0.7982062780269058,\n\
\ \"acc_norm_stderr\": 0.02693611191280227\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.8778625954198473,\n \"acc_stderr\": 0.028718776889342323,\n\
\ \"acc_norm\": 0.8778625954198473,\n \"acc_norm_stderr\": 0.028718776889342323\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.8925619834710744,\n \"acc_stderr\": 0.028268812192540637,\n \"\
acc_norm\": 0.8925619834710744,\n \"acc_norm_stderr\": 0.028268812192540637\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.8981481481481481,\n\
\ \"acc_stderr\": 0.02923927267563275,\n \"acc_norm\": 0.8981481481481481,\n\
\ \"acc_norm_stderr\": 0.02923927267563275\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.8773006134969326,\n \"acc_stderr\": 0.025777328426978927,\n\
\ \"acc_norm\": 0.8773006134969326,\n \"acc_norm_stderr\": 0.025777328426978927\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.5357142857142857,\n\
\ \"acc_stderr\": 0.04733667890053756,\n \"acc_norm\": 0.5357142857142857,\n\
\ \"acc_norm_stderr\": 0.04733667890053756\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.883495145631068,\n \"acc_stderr\": 0.03176683948640406,\n\
\ \"acc_norm\": 0.883495145631068,\n \"acc_norm_stderr\": 0.03176683948640406\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.9444444444444444,\n\
\ \"acc_stderr\": 0.015006312806446912,\n \"acc_norm\": 0.9444444444444444,\n\
\ \"acc_norm_stderr\": 0.015006312806446912\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.88,\n \"acc_stderr\": 0.03265986323710906,\n \
\ \"acc_norm\": 0.88,\n \"acc_norm_stderr\": 0.03265986323710906\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.9029374201787995,\n\
\ \"acc_stderr\": 0.010586474712018297,\n \"acc_norm\": 0.9029374201787995,\n\
\ \"acc_norm_stderr\": 0.010586474712018297\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.815028901734104,\n \"acc_stderr\": 0.02090397584208303,\n\
\ \"acc_norm\": 0.815028901734104,\n \"acc_norm_stderr\": 0.02090397584208303\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.8044692737430168,\n\
\ \"acc_stderr\": 0.013264579220945105,\n \"acc_norm\": 0.8044692737430168,\n\
\ \"acc_norm_stderr\": 0.013264579220945105\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.8496732026143791,\n \"acc_stderr\": 0.02046417512433263,\n\
\ \"acc_norm\": 0.8496732026143791,\n \"acc_norm_stderr\": 0.02046417512433263\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.8038585209003215,\n\
\ \"acc_stderr\": 0.022552447780478022,\n \"acc_norm\": 0.8038585209003215,\n\
\ \"acc_norm_stderr\": 0.022552447780478022\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.8672839506172839,\n \"acc_stderr\": 0.018877353839571842,\n\
\ \"acc_norm\": 0.8672839506172839,\n \"acc_norm_stderr\": 0.018877353839571842\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.6347517730496454,\n \"acc_stderr\": 0.028723863853281267,\n \
\ \"acc_norm\": 0.6347517730496454,\n \"acc_norm_stderr\": 0.028723863853281267\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.5808344198174706,\n\
\ \"acc_stderr\": 0.012602244505788228,\n \"acc_norm\": 0.5808344198174706,\n\
\ \"acc_norm_stderr\": 0.012602244505788228\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.8235294117647058,\n \"acc_stderr\": 0.023157468308559345,\n\
\ \"acc_norm\": 0.8235294117647058,\n \"acc_norm_stderr\": 0.023157468308559345\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.815359477124183,\n \"acc_stderr\": 0.01569702924075778,\n \
\ \"acc_norm\": 0.815359477124183,\n \"acc_norm_stderr\": 0.01569702924075778\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.7272727272727273,\n\
\ \"acc_stderr\": 0.04265792110940589,\n \"acc_norm\": 0.7272727272727273,\n\
\ \"acc_norm_stderr\": 0.04265792110940589\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.8367346938775511,\n \"acc_stderr\": 0.02366169917709861,\n\
\ \"acc_norm\": 0.8367346938775511,\n \"acc_norm_stderr\": 0.02366169917709861\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8955223880597015,\n\
\ \"acc_stderr\": 0.021628920516700643,\n \"acc_norm\": 0.8955223880597015,\n\
\ \"acc_norm_stderr\": 0.021628920516700643\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.89,\n \"acc_stderr\": 0.03144660377352203,\n \
\ \"acc_norm\": 0.89,\n \"acc_norm_stderr\": 0.03144660377352203\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5783132530120482,\n\
\ \"acc_stderr\": 0.03844453181770917,\n \"acc_norm\": 0.5783132530120482,\n\
\ \"acc_norm_stderr\": 0.03844453181770917\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.9005847953216374,\n \"acc_stderr\": 0.022949025579355034,\n\
\ \"acc_norm\": 0.9005847953216374,\n \"acc_norm_stderr\": 0.022949025579355034\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.5348837209302325,\n\
\ \"mc1_stderr\": 0.017460849975873972,\n \"mc2\": 0.6990494623892585,\n\
\ \"mc2_stderr\": 0.014341231959910994\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.8208366219415943,\n \"acc_stderr\": 0.010777949156047987\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.6110689916603488,\n \
\ \"acc_stderr\": 0.013428382481274233\n }\n}\n```"
repo_url: https://huggingface.co/AA051610/testtest
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_01_06T11_01_46.425546
path:
- '**/details_harness|arc:challenge|25_2024-01-06T11-01-46.425546.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-01-06T11-01-46.425546.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_01_06T11_01_46.425546
path:
- '**/details_harness|gsm8k|5_2024-01-06T11-01-46.425546.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-01-06T11-01-46.425546.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_01_06T11_01_46.425546
path:
- '**/details_harness|hellaswag|10_2024-01-06T11-01-46.425546.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-01-06T11-01-46.425546.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_01_06T11_01_46.425546
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-06T11-01-46.425546.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-06T11-01-46.425546.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-06T11-01-46.425546.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-06T11-01-46.425546.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-06T11-01-46.425546.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-06T11-01-46.425546.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-06T11-01-46.425546.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-06T11-01-46.425546.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-06T11-01-46.425546.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-06T11-01-46.425546.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-06T11-01-46.425546.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-06T11-01-46.425546.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-06T11-01-46.425546.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-06T11-01-46.425546.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-06T11-01-46.425546.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-06T11-01-46.425546.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-06T11-01-46.425546.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-06T11-01-46.425546.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-06T11-01-46.425546.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-06T11-01-46.425546.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-06T11-01-46.425546.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-06T11-01-46.425546.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-06T11-01-46.425546.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-06T11-01-46.425546.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-06T11-01-46.425546.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-06T11-01-46.425546.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-06T11-01-46.425546.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-06T11-01-46.425546.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-06T11-01-46.425546.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-06T11-01-46.425546.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-06T11-01-46.425546.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-06T11-01-46.425546.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-06T11-01-46.425546.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-06T11-01-46.425546.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-01-06T11-01-46.425546.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-06T11-01-46.425546.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-06T11-01-46.425546.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-06T11-01-46.425546.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-01-06T11-01-46.425546.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-01-06T11-01-46.425546.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-06T11-01-46.425546.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-06T11-01-46.425546.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-06T11-01-46.425546.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-06T11-01-46.425546.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-06T11-01-46.425546.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-06T11-01-46.425546.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-06T11-01-46.425546.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-06T11-01-46.425546.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-06T11-01-46.425546.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-06T11-01-46.425546.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-06T11-01-46.425546.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-06T11-01-46.425546.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-06T11-01-46.425546.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-01-06T11-01-46.425546.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-06T11-01-46.425546.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-01-06T11-01-46.425546.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-06T11-01-46.425546.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-06T11-01-46.425546.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-06T11-01-46.425546.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-06T11-01-46.425546.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-06T11-01-46.425546.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-06T11-01-46.425546.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-06T11-01-46.425546.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-06T11-01-46.425546.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-06T11-01-46.425546.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-06T11-01-46.425546.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-06T11-01-46.425546.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-06T11-01-46.425546.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-06T11-01-46.425546.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-06T11-01-46.425546.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-06T11-01-46.425546.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-06T11-01-46.425546.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-06T11-01-46.425546.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-06T11-01-46.425546.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-06T11-01-46.425546.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-06T11-01-46.425546.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-06T11-01-46.425546.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-06T11-01-46.425546.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-06T11-01-46.425546.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-06T11-01-46.425546.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-06T11-01-46.425546.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-06T11-01-46.425546.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-06T11-01-46.425546.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-06T11-01-46.425546.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-06T11-01-46.425546.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-06T11-01-46.425546.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-06T11-01-46.425546.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-06T11-01-46.425546.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-06T11-01-46.425546.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-06T11-01-46.425546.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-06T11-01-46.425546.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-01-06T11-01-46.425546.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-06T11-01-46.425546.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-06T11-01-46.425546.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-06T11-01-46.425546.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-01-06T11-01-46.425546.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-01-06T11-01-46.425546.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-06T11-01-46.425546.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-06T11-01-46.425546.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-06T11-01-46.425546.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-06T11-01-46.425546.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-06T11-01-46.425546.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-06T11-01-46.425546.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-06T11-01-46.425546.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-06T11-01-46.425546.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-06T11-01-46.425546.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-06T11-01-46.425546.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-06T11-01-46.425546.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-06T11-01-46.425546.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-06T11-01-46.425546.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-01-06T11-01-46.425546.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-06T11-01-46.425546.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-01-06T11-01-46.425546.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-06T11-01-46.425546.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_01_06T11_01_46.425546
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-06T11-01-46.425546.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-06T11-01-46.425546.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_01_06T11_01_46.425546
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-06T11-01-46.425546.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-06T11-01-46.425546.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_01_06T11_01_46.425546
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-06T11-01-46.425546.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-06T11-01-46.425546.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_01_06T11_01_46.425546
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-06T11-01-46.425546.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-06T11-01-46.425546.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_01_06T11_01_46.425546
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-06T11-01-46.425546.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-06T11-01-46.425546.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_01_06T11_01_46.425546
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-06T11-01-46.425546.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-06T11-01-46.425546.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_01_06T11_01_46.425546
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-06T11-01-46.425546.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-06T11-01-46.425546.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_01_06T11_01_46.425546
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-06T11-01-46.425546.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-06T11-01-46.425546.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_01_06T11_01_46.425546
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-06T11-01-46.425546.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-06T11-01-46.425546.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_01_06T11_01_46.425546
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-06T11-01-46.425546.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-06T11-01-46.425546.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_01_06T11_01_46.425546
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-06T11-01-46.425546.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-06T11-01-46.425546.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_01_06T11_01_46.425546
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-06T11-01-46.425546.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-06T11-01-46.425546.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_01_06T11_01_46.425546
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-06T11-01-46.425546.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-06T11-01-46.425546.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_01_06T11_01_46.425546
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-06T11-01-46.425546.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-06T11-01-46.425546.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_01_06T11_01_46.425546
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-06T11-01-46.425546.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-06T11-01-46.425546.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_01_06T11_01_46.425546
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-06T11-01-46.425546.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-06T11-01-46.425546.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_01_06T11_01_46.425546
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-06T11-01-46.425546.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-06T11-01-46.425546.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_01_06T11_01_46.425546
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-06T11-01-46.425546.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-06T11-01-46.425546.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_01_06T11_01_46.425546
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-06T11-01-46.425546.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-06T11-01-46.425546.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_01_06T11_01_46.425546
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-06T11-01-46.425546.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-06T11-01-46.425546.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_01_06T11_01_46.425546
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-06T11-01-46.425546.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-06T11-01-46.425546.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_01_06T11_01_46.425546
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-06T11-01-46.425546.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-06T11-01-46.425546.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_01_06T11_01_46.425546
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-06T11-01-46.425546.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-06T11-01-46.425546.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_01_06T11_01_46.425546
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-06T11-01-46.425546.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-06T11-01-46.425546.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_01_06T11_01_46.425546
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-06T11-01-46.425546.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-06T11-01-46.425546.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_01_06T11_01_46.425546
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-06T11-01-46.425546.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-06T11-01-46.425546.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_01_06T11_01_46.425546
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-06T11-01-46.425546.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-06T11-01-46.425546.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_01_06T11_01_46.425546
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-06T11-01-46.425546.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-06T11-01-46.425546.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_01_06T11_01_46.425546
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-06T11-01-46.425546.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-06T11-01-46.425546.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_01_06T11_01_46.425546
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-06T11-01-46.425546.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-06T11-01-46.425546.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_01_06T11_01_46.425546
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-06T11-01-46.425546.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-06T11-01-46.425546.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_01_06T11_01_46.425546
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-06T11-01-46.425546.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-06T11-01-46.425546.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_01_06T11_01_46.425546
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-06T11-01-46.425546.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-06T11-01-46.425546.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_01_06T11_01_46.425546
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-06T11-01-46.425546.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-06T11-01-46.425546.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_01_06T11_01_46.425546
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-01-06T11-01-46.425546.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-01-06T11-01-46.425546.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_01_06T11_01_46.425546
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-06T11-01-46.425546.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-06T11-01-46.425546.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_01_06T11_01_46.425546
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-06T11-01-46.425546.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-06T11-01-46.425546.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_01_06T11_01_46.425546
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-06T11-01-46.425546.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-06T11-01-46.425546.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_01_06T11_01_46.425546
path:
- '**/details_harness|hendrycksTest-management|5_2024-01-06T11-01-46.425546.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-01-06T11-01-46.425546.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_01_06T11_01_46.425546
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-01-06T11-01-46.425546.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-01-06T11-01-46.425546.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_01_06T11_01_46.425546
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-06T11-01-46.425546.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-06T11-01-46.425546.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_01_06T11_01_46.425546
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-06T11-01-46.425546.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-06T11-01-46.425546.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_01_06T11_01_46.425546
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-06T11-01-46.425546.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-06T11-01-46.425546.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_01_06T11_01_46.425546
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-06T11-01-46.425546.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-06T11-01-46.425546.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_01_06T11_01_46.425546
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-06T11-01-46.425546.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-06T11-01-46.425546.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_01_06T11_01_46.425546
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-06T11-01-46.425546.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-06T11-01-46.425546.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_01_06T11_01_46.425546
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-06T11-01-46.425546.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-06T11-01-46.425546.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_01_06T11_01_46.425546
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-06T11-01-46.425546.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-06T11-01-46.425546.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_01_06T11_01_46.425546
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-06T11-01-46.425546.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-06T11-01-46.425546.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_01_06T11_01_46.425546
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-06T11-01-46.425546.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-06T11-01-46.425546.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_01_06T11_01_46.425546
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-06T11-01-46.425546.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-06T11-01-46.425546.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_01_06T11_01_46.425546
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-06T11-01-46.425546.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-06T11-01-46.425546.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_01_06T11_01_46.425546
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-06T11-01-46.425546.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-06T11-01-46.425546.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_01_06T11_01_46.425546
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-01-06T11-01-46.425546.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-01-06T11-01-46.425546.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_01_06T11_01_46.425546
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-06T11-01-46.425546.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-06T11-01-46.425546.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_01_06T11_01_46.425546
path:
- '**/details_harness|hendrycksTest-virology|5_2024-01-06T11-01-46.425546.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-01-06T11-01-46.425546.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_01_06T11_01_46.425546
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-06T11-01-46.425546.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-06T11-01-46.425546.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_01_06T11_01_46.425546
path:
- '**/details_harness|truthfulqa:mc|0_2024-01-06T11-01-46.425546.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-01-06T11-01-46.425546.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_01_06T11_01_46.425546
path:
- '**/details_harness|winogrande|5_2024-01-06T11-01-46.425546.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-01-06T11-01-46.425546.parquet'
- config_name: results
data_files:
- split: 2024_01_06T11_01_46.425546
path:
- results_2024-01-06T11-01-46.425546.parquet
- split: latest
path:
- results_2024-01-06T11-01-46.425546.parquet
---
# Dataset Card for Evaluation run of AA051610/testtest
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [AA051610/testtest](https://huggingface.co/AA051610/testtest) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration, "results", stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_AA051610__testtest",
"harness_winogrande_5",
split="train")
```
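The configuration names listed above follow a simple naming convention derived from the harness task identifiers: the characters `|`, `:`, and `-` are replaced with `_` (compare `harness|truthfulqa:mc|0` in the parquet paths with the config name `harness_truthfulqa_mc_0`). A small helper, hypothetical and for illustration only, can map one to the other:

```python
import re

def task_to_config(task: str) -> str:
    """Map a harness task id to the corresponding dataset config name,
    e.g. 'harness|truthfulqa:mc|0' -> 'harness_truthfulqa_mc_0'.
    (Hypothetical helper; the mapping is inferred from the config list above.)"""
    return re.sub(r"[|:\-]", "_", task)
```

For example, `task_to_config("harness|hendrycksTest-human_aging|5")` returns `"harness_hendrycksTest_human_aging_5"`, which can be passed as the second argument to `load_dataset` above.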
## Latest results
These are the [latest results from run 2024-01-06T11:01:46.425546](https://huggingface.co/datasets/open-llm-leaderboard/details_AA051610__testtest/blob/main/results_2024-01-06T11-01-46.425546.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each in the "results" config and in the "latest" split of each eval):
```json
{
"all": {
"acc": 0.7617892127804575,
"acc_stderr": 0.028264083954934467,
"acc_norm": 0.7670102557925514,
"acc_norm_stderr": 0.028787360961551947,
"mc1": 0.5348837209302325,
"mc1_stderr": 0.017460849975873972,
"mc2": 0.6990494623892585,
"mc2_stderr": 0.014341231959910994
},
"harness|arc:challenge|25": {
"acc": 0.6860068259385665,
"acc_stderr": 0.013562691224726297,
"acc_norm": 0.7081911262798635,
"acc_norm_stderr": 0.013284525292403515
},
"harness|hellaswag|10": {
"acc": 0.6546504680342561,
"acc_stderr": 0.0047451035439012934,
"acc_norm": 0.8488348934475204,
"acc_norm_stderr": 0.0035747765941085037
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.45,
"acc_stderr": 0.05,
"acc_norm": 0.45,
"acc_norm_stderr": 0.05
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.7481481481481481,
"acc_stderr": 0.03749850709174021,
"acc_norm": 0.7481481481481481,
"acc_norm_stderr": 0.03749850709174021
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.868421052631579,
"acc_stderr": 0.027508689533549912,
"acc_norm": 0.868421052631579,
"acc_norm_stderr": 0.027508689533549912
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.78,
"acc_stderr": 0.04163331998932261,
"acc_norm": 0.78,
"acc_norm_stderr": 0.04163331998932261
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.8,
"acc_stderr": 0.024618298195866518,
"acc_norm": 0.8,
"acc_norm_stderr": 0.024618298195866518
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.9027777777777778,
"acc_stderr": 0.024774516250440182,
"acc_norm": 0.9027777777777778,
"acc_norm_stderr": 0.024774516250440182
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.56,
"acc_stderr": 0.04988876515698589,
"acc_norm": 0.56,
"acc_norm_stderr": 0.04988876515698589
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.63,
"acc_stderr": 0.048523658709391,
"acc_norm": 0.63,
"acc_norm_stderr": 0.048523658709391
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.4,
"acc_stderr": 0.049236596391733084,
"acc_norm": 0.4,
"acc_norm_stderr": 0.049236596391733084
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.7456647398843931,
"acc_stderr": 0.0332055644308557,
"acc_norm": 0.7456647398843931,
"acc_norm_stderr": 0.0332055644308557
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.5686274509803921,
"acc_stderr": 0.04928099597287534,
"acc_norm": 0.5686274509803921,
"acc_norm_stderr": 0.04928099597287534
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.8,
"acc_stderr": 0.04020151261036845,
"acc_norm": 0.8,
"acc_norm_stderr": 0.04020151261036845
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.7872340425531915,
"acc_stderr": 0.026754391348039783,
"acc_norm": 0.7872340425531915,
"acc_norm_stderr": 0.026754391348039783
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.5789473684210527,
"acc_stderr": 0.046446020912223177,
"acc_norm": 0.5789473684210527,
"acc_norm_stderr": 0.046446020912223177
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.7655172413793103,
"acc_stderr": 0.035306258743465914,
"acc_norm": 0.7655172413793103,
"acc_norm_stderr": 0.035306258743465914
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.7116402116402116,
"acc_stderr": 0.023330654054535896,
"acc_norm": 0.7116402116402116,
"acc_norm_stderr": 0.023330654054535896
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.626984126984127,
"acc_stderr": 0.04325506042017086,
"acc_norm": 0.626984126984127,
"acc_norm_stderr": 0.04325506042017086
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.59,
"acc_stderr": 0.04943110704237102,
"acc_norm": 0.59,
"acc_norm_stderr": 0.04943110704237102
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.9032258064516129,
"acc_stderr": 0.016818943416345197,
"acc_norm": 0.9032258064516129,
"acc_norm_stderr": 0.016818943416345197
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.645320197044335,
"acc_stderr": 0.03366124489051449,
"acc_norm": 0.645320197044335,
"acc_norm_stderr": 0.03366124489051449
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.78,
"acc_stderr": 0.04163331998932261,
"acc_norm": 0.78,
"acc_norm_stderr": 0.04163331998932261
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.8666666666666667,
"acc_stderr": 0.026544435312706456,
"acc_norm": 0.8666666666666667,
"acc_norm_stderr": 0.026544435312706456
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.9191919191919192,
"acc_stderr": 0.019417681889724536,
"acc_norm": 0.9191919191919192,
"acc_norm_stderr": 0.019417681889724536
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.9689119170984456,
"acc_stderr": 0.012525310625527033,
"acc_norm": 0.9689119170984456,
"acc_norm_stderr": 0.012525310625527033
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.8076923076923077,
"acc_stderr": 0.019982347208637292,
"acc_norm": 0.8076923076923077,
"acc_norm_stderr": 0.019982347208637292
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.4444444444444444,
"acc_stderr": 0.030296771286067323,
"acc_norm": 0.4444444444444444,
"acc_norm_stderr": 0.030296771286067323
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.8319327731092437,
"acc_stderr": 0.024289102115692265,
"acc_norm": 0.8319327731092437,
"acc_norm_stderr": 0.024289102115692265
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.4900662251655629,
"acc_stderr": 0.04081677107248436,
"acc_norm": 0.4900662251655629,
"acc_norm_stderr": 0.04081677107248436
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.9155963302752294,
"acc_stderr": 0.01191881932733488,
"acc_norm": 0.9155963302752294,
"acc_norm_stderr": 0.01191881932733488
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.6620370370370371,
"acc_stderr": 0.03225941352631295,
"acc_norm": 0.6620370370370371,
"acc_norm_stderr": 0.03225941352631295
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.9264705882352942,
"acc_stderr": 0.018318855850089678,
"acc_norm": 0.9264705882352942,
"acc_norm_stderr": 0.018318855850089678
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.9071729957805907,
"acc_stderr": 0.01888975055095671,
"acc_norm": 0.9071729957805907,
"acc_norm_stderr": 0.01888975055095671
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.7982062780269058,
"acc_stderr": 0.02693611191280227,
"acc_norm": 0.7982062780269058,
"acc_norm_stderr": 0.02693611191280227
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.8778625954198473,
"acc_stderr": 0.028718776889342323,
"acc_norm": 0.8778625954198473,
"acc_norm_stderr": 0.028718776889342323
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.8925619834710744,
"acc_stderr": 0.028268812192540637,
"acc_norm": 0.8925619834710744,
"acc_norm_stderr": 0.028268812192540637
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.8981481481481481,
"acc_stderr": 0.02923927267563275,
"acc_norm": 0.8981481481481481,
"acc_norm_stderr": 0.02923927267563275
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.8773006134969326,
"acc_stderr": 0.025777328426978927,
"acc_norm": 0.8773006134969326,
"acc_norm_stderr": 0.025777328426978927
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.5357142857142857,
"acc_stderr": 0.04733667890053756,
"acc_norm": 0.5357142857142857,
"acc_norm_stderr": 0.04733667890053756
},
"harness|hendrycksTest-management|5": {
"acc": 0.883495145631068,
"acc_stderr": 0.03176683948640406,
"acc_norm": 0.883495145631068,
"acc_norm_stderr": 0.03176683948640406
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.9444444444444444,
"acc_stderr": 0.015006312806446912,
"acc_norm": 0.9444444444444444,
"acc_norm_stderr": 0.015006312806446912
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.88,
"acc_stderr": 0.03265986323710906,
"acc_norm": 0.88,
"acc_norm_stderr": 0.03265986323710906
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.9029374201787995,
"acc_stderr": 0.010586474712018297,
"acc_norm": 0.9029374201787995,
"acc_norm_stderr": 0.010586474712018297
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.815028901734104,
"acc_stderr": 0.02090397584208303,
"acc_norm": 0.815028901734104,
"acc_norm_stderr": 0.02090397584208303
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.8044692737430168,
"acc_stderr": 0.013264579220945105,
"acc_norm": 0.8044692737430168,
"acc_norm_stderr": 0.013264579220945105
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.8496732026143791,
"acc_stderr": 0.02046417512433263,
"acc_norm": 0.8496732026143791,
"acc_norm_stderr": 0.02046417512433263
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.8038585209003215,
"acc_stderr": 0.022552447780478022,
"acc_norm": 0.8038585209003215,
"acc_norm_stderr": 0.022552447780478022
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.8672839506172839,
"acc_stderr": 0.018877353839571842,
"acc_norm": 0.8672839506172839,
"acc_norm_stderr": 0.018877353839571842
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.6347517730496454,
"acc_stderr": 0.028723863853281267,
"acc_norm": 0.6347517730496454,
"acc_norm_stderr": 0.028723863853281267
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.5808344198174706,
"acc_stderr": 0.012602244505788228,
"acc_norm": 0.5808344198174706,
"acc_norm_stderr": 0.012602244505788228
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.8235294117647058,
"acc_stderr": 0.023157468308559345,
"acc_norm": 0.8235294117647058,
"acc_norm_stderr": 0.023157468308559345
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.815359477124183,
"acc_stderr": 0.01569702924075778,
"acc_norm": 0.815359477124183,
"acc_norm_stderr": 0.01569702924075778
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.7272727272727273,
"acc_stderr": 0.04265792110940589,
"acc_norm": 0.7272727272727273,
"acc_norm_stderr": 0.04265792110940589
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.8367346938775511,
"acc_stderr": 0.02366169917709861,
"acc_norm": 0.8367346938775511,
"acc_norm_stderr": 0.02366169917709861
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8955223880597015,
"acc_stderr": 0.021628920516700643,
"acc_norm": 0.8955223880597015,
"acc_norm_stderr": 0.021628920516700643
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.89,
"acc_stderr": 0.03144660377352203,
"acc_norm": 0.89,
"acc_norm_stderr": 0.03144660377352203
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5783132530120482,
"acc_stderr": 0.03844453181770917,
"acc_norm": 0.5783132530120482,
"acc_norm_stderr": 0.03844453181770917
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.9005847953216374,
"acc_stderr": 0.022949025579355034,
"acc_norm": 0.9005847953216374,
"acc_norm_stderr": 0.022949025579355034
},
"harness|truthfulqa:mc|0": {
"mc1": 0.5348837209302325,
"mc1_stderr": 0.017460849975873972,
"mc2": 0.6990494623892585,
"mc2_stderr": 0.014341231959910994
},
"harness|winogrande|5": {
"acc": 0.8208366219415943,
"acc_stderr": 0.010777949156047987
},
"harness|gsm8k|5": {
"acc": 0.6110689916603488,
"acc_stderr": 0.013428382481274233
}
}
```
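The `"all"` block above is the leaderboard's own aggregate over every task. As a rough sanity check, an unweighted per-task average can be recomputed by hand from the JSON; a minimal sketch over three of the scores (values copied verbatim from the results above):

```python
# Accuracies copied from the results JSON above (a small subset, for illustration).
accs = {
    "harness|arc:challenge|25": 0.6860068259385665,
    "harness|hellaswag|10": 0.6546504680342561,
    "harness|winogrande|5": 0.8208366219415943,
}

# Simple unweighted mean over the selected tasks.
mean_acc = sum(accs.values()) / len(accs)
```

Note this is only a subset average; the leaderboard's `"acc"` in `"all"` is computed over the full task set, so the two numbers are not expected to match exactly.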
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases, and limitations of the dataset. More information is needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
MatsuoDochiai/Josias | ---
license: openrail
---
|
AP123/foryacine | ---
license: apache-2.0
---
|
galo959/g | ---
license: openrail
---
|
open-llm-leaderboard/details_beowolx__MistralHermes-CodePro-7B-v1 | ---
pretty_name: Evaluation run of beowolx/MistralHermes-CodePro-7B-v1
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [beowolx/MistralHermes-CodePro-7B-v1](https://huggingface.co/beowolx/MistralHermes-CodePro-7B-v1)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configuration, each one coresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_beowolx__MistralHermes-CodePro-7B-v1\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-01-13T23:16:31.615360](https://huggingface.co/datasets/open-llm-leaderboard/details_beowolx__MistralHermes-CodePro-7B-v1/blob/main/results_2024-01-13T23-16-31.615360.json)(note\
\ that their might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6355378468432605,\n\
\ \"acc_stderr\": 0.03226341558486178,\n \"acc_norm\": 0.6374815210840533,\n\
\ \"acc_norm_stderr\": 0.03291019935178123,\n \"mc1\": 0.3488372093023256,\n\
\ \"mc1_stderr\": 0.016684419859986893,\n \"mc2\": 0.4966549787597113,\n\
\ \"mc2_stderr\": 0.015039415129128687\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.590443686006826,\n \"acc_stderr\": 0.014370358632472435,\n\
\ \"acc_norm\": 0.6245733788395904,\n \"acc_norm_stderr\": 0.01415063143511173\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.629555865365465,\n\
\ \"acc_stderr\": 0.004819367172685959,\n \"acc_norm\": 0.8268273252340171,\n\
\ \"acc_norm_stderr\": 0.0037762314890081123\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.35,\n \"acc_stderr\": 0.04793724854411022,\n \
\ \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.04793724854411022\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.5851851851851851,\n\
\ \"acc_stderr\": 0.04256193767901408,\n \"acc_norm\": 0.5851851851851851,\n\
\ \"acc_norm_stderr\": 0.04256193767901408\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.6907894736842105,\n \"acc_stderr\": 0.037610708698674805,\n\
\ \"acc_norm\": 0.6907894736842105,\n \"acc_norm_stderr\": 0.037610708698674805\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.57,\n\
\ \"acc_stderr\": 0.049756985195624284,\n \"acc_norm\": 0.57,\n \
\ \"acc_norm_stderr\": 0.049756985195624284\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.6641509433962264,\n \"acc_stderr\": 0.02906722014664483,\n\
\ \"acc_norm\": 0.6641509433962264,\n \"acc_norm_stderr\": 0.02906722014664483\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7638888888888888,\n\
\ \"acc_stderr\": 0.03551446610810826,\n \"acc_norm\": 0.7638888888888888,\n\
\ \"acc_norm_stderr\": 0.03551446610810826\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.48,\n \"acc_stderr\": 0.050211673156867795,\n \
\ \"acc_norm\": 0.48,\n \"acc_norm_stderr\": 0.050211673156867795\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"\
acc\": 0.45,\n \"acc_stderr\": 0.05,\n \"acc_norm\": 0.45,\n \
\ \"acc_norm_stderr\": 0.05\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.33,\n \"acc_stderr\": 0.047258156262526045,\n \
\ \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.047258156262526045\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6069364161849711,\n\
\ \"acc_stderr\": 0.0372424959581773,\n \"acc_norm\": 0.6069364161849711,\n\
\ \"acc_norm_stderr\": 0.0372424959581773\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.37254901960784315,\n \"acc_stderr\": 0.04810840148082636,\n\
\ \"acc_norm\": 0.37254901960784315,\n \"acc_norm_stderr\": 0.04810840148082636\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.75,\n \"acc_stderr\": 0.04351941398892446,\n \"acc_norm\": 0.75,\n\
\ \"acc_norm_stderr\": 0.04351941398892446\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.5617021276595745,\n \"acc_stderr\": 0.03243618636108101,\n\
\ \"acc_norm\": 0.5617021276595745,\n \"acc_norm_stderr\": 0.03243618636108101\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.4824561403508772,\n\
\ \"acc_stderr\": 0.04700708033551038,\n \"acc_norm\": 0.4824561403508772,\n\
\ \"acc_norm_stderr\": 0.04700708033551038\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5172413793103449,\n \"acc_stderr\": 0.04164188720169375,\n\
\ \"acc_norm\": 0.5172413793103449,\n \"acc_norm_stderr\": 0.04164188720169375\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.4365079365079365,\n \"acc_stderr\": 0.0255428468174005,\n \"acc_norm\"\
: 0.4365079365079365,\n \"acc_norm_stderr\": 0.0255428468174005\n },\n\
\ \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.49206349206349204,\n\
\ \"acc_stderr\": 0.044715725362943486,\n \"acc_norm\": 0.49206349206349204,\n\
\ \"acc_norm_stderr\": 0.044715725362943486\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.33,\n \"acc_stderr\": 0.047258156262526045,\n \
\ \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.047258156262526045\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\"\
: 0.7967741935483871,\n \"acc_stderr\": 0.02289168798455495,\n \"\
acc_norm\": 0.7967741935483871,\n \"acc_norm_stderr\": 0.02289168798455495\n\
\ },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\"\
: 0.5073891625615764,\n \"acc_stderr\": 0.035176035403610105,\n \"\
acc_norm\": 0.5073891625615764,\n \"acc_norm_stderr\": 0.035176035403610105\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.65,\n \"acc_stderr\": 0.04793724854411019,\n \"acc_norm\"\
: 0.65,\n \"acc_norm_stderr\": 0.04793724854411019\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.793939393939394,\n \"acc_stderr\": 0.03158415324047711,\n\
\ \"acc_norm\": 0.793939393939394,\n \"acc_norm_stderr\": 0.03158415324047711\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.7929292929292929,\n \"acc_stderr\": 0.028869778460267042,\n \"\
acc_norm\": 0.7929292929292929,\n \"acc_norm_stderr\": 0.028869778460267042\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.8963730569948186,\n \"acc_stderr\": 0.02199531196364424,\n\
\ \"acc_norm\": 0.8963730569948186,\n \"acc_norm_stderr\": 0.02199531196364424\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.6205128205128205,\n \"acc_stderr\": 0.02460362692409742,\n \
\ \"acc_norm\": 0.6205128205128205,\n \"acc_norm_stderr\": 0.02460362692409742\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.3148148148148148,\n \"acc_stderr\": 0.02831753349606648,\n \
\ \"acc_norm\": 0.3148148148148148,\n \"acc_norm_stderr\": 0.02831753349606648\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.6722689075630253,\n \"acc_stderr\": 0.03048991141767323,\n \
\ \"acc_norm\": 0.6722689075630253,\n \"acc_norm_stderr\": 0.03048991141767323\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.31125827814569534,\n \"acc_stderr\": 0.03780445850526732,\n \"\
acc_norm\": 0.31125827814569534,\n \"acc_norm_stderr\": 0.03780445850526732\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8311926605504587,\n \"acc_stderr\": 0.01606005626853034,\n \"\
acc_norm\": 0.8311926605504587,\n \"acc_norm_stderr\": 0.01606005626853034\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.5185185185185185,\n \"acc_stderr\": 0.03407632093854051,\n \"\
acc_norm\": 0.5185185185185185,\n \"acc_norm_stderr\": 0.03407632093854051\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.7941176470588235,\n \"acc_stderr\": 0.028379449451588663,\n \"\
acc_norm\": 0.7941176470588235,\n \"acc_norm_stderr\": 0.028379449451588663\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.810126582278481,\n \"acc_stderr\": 0.025530100460233494,\n \
\ \"acc_norm\": 0.810126582278481,\n \"acc_norm_stderr\": 0.025530100460233494\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.7040358744394619,\n\
\ \"acc_stderr\": 0.030636591348699796,\n \"acc_norm\": 0.7040358744394619,\n\
\ \"acc_norm_stderr\": 0.030636591348699796\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.7862595419847328,\n \"acc_stderr\": 0.0359546161177469,\n\
\ \"acc_norm\": 0.7862595419847328,\n \"acc_norm_stderr\": 0.0359546161177469\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.8016528925619835,\n \"acc_stderr\": 0.03640118271990946,\n \"\
acc_norm\": 0.8016528925619835,\n \"acc_norm_stderr\": 0.03640118271990946\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7777777777777778,\n\
\ \"acc_stderr\": 0.0401910747255735,\n \"acc_norm\": 0.7777777777777778,\n\
\ \"acc_norm_stderr\": 0.0401910747255735\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7914110429447853,\n \"acc_stderr\": 0.031921934489347235,\n\
\ \"acc_norm\": 0.7914110429447853,\n \"acc_norm_stderr\": 0.031921934489347235\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.49107142857142855,\n\
\ \"acc_stderr\": 0.04745033255489123,\n \"acc_norm\": 0.49107142857142855,\n\
\ \"acc_norm_stderr\": 0.04745033255489123\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7961165048543689,\n \"acc_stderr\": 0.039891398595317706,\n\
\ \"acc_norm\": 0.7961165048543689,\n \"acc_norm_stderr\": 0.039891398595317706\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8589743589743589,\n\
\ \"acc_stderr\": 0.02280138253459753,\n \"acc_norm\": 0.8589743589743589,\n\
\ \"acc_norm_stderr\": 0.02280138253459753\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.67,\n \"acc_stderr\": 0.04725815626252609,\n \
\ \"acc_norm\": 0.67,\n \"acc_norm_stderr\": 0.04725815626252609\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8173690932311622,\n\
\ \"acc_stderr\": 0.013816335389973136,\n \"acc_norm\": 0.8173690932311622,\n\
\ \"acc_norm_stderr\": 0.013816335389973136\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.7283236994219653,\n \"acc_stderr\": 0.023948512905468348,\n\
\ \"acc_norm\": 0.7283236994219653,\n \"acc_norm_stderr\": 0.023948512905468348\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.2636871508379888,\n\
\ \"acc_stderr\": 0.014736926383761974,\n \"acc_norm\": 0.2636871508379888,\n\
\ \"acc_norm_stderr\": 0.014736926383761974\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.7483660130718954,\n \"acc_stderr\": 0.024848018263875195,\n\
\ \"acc_norm\": 0.7483660130718954,\n \"acc_norm_stderr\": 0.024848018263875195\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7041800643086816,\n\
\ \"acc_stderr\": 0.02592237178881877,\n \"acc_norm\": 0.7041800643086816,\n\
\ \"acc_norm_stderr\": 0.02592237178881877\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.7314814814814815,\n \"acc_stderr\": 0.024659685185967294,\n\
\ \"acc_norm\": 0.7314814814814815,\n \"acc_norm_stderr\": 0.024659685185967294\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.48226950354609927,\n \"acc_stderr\": 0.02980873964223777,\n \
\ \"acc_norm\": 0.48226950354609927,\n \"acc_norm_stderr\": 0.02980873964223777\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4680573663624511,\n\
\ \"acc_stderr\": 0.012744149704869647,\n \"acc_norm\": 0.4680573663624511,\n\
\ \"acc_norm_stderr\": 0.012744149704869647\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.6727941176470589,\n \"acc_stderr\": 0.028501452860396556,\n\
\ \"acc_norm\": 0.6727941176470589,\n \"acc_norm_stderr\": 0.028501452860396556\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6650326797385621,\n \"acc_stderr\": 0.019094228167000318,\n \
\ \"acc_norm\": 0.6650326797385621,\n \"acc_norm_stderr\": 0.019094228167000318\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6545454545454545,\n\
\ \"acc_stderr\": 0.04554619617541054,\n \"acc_norm\": 0.6545454545454545,\n\
\ \"acc_norm_stderr\": 0.04554619617541054\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.7346938775510204,\n \"acc_stderr\": 0.028263889943784593,\n\
\ \"acc_norm\": 0.7346938775510204,\n \"acc_norm_stderr\": 0.028263889943784593\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.835820895522388,\n\
\ \"acc_stderr\": 0.02619392354445412,\n \"acc_norm\": 0.835820895522388,\n\
\ \"acc_norm_stderr\": 0.02619392354445412\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.86,\n \"acc_stderr\": 0.0348735088019777,\n \
\ \"acc_norm\": 0.86,\n \"acc_norm_stderr\": 0.0348735088019777\n },\n\
\ \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.536144578313253,\n\
\ \"acc_stderr\": 0.03882310850890594,\n \"acc_norm\": 0.536144578313253,\n\
\ \"acc_norm_stderr\": 0.03882310850890594\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8362573099415205,\n \"acc_stderr\": 0.028380919596145866,\n\
\ \"acc_norm\": 0.8362573099415205,\n \"acc_norm_stderr\": 0.028380919596145866\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.3488372093023256,\n\
\ \"mc1_stderr\": 0.016684419859986893,\n \"mc2\": 0.4966549787597113,\n\
\ \"mc2_stderr\": 0.015039415129128687\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7790055248618785,\n \"acc_stderr\": 0.011661223637643412\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.6087945413191812,\n \
\ \"acc_stderr\": 0.013442502402794302\n }\n}\n```"
repo_url: https://huggingface.co/beowolx/MistralHermes-CodePro-7B-v1
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_01_13T23_16_31.615360
path:
- '**/details_harness|arc:challenge|25_2024-01-13T23-16-31.615360.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-01-13T23-16-31.615360.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_01_13T23_16_31.615360
path:
- '**/details_harness|gsm8k|5_2024-01-13T23-16-31.615360.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-01-13T23-16-31.615360.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_01_13T23_16_31.615360
path:
- '**/details_harness|hellaswag|10_2024-01-13T23-16-31.615360.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-01-13T23-16-31.615360.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_01_13T23_16_31.615360
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-13T23-16-31.615360.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-13T23-16-31.615360.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-13T23-16-31.615360.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-13T23-16-31.615360.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-13T23-16-31.615360.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-13T23-16-31.615360.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-13T23-16-31.615360.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-13T23-16-31.615360.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-13T23-16-31.615360.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-13T23-16-31.615360.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-13T23-16-31.615360.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-13T23-16-31.615360.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-13T23-16-31.615360.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-13T23-16-31.615360.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-13T23-16-31.615360.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-13T23-16-31.615360.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-13T23-16-31.615360.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-13T23-16-31.615360.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-13T23-16-31.615360.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-13T23-16-31.615360.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-13T23-16-31.615360.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-13T23-16-31.615360.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-13T23-16-31.615360.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-13T23-16-31.615360.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-13T23-16-31.615360.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-13T23-16-31.615360.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-13T23-16-31.615360.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-13T23-16-31.615360.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-13T23-16-31.615360.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-13T23-16-31.615360.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-13T23-16-31.615360.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-13T23-16-31.615360.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-13T23-16-31.615360.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-13T23-16-31.615360.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-01-13T23-16-31.615360.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-13T23-16-31.615360.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-13T23-16-31.615360.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-13T23-16-31.615360.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-01-13T23-16-31.615360.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-01-13T23-16-31.615360.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-13T23-16-31.615360.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-13T23-16-31.615360.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-13T23-16-31.615360.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-13T23-16-31.615360.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-13T23-16-31.615360.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-13T23-16-31.615360.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-13T23-16-31.615360.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-13T23-16-31.615360.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-13T23-16-31.615360.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-13T23-16-31.615360.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-13T23-16-31.615360.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-13T23-16-31.615360.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-13T23-16-31.615360.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-01-13T23-16-31.615360.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-13T23-16-31.615360.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-01-13T23-16-31.615360.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-13T23-16-31.615360.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-13T23-16-31.615360.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-13T23-16-31.615360.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-13T23-16-31.615360.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-13T23-16-31.615360.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-13T23-16-31.615360.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-13T23-16-31.615360.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-13T23-16-31.615360.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-13T23-16-31.615360.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-13T23-16-31.615360.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-13T23-16-31.615360.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-13T23-16-31.615360.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-13T23-16-31.615360.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-13T23-16-31.615360.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-13T23-16-31.615360.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-13T23-16-31.615360.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-13T23-16-31.615360.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-13T23-16-31.615360.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-13T23-16-31.615360.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-13T23-16-31.615360.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-13T23-16-31.615360.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-13T23-16-31.615360.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-13T23-16-31.615360.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-13T23-16-31.615360.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-13T23-16-31.615360.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-13T23-16-31.615360.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-13T23-16-31.615360.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-13T23-16-31.615360.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-13T23-16-31.615360.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-13T23-16-31.615360.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-13T23-16-31.615360.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-13T23-16-31.615360.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-13T23-16-31.615360.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-13T23-16-31.615360.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-13T23-16-31.615360.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-01-13T23-16-31.615360.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-13T23-16-31.615360.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-13T23-16-31.615360.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-13T23-16-31.615360.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-01-13T23-16-31.615360.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-01-13T23-16-31.615360.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-13T23-16-31.615360.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-13T23-16-31.615360.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-13T23-16-31.615360.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-13T23-16-31.615360.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-13T23-16-31.615360.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-13T23-16-31.615360.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-13T23-16-31.615360.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-13T23-16-31.615360.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-13T23-16-31.615360.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-13T23-16-31.615360.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-13T23-16-31.615360.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-13T23-16-31.615360.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-13T23-16-31.615360.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-01-13T23-16-31.615360.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-13T23-16-31.615360.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-01-13T23-16-31.615360.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-13T23-16-31.615360.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_01_13T23_16_31.615360
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-13T23-16-31.615360.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-13T23-16-31.615360.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_01_13T23_16_31.615360
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-13T23-16-31.615360.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-13T23-16-31.615360.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_01_13T23_16_31.615360
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-13T23-16-31.615360.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-13T23-16-31.615360.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_01_13T23_16_31.615360
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-13T23-16-31.615360.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-13T23-16-31.615360.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_01_13T23_16_31.615360
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-13T23-16-31.615360.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-13T23-16-31.615360.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_01_13T23_16_31.615360
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-13T23-16-31.615360.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-13T23-16-31.615360.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_01_13T23_16_31.615360
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-13T23-16-31.615360.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-13T23-16-31.615360.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_01_13T23_16_31.615360
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-13T23-16-31.615360.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-13T23-16-31.615360.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_01_13T23_16_31.615360
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-13T23-16-31.615360.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-13T23-16-31.615360.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_01_13T23_16_31.615360
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-13T23-16-31.615360.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-13T23-16-31.615360.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_01_13T23_16_31.615360
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-13T23-16-31.615360.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-13T23-16-31.615360.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_01_13T23_16_31.615360
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-13T23-16-31.615360.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-13T23-16-31.615360.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_01_13T23_16_31.615360
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-13T23-16-31.615360.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-13T23-16-31.615360.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_01_13T23_16_31.615360
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-13T23-16-31.615360.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-13T23-16-31.615360.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_01_13T23_16_31.615360
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-13T23-16-31.615360.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-13T23-16-31.615360.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_01_13T23_16_31.615360
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-13T23-16-31.615360.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-13T23-16-31.615360.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_01_13T23_16_31.615360
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-13T23-16-31.615360.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-13T23-16-31.615360.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_01_13T23_16_31.615360
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-13T23-16-31.615360.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-13T23-16-31.615360.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_01_13T23_16_31.615360
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-13T23-16-31.615360.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-13T23-16-31.615360.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_01_13T23_16_31.615360
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-13T23-16-31.615360.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-13T23-16-31.615360.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_01_13T23_16_31.615360
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-13T23-16-31.615360.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-13T23-16-31.615360.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_01_13T23_16_31.615360
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-13T23-16-31.615360.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-13T23-16-31.615360.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_01_13T23_16_31.615360
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-13T23-16-31.615360.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-13T23-16-31.615360.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_01_13T23_16_31.615360
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-13T23-16-31.615360.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-13T23-16-31.615360.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_01_13T23_16_31.615360
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-13T23-16-31.615360.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-13T23-16-31.615360.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_01_13T23_16_31.615360
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-13T23-16-31.615360.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-13T23-16-31.615360.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_01_13T23_16_31.615360
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-13T23-16-31.615360.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-13T23-16-31.615360.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_01_13T23_16_31.615360
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-13T23-16-31.615360.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-13T23-16-31.615360.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_01_13T23_16_31.615360
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-13T23-16-31.615360.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-13T23-16-31.615360.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_01_13T23_16_31.615360
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-13T23-16-31.615360.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-13T23-16-31.615360.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_01_13T23_16_31.615360
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-13T23-16-31.615360.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-13T23-16-31.615360.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_01_13T23_16_31.615360
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-13T23-16-31.615360.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-13T23-16-31.615360.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_01_13T23_16_31.615360
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-13T23-16-31.615360.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-13T23-16-31.615360.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_01_13T23_16_31.615360
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-13T23-16-31.615360.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-13T23-16-31.615360.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_01_13T23_16_31.615360
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-01-13T23-16-31.615360.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-01-13T23-16-31.615360.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_01_13T23_16_31.615360
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-13T23-16-31.615360.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-13T23-16-31.615360.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_01_13T23_16_31.615360
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-13T23-16-31.615360.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-13T23-16-31.615360.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_01_13T23_16_31.615360
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-13T23-16-31.615360.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-13T23-16-31.615360.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_01_13T23_16_31.615360
path:
- '**/details_harness|hendrycksTest-management|5_2024-01-13T23-16-31.615360.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-01-13T23-16-31.615360.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_01_13T23_16_31.615360
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-01-13T23-16-31.615360.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-01-13T23-16-31.615360.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_01_13T23_16_31.615360
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-13T23-16-31.615360.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-13T23-16-31.615360.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_01_13T23_16_31.615360
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-13T23-16-31.615360.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-13T23-16-31.615360.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_01_13T23_16_31.615360
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-13T23-16-31.615360.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-13T23-16-31.615360.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_01_13T23_16_31.615360
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-13T23-16-31.615360.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-13T23-16-31.615360.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_01_13T23_16_31.615360
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-13T23-16-31.615360.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-13T23-16-31.615360.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_01_13T23_16_31.615360
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-13T23-16-31.615360.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-13T23-16-31.615360.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_01_13T23_16_31.615360
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-13T23-16-31.615360.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-13T23-16-31.615360.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_01_13T23_16_31.615360
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-13T23-16-31.615360.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-13T23-16-31.615360.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_01_13T23_16_31.615360
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-13T23-16-31.615360.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-13T23-16-31.615360.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_01_13T23_16_31.615360
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-13T23-16-31.615360.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-13T23-16-31.615360.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_01_13T23_16_31.615360
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-13T23-16-31.615360.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-13T23-16-31.615360.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_01_13T23_16_31.615360
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-13T23-16-31.615360.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-13T23-16-31.615360.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_01_13T23_16_31.615360
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-13T23-16-31.615360.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-13T23-16-31.615360.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_01_13T23_16_31.615360
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-01-13T23-16-31.615360.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-01-13T23-16-31.615360.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_01_13T23_16_31.615360
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-13T23-16-31.615360.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-13T23-16-31.615360.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_01_13T23_16_31.615360
path:
- '**/details_harness|hendrycksTest-virology|5_2024-01-13T23-16-31.615360.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-01-13T23-16-31.615360.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_01_13T23_16_31.615360
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-13T23-16-31.615360.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-13T23-16-31.615360.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_01_13T23_16_31.615360
path:
- '**/details_harness|truthfulqa:mc|0_2024-01-13T23-16-31.615360.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-01-13T23-16-31.615360.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_01_13T23_16_31.615360
path:
- '**/details_harness|winogrande|5_2024-01-13T23-16-31.615360.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-01-13T23-16-31.615360.parquet'
- config_name: results
data_files:
- split: 2024_01_13T23_16_31.615360
path:
- results_2024-01-13T23-16-31.615360.parquet
- split: latest
path:
- results_2024-01-13T23-16-31.615360.parquet
---
# Dataset Card for Evaluation run of beowolx/MistralHermes-CodePro-7B-v1
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [beowolx/MistralHermes-CodePro-7B-v1](https://huggingface.co/beowolx/MistralHermes-CodePro-7B-v1) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_beowolx__MistralHermes-CodePro-7B-v1",
"harness_winogrande_5",
split="train")
```
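Each run's split name is derived from its timestamp. As a minimal sketch, the mapping below is inferred from the file names in this repository (it is a naming convention observed in the data files above, not an official API):

```python
# Sketch: derive the per-run split name and parquet file stem from a run
# timestamp, following the pattern visible in the data_files listings above.
# This mapping is inferred from this repository's layout, not an official API.

def run_timestamp_to_split(ts: str) -> str:
    """'2024-01-13T23:16:31.615360' -> '2024_01_13T23_16_31.615360'"""
    return ts.replace("-", "_").replace(":", "_")

def run_timestamp_to_file_stem(ts: str) -> str:
    """'2024-01-13T23:16:31.615360' -> '2024-01-13T23-16-31.615360'"""
    return ts.replace(":", "-")

ts = "2024-01-13T23:16:31.615360"
print(run_timestamp_to_split(ts))      # 2024_01_13T23_16_31.615360
print(run_timestamp_to_file_stem(ts))  # 2024-01-13T23-16-31.615360
```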
## Latest results
These are the [latest results from run 2024-01-13T23:16:31.615360](https://huggingface.co/datasets/open-llm-leaderboard/details_beowolx__MistralHermes-CodePro-7B-v1/blob/main/results_2024-01-13T23-16-31.615360.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6355378468432605,
"acc_stderr": 0.03226341558486178,
"acc_norm": 0.6374815210840533,
"acc_norm_stderr": 0.03291019935178123,
"mc1": 0.3488372093023256,
"mc1_stderr": 0.016684419859986893,
"mc2": 0.4966549787597113,
"mc2_stderr": 0.015039415129128687
},
"harness|arc:challenge|25": {
"acc": 0.590443686006826,
"acc_stderr": 0.014370358632472435,
"acc_norm": 0.6245733788395904,
"acc_norm_stderr": 0.01415063143511173
},
"harness|hellaswag|10": {
"acc": 0.629555865365465,
"acc_stderr": 0.004819367172685959,
"acc_norm": 0.8268273252340171,
"acc_norm_stderr": 0.0037762314890081123
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.35,
"acc_stderr": 0.04793724854411022,
"acc_norm": 0.35,
"acc_norm_stderr": 0.04793724854411022
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.5851851851851851,
"acc_stderr": 0.04256193767901408,
"acc_norm": 0.5851851851851851,
"acc_norm_stderr": 0.04256193767901408
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6907894736842105,
"acc_stderr": 0.037610708698674805,
"acc_norm": 0.6907894736842105,
"acc_norm_stderr": 0.037610708698674805
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.57,
"acc_stderr": 0.049756985195624284,
"acc_norm": 0.57,
"acc_norm_stderr": 0.049756985195624284
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6641509433962264,
"acc_stderr": 0.02906722014664483,
"acc_norm": 0.6641509433962264,
"acc_norm_stderr": 0.02906722014664483
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7638888888888888,
"acc_stderr": 0.03551446610810826,
"acc_norm": 0.7638888888888888,
"acc_norm_stderr": 0.03551446610810826
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.48,
"acc_stderr": 0.050211673156867795,
"acc_norm": 0.48,
"acc_norm_stderr": 0.050211673156867795
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.45,
"acc_stderr": 0.05,
"acc_norm": 0.45,
"acc_norm_stderr": 0.05
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.33,
"acc_stderr": 0.047258156262526045,
"acc_norm": 0.33,
"acc_norm_stderr": 0.047258156262526045
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6069364161849711,
"acc_stderr": 0.0372424959581773,
"acc_norm": 0.6069364161849711,
"acc_norm_stderr": 0.0372424959581773
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.37254901960784315,
"acc_stderr": 0.04810840148082636,
"acc_norm": 0.37254901960784315,
"acc_norm_stderr": 0.04810840148082636
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.75,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.75,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5617021276595745,
"acc_stderr": 0.03243618636108101,
"acc_norm": 0.5617021276595745,
"acc_norm_stderr": 0.03243618636108101
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.4824561403508772,
"acc_stderr": 0.04700708033551038,
"acc_norm": 0.4824561403508772,
"acc_norm_stderr": 0.04700708033551038
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5172413793103449,
"acc_stderr": 0.04164188720169375,
"acc_norm": 0.5172413793103449,
"acc_norm_stderr": 0.04164188720169375
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.4365079365079365,
"acc_stderr": 0.0255428468174005,
"acc_norm": 0.4365079365079365,
"acc_norm_stderr": 0.0255428468174005
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.49206349206349204,
"acc_stderr": 0.044715725362943486,
"acc_norm": 0.49206349206349204,
"acc_norm_stderr": 0.044715725362943486
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.33,
"acc_stderr": 0.047258156262526045,
"acc_norm": 0.33,
"acc_norm_stderr": 0.047258156262526045
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7967741935483871,
"acc_stderr": 0.02289168798455495,
"acc_norm": 0.7967741935483871,
"acc_norm_stderr": 0.02289168798455495
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5073891625615764,
"acc_stderr": 0.035176035403610105,
"acc_norm": 0.5073891625615764,
"acc_norm_stderr": 0.035176035403610105
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.65,
"acc_stderr": 0.04793724854411019,
"acc_norm": 0.65,
"acc_norm_stderr": 0.04793724854411019
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.793939393939394,
"acc_stderr": 0.03158415324047711,
"acc_norm": 0.793939393939394,
"acc_norm_stderr": 0.03158415324047711
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7929292929292929,
"acc_stderr": 0.028869778460267042,
"acc_norm": 0.7929292929292929,
"acc_norm_stderr": 0.028869778460267042
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8963730569948186,
"acc_stderr": 0.02199531196364424,
"acc_norm": 0.8963730569948186,
"acc_norm_stderr": 0.02199531196364424
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6205128205128205,
"acc_stderr": 0.02460362692409742,
"acc_norm": 0.6205128205128205,
"acc_norm_stderr": 0.02460362692409742
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.3148148148148148,
"acc_stderr": 0.02831753349606648,
"acc_norm": 0.3148148148148148,
"acc_norm_stderr": 0.02831753349606648
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6722689075630253,
"acc_stderr": 0.03048991141767323,
"acc_norm": 0.6722689075630253,
"acc_norm_stderr": 0.03048991141767323
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.31125827814569534,
"acc_stderr": 0.03780445850526732,
"acc_norm": 0.31125827814569534,
"acc_norm_stderr": 0.03780445850526732
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8311926605504587,
"acc_stderr": 0.01606005626853034,
"acc_norm": 0.8311926605504587,
"acc_norm_stderr": 0.01606005626853034
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5185185185185185,
"acc_stderr": 0.03407632093854051,
"acc_norm": 0.5185185185185185,
"acc_norm_stderr": 0.03407632093854051
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7941176470588235,
"acc_stderr": 0.028379449451588663,
"acc_norm": 0.7941176470588235,
"acc_norm_stderr": 0.028379449451588663
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.810126582278481,
"acc_stderr": 0.025530100460233494,
"acc_norm": 0.810126582278481,
"acc_norm_stderr": 0.025530100460233494
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.7040358744394619,
"acc_stderr": 0.030636591348699796,
"acc_norm": 0.7040358744394619,
"acc_norm_stderr": 0.030636591348699796
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7862595419847328,
"acc_stderr": 0.0359546161177469,
"acc_norm": 0.7862595419847328,
"acc_norm_stderr": 0.0359546161177469
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.8016528925619835,
"acc_stderr": 0.03640118271990946,
"acc_norm": 0.8016528925619835,
"acc_norm_stderr": 0.03640118271990946
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7777777777777778,
"acc_stderr": 0.0401910747255735,
"acc_norm": 0.7777777777777778,
"acc_norm_stderr": 0.0401910747255735
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7914110429447853,
"acc_stderr": 0.031921934489347235,
"acc_norm": 0.7914110429447853,
"acc_norm_stderr": 0.031921934489347235
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.49107142857142855,
"acc_stderr": 0.04745033255489123,
"acc_norm": 0.49107142857142855,
"acc_norm_stderr": 0.04745033255489123
},
"harness|hendrycksTest-management|5": {
"acc": 0.7961165048543689,
"acc_stderr": 0.039891398595317706,
"acc_norm": 0.7961165048543689,
"acc_norm_stderr": 0.039891398595317706
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8589743589743589,
"acc_stderr": 0.02280138253459753,
"acc_norm": 0.8589743589743589,
"acc_norm_stderr": 0.02280138253459753
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.67,
"acc_stderr": 0.04725815626252609,
"acc_norm": 0.67,
"acc_norm_stderr": 0.04725815626252609
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8173690932311622,
"acc_stderr": 0.013816335389973136,
"acc_norm": 0.8173690932311622,
"acc_norm_stderr": 0.013816335389973136
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7283236994219653,
"acc_stderr": 0.023948512905468348,
"acc_norm": 0.7283236994219653,
"acc_norm_stderr": 0.023948512905468348
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.2636871508379888,
"acc_stderr": 0.014736926383761974,
"acc_norm": 0.2636871508379888,
"acc_norm_stderr": 0.014736926383761974
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7483660130718954,
"acc_stderr": 0.024848018263875195,
"acc_norm": 0.7483660130718954,
"acc_norm_stderr": 0.024848018263875195
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7041800643086816,
"acc_stderr": 0.02592237178881877,
"acc_norm": 0.7041800643086816,
"acc_norm_stderr": 0.02592237178881877
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7314814814814815,
"acc_stderr": 0.024659685185967294,
"acc_norm": 0.7314814814814815,
"acc_norm_stderr": 0.024659685185967294
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.48226950354609927,
"acc_stderr": 0.02980873964223777,
"acc_norm": 0.48226950354609927,
"acc_norm_stderr": 0.02980873964223777
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4680573663624511,
"acc_stderr": 0.012744149704869647,
"acc_norm": 0.4680573663624511,
"acc_norm_stderr": 0.012744149704869647
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6727941176470589,
"acc_stderr": 0.028501452860396556,
"acc_norm": 0.6727941176470589,
"acc_norm_stderr": 0.028501452860396556
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6650326797385621,
"acc_stderr": 0.019094228167000318,
"acc_norm": 0.6650326797385621,
"acc_norm_stderr": 0.019094228167000318
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6545454545454545,
"acc_stderr": 0.04554619617541054,
"acc_norm": 0.6545454545454545,
"acc_norm_stderr": 0.04554619617541054
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7346938775510204,
"acc_stderr": 0.028263889943784593,
"acc_norm": 0.7346938775510204,
"acc_norm_stderr": 0.028263889943784593
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.835820895522388,
"acc_stderr": 0.02619392354445412,
"acc_norm": 0.835820895522388,
"acc_norm_stderr": 0.02619392354445412
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.86,
"acc_stderr": 0.0348735088019777,
"acc_norm": 0.86,
"acc_norm_stderr": 0.0348735088019777
},
"harness|hendrycksTest-virology|5": {
"acc": 0.536144578313253,
"acc_stderr": 0.03882310850890594,
"acc_norm": 0.536144578313253,
"acc_norm_stderr": 0.03882310850890594
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8362573099415205,
"acc_stderr": 0.028380919596145866,
"acc_norm": 0.8362573099415205,
"acc_norm_stderr": 0.028380919596145866
},
"harness|truthfulqa:mc|0": {
"mc1": 0.3488372093023256,
"mc1_stderr": 0.016684419859986893,
"mc2": 0.4966549787597113,
"mc2_stderr": 0.015039415129128687
},
"harness|winogrande|5": {
"acc": 0.7790055248618785,
"acc_stderr": 0.011661223637643412
},
"harness|gsm8k|5": {
"acc": 0.6087945413191812,
"acc_stderr": 0.013442502402794302
}
}
```
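The results dict above stores one entry per task, keyed as `harness|<task>|<n_shot>`. A hypothetical helper for averaging `acc` over the MMLU (`hendrycksTest`) tasks is sketched below on a small fabricated subset of that structure:

```python
# Sketch: unweighted mean of "acc" over hendrycksTest tasks.
# The `results` dict here is a fabricated subset mirroring the JSON above;
# in practice you would load the full results_*.json file instead.

results = {
    "harness|hendrycksTest-marketing|5": {"acc": 0.8589743589743589},
    "harness|hendrycksTest-virology|5": {"acc": 0.536144578313253},
    "harness|winogrande|5": {"acc": 0.7790055248618785},  # excluded from MMLU mean
}

mmlu = [v["acc"] for k, v in results.items() if "hendrycksTest" in k]
mean_acc = sum(mmlu) / len(mmlu)
print(round(mean_acc, 4))  # → 0.6976
```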
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
giux78/50000-60900-ultrafeedback-binarized-preferences-cleaned-ita | ---
dataset_info:
features:
- name: source
dtype: string
- name: prompt
dtype: string
- name: chosen
list:
- name: content
dtype: string
- name: role
dtype: string
- name: chosen-rating
dtype: float64
- name: chosen-model
dtype: string
- name: rejected
list:
- name: content
dtype: string
- name: role
dtype: string
- name: rejected-rating
dtype: float64
- name: rejected-model
dtype: string
splits:
- name: train
num_bytes: 97100085
num_examples: 10900
download_size: 48433446
dataset_size: 97100085
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "50000-60900-ultrafeedback-binarized-preferences-cleaned-ita"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Devendarreddy/guanaco-llama2-1k | ---
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 1654448
num_examples: 1000
download_size: 966692
dataset_size: 1654448
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
Thouph/formatted | ---
license: wtfpl
---
|
itt0lp/sabrinacarpenter | ---
license: openrail
---
|
FudanSELab/CodeGen4Libs_RetrievalCodeLib | ---
license: mit
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
dataset_info:
features:
- name: method
dtype: string
- name: clean_method
dtype: string
- name: doc
dtype: string
- name: comment
dtype: string
- name: method_name
dtype: string
- name: extra
struct:
- name: repo_name
dtype: string
- name: path
dtype: string
- name: license
dtype: string
- name: size
dtype: int64
- name: imports
sequence: string
- name: imports_info
dtype: string
- name: cluster_imports_info
dtype: string
- name: libraries
sequence: string
- name: libraries_info
dtype: string
- name: id
dtype: int64
splits:
- name: train
num_bytes: 5373034269
num_examples: 2916582
download_size: 2492962682
dataset_size: 5373034269
tags:
- code-generation
pretty_name: 'CodeGen4Libs '
size_categories:
- 1M<n<10M
---
# Dataset Card for FudanSELab CodeGen4Libs Code Retrieval Library
## Dataset Description
- **Repository:** [GitHub Repository](https://github.com/FudanSELab/codegen4libs)
- **Paper:** [CodeGen4Libs: A Two-stage Approach for Library-oriented Code Generation](https://mingwei-liu.github.io/publication/2023-08-18-ase-CodeGen4Libs)
### Dataset Summary
This dataset is the code retrieval library used in the ASE2023 paper titled ["CodeGen4Libs: A Two-stage Approach for Library-oriented Code Generation"](https://mingwei-liu.github.io/publication/2023-08-18-ase-CodeGen4Libs).
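Records follow the schema given in the `dataset_info` frontmatter above (`method`, `clean_method`, `libraries`, etc.). A hypothetical filtering sketch over that schema is shown below; the sample record is fabricated for illustration, and real records would come from `load_dataset("FudanSELab/CodeGen4Libs_RetrievalCodeLib")`:

```python
# Sketch: select retrieval-library records by their "libraries" field.
# The sample record below is fabricated for illustration only; field names
# follow the dataset_info schema declared in the card's frontmatter.

sample = {
    "method_name": "readAllLines",
    "clean_method": "List<String> readAllLines(Path p) { ... }",
    "libraries": ["java.nio", "java.util"],
}

def uses_library(record: dict, library: str) -> bool:
    """True if the record declares the given library."""
    return library in record.get("libraries", [])

print(uses_library(sample, "java.nio"))  # → True
```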
## Additional Information
### Citation Information
```
@inproceedings{ase2023codegen4libs,
author = {Mingwei Liu and Tianyong Yang and Yiling Lou and Xueying Du and Ying Wang and Xin Peng},
title = {{CodeGen4Libs}: A Two-stage Approach for Library-oriented Code Generation},
booktitle = {38th {IEEE/ACM} International Conference on Automated Software Engineering,
{ASE} 2023, Kirchberg, Luxembourg, September 11-15, 2023},
pages = {0--0},
publisher = {{IEEE}},
year = {2023},
}
``` |
open-llm-leaderboard/details_krevas__SOLAR-10.7B | ---
pretty_name: Evaluation run of krevas/SOLAR-10.7B
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [krevas/SOLAR-10.7B](https://huggingface.co/krevas/SOLAR-10.7B) on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_krevas__SOLAR-10.7B\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-03-30T19:04:44.877346](https://huggingface.co/datasets/open-llm-leaderboard/details_krevas__SOLAR-10.7B/blob/main/results_2024-03-30T19-04-44.877346.json)(note\
\ that their might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6256384530384593,\n\
\ \"acc_stderr\": 0.03222174705595155,\n \"acc_norm\": 0.6357523261247408,\n\
\ \"acc_norm_stderr\": 0.033105283857181055,\n \"mc1\": 0.6964504283965728,\n\
\ \"mc1_stderr\": 0.016095884155386847,\n \"mc2\": 0.8032546914700353,\n\
\ \"mc2_stderr\": 0.013304123687885194\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.7141638225255973,\n \"acc_stderr\": 0.01320319608853737,\n\
\ \"acc_norm\": 0.7431740614334471,\n \"acc_norm_stderr\": 0.012766923794116798\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.7484564827723561,\n\
\ \"acc_stderr\": 0.004330134219762844,\n \"acc_norm\": 0.8904600677155945,\n\
\ \"acc_norm_stderr\": 0.003116771577319422\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695235,\n \
\ \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695235\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.5481481481481482,\n\
\ \"acc_stderr\": 0.04299268905480864,\n \"acc_norm\": 0.5481481481481482,\n\
\ \"acc_norm_stderr\": 0.04299268905480864\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.7828947368421053,\n \"acc_stderr\": 0.033550453048829226,\n\
\ \"acc_norm\": 0.7828947368421053,\n \"acc_norm_stderr\": 0.033550453048829226\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.71,\n\
\ \"acc_stderr\": 0.04560480215720683,\n \"acc_norm\": 0.71,\n \
\ \"acc_norm_stderr\": 0.04560480215720683\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.6226415094339622,\n \"acc_stderr\": 0.029832808114796,\n\
\ \"acc_norm\": 0.6226415094339622,\n \"acc_norm_stderr\": 0.029832808114796\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7430555555555556,\n\
\ \"acc_stderr\": 0.03653946969442099,\n \"acc_norm\": 0.7430555555555556,\n\
\ \"acc_norm_stderr\": 0.03653946969442099\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.41,\n \"acc_stderr\": 0.049431107042371025,\n \
\ \"acc_norm\": 0.41,\n \"acc_norm_stderr\": 0.049431107042371025\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"\
acc\": 0.49,\n \"acc_stderr\": 0.05024183937956912,\n \"acc_norm\"\
: 0.49,\n \"acc_norm_stderr\": 0.05024183937956912\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.32,\n \"acc_stderr\": 0.04688261722621504,\n \
\ \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.04688261722621504\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.5953757225433526,\n\
\ \"acc_stderr\": 0.03742461193887248,\n \"acc_norm\": 0.5953757225433526,\n\
\ \"acc_norm_stderr\": 0.03742461193887248\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.38235294117647056,\n \"acc_stderr\": 0.04835503696107224,\n\
\ \"acc_norm\": 0.38235294117647056,\n \"acc_norm_stderr\": 0.04835503696107224\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.72,\n \"acc_stderr\": 0.04512608598542128,\n \"acc_norm\": 0.72,\n\
\ \"acc_norm_stderr\": 0.04512608598542128\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.5617021276595745,\n \"acc_stderr\": 0.03243618636108101,\n\
\ \"acc_norm\": 0.5617021276595745,\n \"acc_norm_stderr\": 0.03243618636108101\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.5087719298245614,\n\
\ \"acc_stderr\": 0.04702880432049615,\n \"acc_norm\": 0.5087719298245614,\n\
\ \"acc_norm_stderr\": 0.04702880432049615\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5379310344827586,\n \"acc_stderr\": 0.04154659671707548,\n\
\ \"acc_norm\": 0.5379310344827586,\n \"acc_norm_stderr\": 0.04154659671707548\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.41798941798941797,\n \"acc_stderr\": 0.025402555503260912,\n \"\
acc_norm\": 0.41798941798941797,\n \"acc_norm_stderr\": 0.025402555503260912\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.42063492063492064,\n\
\ \"acc_stderr\": 0.04415438226743744,\n \"acc_norm\": 0.42063492063492064,\n\
\ \"acc_norm_stderr\": 0.04415438226743744\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.29,\n \"acc_stderr\": 0.045604802157206845,\n \
\ \"acc_norm\": 0.29,\n \"acc_norm_stderr\": 0.045604802157206845\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\"\
: 0.7774193548387097,\n \"acc_stderr\": 0.023664216671642525,\n \"\
acc_norm\": 0.7774193548387097,\n \"acc_norm_stderr\": 0.023664216671642525\n\
\ },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\"\
: 0.47783251231527096,\n \"acc_stderr\": 0.035145285621750094,\n \"\
acc_norm\": 0.47783251231527096,\n \"acc_norm_stderr\": 0.035145285621750094\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.66,\n \"acc_stderr\": 0.04760952285695237,\n \"acc_norm\"\
: 0.66,\n \"acc_norm_stderr\": 0.04760952285695237\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.8,\n \"acc_stderr\": 0.031234752377721175,\n \
\ \"acc_norm\": 0.8,\n \"acc_norm_stderr\": 0.031234752377721175\n \
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.8585858585858586,\n \"acc_stderr\": 0.02482590979334334,\n \"\
acc_norm\": 0.8585858585858586,\n \"acc_norm_stderr\": 0.02482590979334334\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.8704663212435233,\n \"acc_stderr\": 0.02423353229775873,\n\
\ \"acc_norm\": 0.8704663212435233,\n \"acc_norm_stderr\": 0.02423353229775873\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.6461538461538462,\n \"acc_stderr\": 0.02424378399406217,\n \
\ \"acc_norm\": 0.6461538461538462,\n \"acc_norm_stderr\": 0.02424378399406217\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.362962962962963,\n \"acc_stderr\": 0.02931820364520686,\n \
\ \"acc_norm\": 0.362962962962963,\n \"acc_norm_stderr\": 0.02931820364520686\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.680672268907563,\n \"acc_stderr\": 0.0302839955258844,\n \
\ \"acc_norm\": 0.680672268907563,\n \"acc_norm_stderr\": 0.0302839955258844\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.33774834437086093,\n \"acc_stderr\": 0.038615575462551704,\n \"\
acc_norm\": 0.33774834437086093,\n \"acc_norm_stderr\": 0.038615575462551704\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8165137614678899,\n \"acc_stderr\": 0.01659525971039932,\n \"\
acc_norm\": 0.8165137614678899,\n \"acc_norm_stderr\": 0.01659525971039932\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.5555555555555556,\n \"acc_stderr\": 0.033888571185023246,\n \"\
acc_norm\": 0.5555555555555556,\n \"acc_norm_stderr\": 0.033888571185023246\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.8627450980392157,\n \"acc_stderr\": 0.024152225962801584,\n \"\
acc_norm\": 0.8627450980392157,\n \"acc_norm_stderr\": 0.024152225962801584\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.8354430379746836,\n \"acc_stderr\": 0.024135736240566932,\n \
\ \"acc_norm\": 0.8354430379746836,\n \"acc_norm_stderr\": 0.024135736240566932\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6905829596412556,\n\
\ \"acc_stderr\": 0.031024411740572206,\n \"acc_norm\": 0.6905829596412556,\n\
\ \"acc_norm_stderr\": 0.031024411740572206\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.6183206106870229,\n \"acc_stderr\": 0.04260735157644561,\n\
\ \"acc_norm\": 0.6183206106870229,\n \"acc_norm_stderr\": 0.04260735157644561\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.7107438016528925,\n \"acc_stderr\": 0.04139112727635463,\n \"\
acc_norm\": 0.7107438016528925,\n \"acc_norm_stderr\": 0.04139112727635463\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7962962962962963,\n\
\ \"acc_stderr\": 0.03893542518824847,\n \"acc_norm\": 0.7962962962962963,\n\
\ \"acc_norm_stderr\": 0.03893542518824847\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7607361963190185,\n \"acc_stderr\": 0.033519538795212696,\n\
\ \"acc_norm\": 0.7607361963190185,\n \"acc_norm_stderr\": 0.033519538795212696\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.4017857142857143,\n\
\ \"acc_stderr\": 0.04653333146973646,\n \"acc_norm\": 0.4017857142857143,\n\
\ \"acc_norm_stderr\": 0.04653333146973646\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7475728155339806,\n \"acc_stderr\": 0.04301250399690878,\n\
\ \"acc_norm\": 0.7475728155339806,\n \"acc_norm_stderr\": 0.04301250399690878\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8418803418803419,\n\
\ \"acc_stderr\": 0.023902325549560392,\n \"acc_norm\": 0.8418803418803419,\n\
\ \"acc_norm_stderr\": 0.023902325549560392\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.66,\n \"acc_stderr\": 0.04760952285695237,\n \
\ \"acc_norm\": 0.66,\n \"acc_norm_stderr\": 0.04760952285695237\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8071519795657727,\n\
\ \"acc_stderr\": 0.014108533515757431,\n \"acc_norm\": 0.8071519795657727,\n\
\ \"acc_norm_stderr\": 0.014108533515757431\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.684971098265896,\n \"acc_stderr\": 0.0250093137900697,\n\
\ \"acc_norm\": 0.684971098265896,\n \"acc_norm_stderr\": 0.0250093137900697\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.3865921787709497,\n\
\ \"acc_stderr\": 0.016286674879101026,\n \"acc_norm\": 0.3865921787709497,\n\
\ \"acc_norm_stderr\": 0.016286674879101026\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.7156862745098039,\n \"acc_stderr\": 0.02582916327275748,\n\
\ \"acc_norm\": 0.7156862745098039,\n \"acc_norm_stderr\": 0.02582916327275748\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.662379421221865,\n\
\ \"acc_stderr\": 0.026858825879488533,\n \"acc_norm\": 0.662379421221865,\n\
\ \"acc_norm_stderr\": 0.026858825879488533\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.7376543209876543,\n \"acc_stderr\": 0.02447722285613513,\n\
\ \"acc_norm\": 0.7376543209876543,\n \"acc_norm_stderr\": 0.02447722285613513\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.4858156028368794,\n \"acc_stderr\": 0.02981549448368206,\n \
\ \"acc_norm\": 0.4858156028368794,\n \"acc_norm_stderr\": 0.02981549448368206\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.5071707953063885,\n\
\ \"acc_stderr\": 0.012768922739553308,\n \"acc_norm\": 0.5071707953063885,\n\
\ \"acc_norm_stderr\": 0.012768922739553308\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.6470588235294118,\n \"acc_stderr\": 0.029029422815681404,\n\
\ \"acc_norm\": 0.6470588235294118,\n \"acc_norm_stderr\": 0.029029422815681404\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6666666666666666,\n \"acc_stderr\": 0.0190709855896875,\n \
\ \"acc_norm\": 0.6666666666666666,\n \"acc_norm_stderr\": 0.0190709855896875\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6818181818181818,\n\
\ \"acc_stderr\": 0.04461272175910508,\n \"acc_norm\": 0.6818181818181818,\n\
\ \"acc_norm_stderr\": 0.04461272175910508\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.746938775510204,\n \"acc_stderr\": 0.02783302387139968,\n\
\ \"acc_norm\": 0.746938775510204,\n \"acc_norm_stderr\": 0.02783302387139968\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8009950248756219,\n\
\ \"acc_stderr\": 0.028231365092758406,\n \"acc_norm\": 0.8009950248756219,\n\
\ \"acc_norm_stderr\": 0.028231365092758406\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.83,\n \"acc_stderr\": 0.0377525168068637,\n \
\ \"acc_norm\": 0.83,\n \"acc_norm_stderr\": 0.0377525168068637\n },\n\
\ \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5421686746987951,\n\
\ \"acc_stderr\": 0.0387862677100236,\n \"acc_norm\": 0.5421686746987951,\n\
\ \"acc_norm_stderr\": 0.0387862677100236\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8011695906432749,\n \"acc_stderr\": 0.030611116557432528,\n\
\ \"acc_norm\": 0.8011695906432749,\n \"acc_norm_stderr\": 0.030611116557432528\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.6964504283965728,\n\
\ \"mc1_stderr\": 0.016095884155386847,\n \"mc2\": 0.8032546914700353,\n\
\ \"mc2_stderr\": 0.013304123687885194\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.8255722178374112,\n \"acc_stderr\": 0.010665187902498431\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.0,\n \"acc_stderr\"\
: 0.0\n }\n}\n```"
repo_url: https://huggingface.co/krevas/SOLAR-10.7B
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_03_30T19_04_44.877346
path:
- '**/details_harness|arc:challenge|25_2024-03-30T19-04-44.877346.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-03-30T19-04-44.877346.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_03_30T19_04_44.877346
path:
- '**/details_harness|gsm8k|5_2024-03-30T19-04-44.877346.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-03-30T19-04-44.877346.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_03_30T19_04_44.877346
path:
- '**/details_harness|hellaswag|10_2024-03-30T19-04-44.877346.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-03-30T19-04-44.877346.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_03_30T19_04_44.877346
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-30T19-04-44.877346.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-30T19-04-44.877346.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-30T19-04-44.877346.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-30T19-04-44.877346.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-30T19-04-44.877346.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-30T19-04-44.877346.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-30T19-04-44.877346.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-30T19-04-44.877346.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-30T19-04-44.877346.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-30T19-04-44.877346.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-30T19-04-44.877346.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-30T19-04-44.877346.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-30T19-04-44.877346.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-30T19-04-44.877346.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-30T19-04-44.877346.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-30T19-04-44.877346.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-30T19-04-44.877346.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-30T19-04-44.877346.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-30T19-04-44.877346.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-30T19-04-44.877346.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-30T19-04-44.877346.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-30T19-04-44.877346.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-30T19-04-44.877346.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-30T19-04-44.877346.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-30T19-04-44.877346.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-30T19-04-44.877346.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-30T19-04-44.877346.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-30T19-04-44.877346.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-30T19-04-44.877346.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-30T19-04-44.877346.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-30T19-04-44.877346.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-30T19-04-44.877346.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-30T19-04-44.877346.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-30T19-04-44.877346.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-30T19-04-44.877346.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-30T19-04-44.877346.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-30T19-04-44.877346.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-30T19-04-44.877346.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-30T19-04-44.877346.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-30T19-04-44.877346.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-30T19-04-44.877346.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-30T19-04-44.877346.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-30T19-04-44.877346.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-30T19-04-44.877346.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-30T19-04-44.877346.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-30T19-04-44.877346.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-30T19-04-44.877346.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-30T19-04-44.877346.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-30T19-04-44.877346.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-30T19-04-44.877346.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-30T19-04-44.877346.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-30T19-04-44.877346.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-30T19-04-44.877346.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-30T19-04-44.877346.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-30T19-04-44.877346.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-30T19-04-44.877346.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-30T19-04-44.877346.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-30T19-04-44.877346.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-30T19-04-44.877346.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-30T19-04-44.877346.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-30T19-04-44.877346.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-30T19-04-44.877346.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-30T19-04-44.877346.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-30T19-04-44.877346.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-30T19-04-44.877346.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-30T19-04-44.877346.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-30T19-04-44.877346.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-30T19-04-44.877346.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-30T19-04-44.877346.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-30T19-04-44.877346.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-30T19-04-44.877346.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-30T19-04-44.877346.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-30T19-04-44.877346.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-30T19-04-44.877346.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-30T19-04-44.877346.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-30T19-04-44.877346.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-30T19-04-44.877346.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-30T19-04-44.877346.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-30T19-04-44.877346.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-30T19-04-44.877346.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-30T19-04-44.877346.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-30T19-04-44.877346.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-30T19-04-44.877346.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-30T19-04-44.877346.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-30T19-04-44.877346.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-30T19-04-44.877346.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-30T19-04-44.877346.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-30T19-04-44.877346.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-30T19-04-44.877346.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-30T19-04-44.877346.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-30T19-04-44.877346.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-30T19-04-44.877346.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-30T19-04-44.877346.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-30T19-04-44.877346.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-30T19-04-44.877346.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-30T19-04-44.877346.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-30T19-04-44.877346.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-30T19-04-44.877346.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-30T19-04-44.877346.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-30T19-04-44.877346.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-30T19-04-44.877346.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-30T19-04-44.877346.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-30T19-04-44.877346.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-30T19-04-44.877346.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-30T19-04-44.877346.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-30T19-04-44.877346.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-30T19-04-44.877346.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-30T19-04-44.877346.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-30T19-04-44.877346.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-30T19-04-44.877346.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-30T19-04-44.877346.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-30T19-04-44.877346.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-30T19-04-44.877346.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-30T19-04-44.877346.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_03_30T19_04_44.877346
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-30T19-04-44.877346.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-30T19-04-44.877346.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_03_30T19_04_44.877346
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-30T19-04-44.877346.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-30T19-04-44.877346.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_03_30T19_04_44.877346
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-30T19-04-44.877346.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-30T19-04-44.877346.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_03_30T19_04_44.877346
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-30T19-04-44.877346.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-30T19-04-44.877346.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_03_30T19_04_44.877346
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-30T19-04-44.877346.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-30T19-04-44.877346.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_03_30T19_04_44.877346
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-30T19-04-44.877346.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-30T19-04-44.877346.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_03_30T19_04_44.877346
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-30T19-04-44.877346.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-30T19-04-44.877346.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_03_30T19_04_44.877346
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-30T19-04-44.877346.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-30T19-04-44.877346.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_03_30T19_04_44.877346
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-30T19-04-44.877346.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-30T19-04-44.877346.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_03_30T19_04_44.877346
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-30T19-04-44.877346.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-30T19-04-44.877346.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_03_30T19_04_44.877346
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-30T19-04-44.877346.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-30T19-04-44.877346.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_03_30T19_04_44.877346
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-30T19-04-44.877346.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-30T19-04-44.877346.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_03_30T19_04_44.877346
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-30T19-04-44.877346.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-30T19-04-44.877346.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_03_30T19_04_44.877346
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-30T19-04-44.877346.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-30T19-04-44.877346.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_03_30T19_04_44.877346
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-30T19-04-44.877346.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-30T19-04-44.877346.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_03_30T19_04_44.877346
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-30T19-04-44.877346.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-30T19-04-44.877346.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_03_30T19_04_44.877346
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-30T19-04-44.877346.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-30T19-04-44.877346.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_03_30T19_04_44.877346
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-30T19-04-44.877346.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-30T19-04-44.877346.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_03_30T19_04_44.877346
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-30T19-04-44.877346.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-30T19-04-44.877346.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_03_30T19_04_44.877346
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-30T19-04-44.877346.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-30T19-04-44.877346.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_03_30T19_04_44.877346
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-30T19-04-44.877346.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-30T19-04-44.877346.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_03_30T19_04_44.877346
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-30T19-04-44.877346.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-30T19-04-44.877346.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_03_30T19_04_44.877346
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-30T19-04-44.877346.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-30T19-04-44.877346.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_03_30T19_04_44.877346
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-30T19-04-44.877346.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-30T19-04-44.877346.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_03_30T19_04_44.877346
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-30T19-04-44.877346.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-30T19-04-44.877346.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_03_30T19_04_44.877346
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-30T19-04-44.877346.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-30T19-04-44.877346.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_03_30T19_04_44.877346
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-30T19-04-44.877346.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-30T19-04-44.877346.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_03_30T19_04_44.877346
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-30T19-04-44.877346.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-30T19-04-44.877346.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_03_30T19_04_44.877346
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-30T19-04-44.877346.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-30T19-04-44.877346.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_03_30T19_04_44.877346
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-30T19-04-44.877346.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-30T19-04-44.877346.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_03_30T19_04_44.877346
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-30T19-04-44.877346.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-30T19-04-44.877346.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_03_30T19_04_44.877346
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-30T19-04-44.877346.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-30T19-04-44.877346.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_03_30T19_04_44.877346
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-30T19-04-44.877346.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-30T19-04-44.877346.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_03_30T19_04_44.877346
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-30T19-04-44.877346.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-30T19-04-44.877346.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_03_30T19_04_44.877346
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-30T19-04-44.877346.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-30T19-04-44.877346.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_03_30T19_04_44.877346
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-30T19-04-44.877346.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-30T19-04-44.877346.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_03_30T19_04_44.877346
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-30T19-04-44.877346.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-30T19-04-44.877346.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_03_30T19_04_44.877346
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-30T19-04-44.877346.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-30T19-04-44.877346.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_03_30T19_04_44.877346
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-30T19-04-44.877346.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-30T19-04-44.877346.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_03_30T19_04_44.877346
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-30T19-04-44.877346.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-30T19-04-44.877346.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_03_30T19_04_44.877346
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-30T19-04-44.877346.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-30T19-04-44.877346.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_03_30T19_04_44.877346
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-30T19-04-44.877346.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-30T19-04-44.877346.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_03_30T19_04_44.877346
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-30T19-04-44.877346.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-30T19-04-44.877346.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_03_30T19_04_44.877346
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-30T19-04-44.877346.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-30T19-04-44.877346.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_03_30T19_04_44.877346
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-30T19-04-44.877346.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-30T19-04-44.877346.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_03_30T19_04_44.877346
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-30T19-04-44.877346.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-30T19-04-44.877346.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_03_30T19_04_44.877346
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-30T19-04-44.877346.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-30T19-04-44.877346.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_03_30T19_04_44.877346
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-30T19-04-44.877346.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-30T19-04-44.877346.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_03_30T19_04_44.877346
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-30T19-04-44.877346.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-30T19-04-44.877346.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_03_30T19_04_44.877346
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-30T19-04-44.877346.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-30T19-04-44.877346.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_03_30T19_04_44.877346
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-30T19-04-44.877346.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-30T19-04-44.877346.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_03_30T19_04_44.877346
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-30T19-04-44.877346.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-30T19-04-44.877346.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_03_30T19_04_44.877346
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-30T19-04-44.877346.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-30T19-04-44.877346.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_03_30T19_04_44.877346
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-30T19-04-44.877346.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-30T19-04-44.877346.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_03_30T19_04_44.877346
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-30T19-04-44.877346.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-30T19-04-44.877346.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_03_30T19_04_44.877346
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-30T19-04-44.877346.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-30T19-04-44.877346.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_03_30T19_04_44.877346
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-30T19-04-44.877346.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-30T19-04-44.877346.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_03_30T19_04_44.877346
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-30T19-04-44.877346.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-30T19-04-44.877346.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_03_30T19_04_44.877346
path:
- '**/details_harness|winogrande|5_2024-03-30T19-04-44.877346.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-03-30T19-04-44.877346.parquet'
- config_name: results
data_files:
- split: 2024_03_30T19_04_44.877346
path:
- results_2024-03-30T19-04-44.877346.parquet
- split: latest
path:
- results_2024-03-30T19-04-44.877346.parquet
---
# Dataset Card for Evaluation run of krevas/SOLAR-10.7B
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [krevas/SOLAR-10.7B](https://huggingface.co/krevas/SOLAR-10.7B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_krevas__SOLAR-10.7B",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2024-03-30T19:04:44.877346](https://huggingface.co/datasets/open-llm-leaderboard/details_krevas__SOLAR-10.7B/blob/main/results_2024-03-30T19-04-44.877346.json) (note that there might be results for other tasks in the repository if successive evals didn't cover the same tasks; you can find each one in the "results" configuration and in the "latest" split of each eval):
```python
{
"all": {
"acc": 0.6256384530384593,
"acc_stderr": 0.03222174705595155,
"acc_norm": 0.6357523261247408,
"acc_norm_stderr": 0.033105283857181055,
"mc1": 0.6964504283965728,
"mc1_stderr": 0.016095884155386847,
"mc2": 0.8032546914700353,
"mc2_stderr": 0.013304123687885194
},
"harness|arc:challenge|25": {
"acc": 0.7141638225255973,
"acc_stderr": 0.01320319608853737,
"acc_norm": 0.7431740614334471,
"acc_norm_stderr": 0.012766923794116798
},
"harness|hellaswag|10": {
"acc": 0.7484564827723561,
"acc_stderr": 0.004330134219762844,
"acc_norm": 0.8904600677155945,
"acc_norm_stderr": 0.003116771577319422
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695235,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695235
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.5481481481481482,
"acc_stderr": 0.04299268905480864,
"acc_norm": 0.5481481481481482,
"acc_norm_stderr": 0.04299268905480864
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.7828947368421053,
"acc_stderr": 0.033550453048829226,
"acc_norm": 0.7828947368421053,
"acc_norm_stderr": 0.033550453048829226
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.71,
"acc_stderr": 0.04560480215720683,
"acc_norm": 0.71,
"acc_norm_stderr": 0.04560480215720683
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6226415094339622,
"acc_stderr": 0.029832808114796,
"acc_norm": 0.6226415094339622,
"acc_norm_stderr": 0.029832808114796
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7430555555555556,
"acc_stderr": 0.03653946969442099,
"acc_norm": 0.7430555555555556,
"acc_norm_stderr": 0.03653946969442099
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.41,
"acc_stderr": 0.049431107042371025,
"acc_norm": 0.41,
"acc_norm_stderr": 0.049431107042371025
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.49,
"acc_stderr": 0.05024183937956912,
"acc_norm": 0.49,
"acc_norm_stderr": 0.05024183937956912
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.32,
"acc_stderr": 0.04688261722621504,
"acc_norm": 0.32,
"acc_norm_stderr": 0.04688261722621504
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.5953757225433526,
"acc_stderr": 0.03742461193887248,
"acc_norm": 0.5953757225433526,
"acc_norm_stderr": 0.03742461193887248
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.38235294117647056,
"acc_stderr": 0.04835503696107224,
"acc_norm": 0.38235294117647056,
"acc_norm_stderr": 0.04835503696107224
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.72,
"acc_stderr": 0.04512608598542128,
"acc_norm": 0.72,
"acc_norm_stderr": 0.04512608598542128
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5617021276595745,
"acc_stderr": 0.03243618636108101,
"acc_norm": 0.5617021276595745,
"acc_norm_stderr": 0.03243618636108101
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.5087719298245614,
"acc_stderr": 0.04702880432049615,
"acc_norm": 0.5087719298245614,
"acc_norm_stderr": 0.04702880432049615
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5379310344827586,
"acc_stderr": 0.04154659671707548,
"acc_norm": 0.5379310344827586,
"acc_norm_stderr": 0.04154659671707548
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.41798941798941797,
"acc_stderr": 0.025402555503260912,
"acc_norm": 0.41798941798941797,
"acc_norm_stderr": 0.025402555503260912
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.42063492063492064,
"acc_stderr": 0.04415438226743744,
"acc_norm": 0.42063492063492064,
"acc_norm_stderr": 0.04415438226743744
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.29,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.29,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7774193548387097,
"acc_stderr": 0.023664216671642525,
"acc_norm": 0.7774193548387097,
"acc_norm_stderr": 0.023664216671642525
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.47783251231527096,
"acc_stderr": 0.035145285621750094,
"acc_norm": 0.47783251231527096,
"acc_norm_stderr": 0.035145285621750094
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.66,
"acc_stderr": 0.04760952285695237,
"acc_norm": 0.66,
"acc_norm_stderr": 0.04760952285695237
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.8,
"acc_stderr": 0.031234752377721175,
"acc_norm": 0.8,
"acc_norm_stderr": 0.031234752377721175
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.8585858585858586,
"acc_stderr": 0.02482590979334334,
"acc_norm": 0.8585858585858586,
"acc_norm_stderr": 0.02482590979334334
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8704663212435233,
"acc_stderr": 0.02423353229775873,
"acc_norm": 0.8704663212435233,
"acc_norm_stderr": 0.02423353229775873
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6461538461538462,
"acc_stderr": 0.02424378399406217,
"acc_norm": 0.6461538461538462,
"acc_norm_stderr": 0.02424378399406217
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.362962962962963,
"acc_stderr": 0.02931820364520686,
"acc_norm": 0.362962962962963,
"acc_norm_stderr": 0.02931820364520686
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.680672268907563,
"acc_stderr": 0.0302839955258844,
"acc_norm": 0.680672268907563,
"acc_norm_stderr": 0.0302839955258844
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.33774834437086093,
"acc_stderr": 0.038615575462551704,
"acc_norm": 0.33774834437086093,
"acc_norm_stderr": 0.038615575462551704
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8165137614678899,
"acc_stderr": 0.01659525971039932,
"acc_norm": 0.8165137614678899,
"acc_norm_stderr": 0.01659525971039932
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5555555555555556,
"acc_stderr": 0.033888571185023246,
"acc_norm": 0.5555555555555556,
"acc_norm_stderr": 0.033888571185023246
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8627450980392157,
"acc_stderr": 0.024152225962801584,
"acc_norm": 0.8627450980392157,
"acc_norm_stderr": 0.024152225962801584
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.8354430379746836,
"acc_stderr": 0.024135736240566932,
"acc_norm": 0.8354430379746836,
"acc_norm_stderr": 0.024135736240566932
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6905829596412556,
"acc_stderr": 0.031024411740572206,
"acc_norm": 0.6905829596412556,
"acc_norm_stderr": 0.031024411740572206
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.6183206106870229,
"acc_stderr": 0.04260735157644561,
"acc_norm": 0.6183206106870229,
"acc_norm_stderr": 0.04260735157644561
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7107438016528925,
"acc_stderr": 0.04139112727635463,
"acc_norm": 0.7107438016528925,
"acc_norm_stderr": 0.04139112727635463
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7962962962962963,
"acc_stderr": 0.03893542518824847,
"acc_norm": 0.7962962962962963,
"acc_norm_stderr": 0.03893542518824847
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7607361963190185,
"acc_stderr": 0.033519538795212696,
"acc_norm": 0.7607361963190185,
"acc_norm_stderr": 0.033519538795212696
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.4017857142857143,
"acc_stderr": 0.04653333146973646,
"acc_norm": 0.4017857142857143,
"acc_norm_stderr": 0.04653333146973646
},
"harness|hendrycksTest-management|5": {
"acc": 0.7475728155339806,
"acc_stderr": 0.04301250399690878,
"acc_norm": 0.7475728155339806,
"acc_norm_stderr": 0.04301250399690878
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8418803418803419,
"acc_stderr": 0.023902325549560392,
"acc_norm": 0.8418803418803419,
"acc_norm_stderr": 0.023902325549560392
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.66,
"acc_stderr": 0.04760952285695237,
"acc_norm": 0.66,
"acc_norm_stderr": 0.04760952285695237
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8071519795657727,
"acc_stderr": 0.014108533515757431,
"acc_norm": 0.8071519795657727,
"acc_norm_stderr": 0.014108533515757431
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.684971098265896,
"acc_stderr": 0.0250093137900697,
"acc_norm": 0.684971098265896,
"acc_norm_stderr": 0.0250093137900697
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.3865921787709497,
"acc_stderr": 0.016286674879101026,
"acc_norm": 0.3865921787709497,
"acc_norm_stderr": 0.016286674879101026
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7156862745098039,
"acc_stderr": 0.02582916327275748,
"acc_norm": 0.7156862745098039,
"acc_norm_stderr": 0.02582916327275748
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.662379421221865,
"acc_stderr": 0.026858825879488533,
"acc_norm": 0.662379421221865,
"acc_norm_stderr": 0.026858825879488533
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7376543209876543,
"acc_stderr": 0.02447722285613513,
"acc_norm": 0.7376543209876543,
"acc_norm_stderr": 0.02447722285613513
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.4858156028368794,
"acc_stderr": 0.02981549448368206,
"acc_norm": 0.4858156028368794,
"acc_norm_stderr": 0.02981549448368206
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.5071707953063885,
"acc_stderr": 0.012768922739553308,
"acc_norm": 0.5071707953063885,
"acc_norm_stderr": 0.012768922739553308
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6470588235294118,
"acc_stderr": 0.029029422815681404,
"acc_norm": 0.6470588235294118,
"acc_norm_stderr": 0.029029422815681404
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6666666666666666,
"acc_stderr": 0.0190709855896875,
"acc_norm": 0.6666666666666666,
"acc_norm_stderr": 0.0190709855896875
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6818181818181818,
"acc_stderr": 0.04461272175910508,
"acc_norm": 0.6818181818181818,
"acc_norm_stderr": 0.04461272175910508
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.746938775510204,
"acc_stderr": 0.02783302387139968,
"acc_norm": 0.746938775510204,
"acc_norm_stderr": 0.02783302387139968
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8009950248756219,
"acc_stderr": 0.028231365092758406,
"acc_norm": 0.8009950248756219,
"acc_norm_stderr": 0.028231365092758406
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.83,
"acc_stderr": 0.0377525168068637,
"acc_norm": 0.83,
"acc_norm_stderr": 0.0377525168068637
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5421686746987951,
"acc_stderr": 0.0387862677100236,
"acc_norm": 0.5421686746987951,
"acc_norm_stderr": 0.0387862677100236
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8011695906432749,
"acc_stderr": 0.030611116557432528,
"acc_norm": 0.8011695906432749,
"acc_norm_stderr": 0.030611116557432528
},
"harness|truthfulqa:mc|0": {
"mc1": 0.6964504283965728,
"mc1_stderr": 0.016095884155386847,
"mc2": 0.8032546914700353,
"mc2_stderr": 0.013304123687885194
},
"harness|winogrande|5": {
"acc": 0.8255722178374112,
"acc_stderr": 0.010665187902498431
},
"harness|gsm8k|5": {
"acc": 0.0,
"acc_stderr": 0.0
}
}
```
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
SINAI/RefutES | ---
license: cc-by-nc-sa-4.0
language:
- es
tags:
- counter-narrative
- counterspeech
pretty_name: RefutES
---
### Dataset Description
**Paper**: Coming soon
**Point of Contact**: mevallec@ujaen.es
A new dataset has been created for the RefutES shared task at IberLEF 2024. RefutES consists of generating counter-narrative messages to combat hate speech. We are going to release the CONAN-MT-SP corpus, which consists of HS-CN pairs covering 8 different hate targets (disabled, Jews, LGBT+, migrants, Muslims, people of colour, women and other groups).
To build CONAN-MT-SP, we use the hate-speech messages of the English MultiTarget CONAN (CONAN-MT) corpus (Fanton et al. 2021), which collected its HS-CN pairs by niche-sourcing from two different NGOs and subsequently used these pairs to generate more HS-CN pairs with GPT-4, with human review integrated into the process. Because the hate-speech messages in CONAN-MT are in English, we translated them into Spanish using the DeepL API. All translations were reviewed by our annotators, and the pairs with erroneous translations were edited. The counter-narrative (CN) associated with each hate-speech message (HS) is generated by GPT-4 using a prompting strategy. The strategy used is a few-shot learning setup, where the model is prompted with a task description and 8 examples of HS-CN pairs (one for each target). In addition, the counter-narratives generated by GPT-4 have been evaluated by human experts using different metrics:
- Offensiveness:
- 0 (not sure)
- 1 (not offensive)
- 2 (maybe offensive)
- 3 (completely offensive)
- Stance:
- 0 (irrelevant)
- 1 (strongly agree)
- 2 (slightly agree/disagree)
- 3 (strongly disagree)
- Informativeness:
- 0 (irrelevant)
- 1 (not informative)
- 2 (generic and uninformative statement)
- 3 (specific and informative)
- Truthfulness:
- 0 (not sure)
- 1 (not true)
- 2 (partially true)
- 3 (completely true)
- Editing required:
- 0 (no editing)
- 1 (yes editing)
- Comparison between H-M:
- 0 (both CN are equally valid)
- 1 (human generates a better CN)
- 2 (machine generates a better CN)
- 3 (neither CN is good)
In RefutES, we selected from this corpus the "perfect" counter-narratives, i.e., those that are non-offensive, in complete disagreement, specific and informative, compellingly truthful, do not need editing, and are equal to or better than the initial CONAN-MT counter-narrative. The corpus is divided into three subsets, each related to a different part of the competition:
- **Train split:** contains 2496 HS-CN pairs.
- **Dev split:** contains 279 HS-CN pairs.
- **Test split:** contains 156 pairs HS-CN. 78 HS-CN pairs are generated by GPT-4 and manually annotated by humans and the others 78 HS-CN pairs generated by humans.
The RefutES corpus is composed of the following features, which are the columns in the provided CSVs:
- **id:** contains a string that represents the identifier of the HS-CN pair.
- **Hate-speech:** contains the hate-speech message.
- **Reference-counternarrative:** contains the counter-narrative associated with the hate-speech message, generated by GPT-4.
- **Target:** contains the collective affected by the hate message. It can be disabled, Jews, LGBT+, migrants, Muslims, people of colour, women and other groups.
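Assuming the splits are distributed as CSV files with the columns above, a minimal loading sketch using only the standard library (the file name and the sample values are hypothetical):

```python
from collections import Counter
import csv
import io

# A tiny in-memory CSV mimicking the column layout described above
# (values are illustrative placeholders, not taken from the corpus).
raw = """id,Hate-speech,Reference-counternarrative,Target
hs-cn-001,<hate speech>,<counter-narrative>,women
hs-cn-002,<hate speech>,<counter-narrative>,migrants
hs-cn-003,<hate speech>,<counter-narrative>,women
"""

# For the real corpus, replace io.StringIO(raw) with open("train.csv")
# (the file name is hypothetical; adjust to the actual paths in the repo).
rows = list(csv.DictReader(io.StringIO(raw)))

# Count HS-CN pairs per targeted collective.
per_target = Counter(row["Target"] for row in rows)
print(per_target)  # Counter({'women': 2, 'migrants': 1})
```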
### Licensing Information
RefutES is released under the [Apache-2.0 License](http://www.apache.org/licenses/LICENSE-2.0).
### Citation Information
```bibtex
``` |
liuyanchen1015/MULTI_VALUE_stsb_reflex_number | ---
dataset_info:
features:
- name: sentence1
dtype: string
- name: sentence2
dtype: string
- name: score
dtype: float64
- name: idx
dtype: int64
- name: value_score
dtype: int64
splits:
- name: dev
num_bytes: 1988
num_examples: 10
- name: test
num_bytes: 164
num_examples: 1
- name: train
num_bytes: 1732
num_examples: 8
download_size: 11727
dataset_size: 3884
---
# Dataset Card for "MULTI_VALUE_stsb_reflex_number"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
chargoddard/summarize_from_feedback_alpaca | ---
dataset_info:
features:
- name: instruction
dtype: string
- name: input
dtype: string
- name: output
dtype: string
splits:
- name: train
num_bytes: 138986664
num_examples: 92858
download_size: 16466576
dataset_size: 138986664
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "summarize_from_feedback_alpaca"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
result-muse256-muse512-wuerst-sdv15/11cb7618 | ---
dataset_info:
features:
- name: result
dtype: string
- name: id
dtype: int64
splits:
- name: train
num_bytes: 219
num_examples: 10
download_size: 1429
dataset_size: 219
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "11cb7618"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
yuanmei424/xxt_ds | ---
dataset_info:
features:
- name: edit_prompt
dtype: string
- name: input_image
dtype: image
- name: edited_image
dtype: image
splits:
- name: train
num_bytes: 5219118955.25
num_examples: 2283951
download_size: 0
dataset_size: 5219118955.25
---
# Dataset Card for "xxt_ds"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
giux78/100k-sft-ready-ultrafeedback-ita | ---
dataset_info:
features:
- name: prompt
dtype: string
- name: prompt_id
dtype: string
- name: messages
list:
- name: content
dtype: string
- name: role
dtype: string
splits:
- name: train_sft
num_bytes: 736148767
num_examples: 100000
- name: test_sft
num_bytes: 73258856
num_examples: 10000
- name: train_gen
num_bytes: 1347396812
num_examples: 256032
- name: test_gen
num_bytes: 148276089
num_examples: 28304
download_size: 1238466176
dataset_size: 2305080524
configs:
- config_name: default
data_files:
- split: train_sft
path: data/train_sft-*
- split: test_sft
path: data/test_sft-*
- split: train_gen
path: data/train_gen-*
- split: test_gen
path: data/test_gen-*
---
|
steviebarot/ph_er_dataset_binary | ---
dataset_info:
features:
- name: audio
dtype: audio
- name: label
dtype:
class_label:
names:
'0': unwell
'1': well
splits:
- name: train
num_bytes: 12741962.0
num_examples: 14
- name: test
num_bytes: 11311022.0
num_examples: 12
download_size: 23923824
dataset_size: 24052984.0
---
# Dataset Card for "ph_er_dataset_binary"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
tyzhu/find_second_sent_train_400_eval_40_recite | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
dataset_info:
features:
- name: inputs
dtype: string
- name: targets
dtype: string
- name: title
dtype: string
- name: context
dtype: string
splits:
- name: train
num_bytes: 1369335
num_examples: 840
- name: validation
num_bytes: 71727
num_examples: 40
download_size: 536461
dataset_size: 1441062
---
# Dataset Card for "find_second_sent_train_400_eval_40_recite"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
cmagganas/three_shot_comparison | ---
dataset_info:
features:
- name: source
dtype: string
- name: target
dtype: string
- name: rationale
dtype: string
- name: task
dtype: string
- name: type
dtype: string
- name: decilm_generation
dtype: string
- name: mistral_generation
dtype: string
- name: mpt_generation
dtype: string
splits:
- name: train
num_bytes: 90718
num_examples: 30
download_size: 67115
dataset_size: 90718
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
JasiekKaczmarczyk/maestro-sustain-quantized | ---
dataset_info:
features:
- name: midi_filename
dtype: string
- name: pitch
sequence: int16
length: 128
- name: dstart
sequence: float32
length: 128
- name: duration
sequence: float32
length: 128
- name: velocity
sequence: int16
length: 128
- name: dstart_bin
sequence: int8
length: 128
- name: duration_bin
sequence: int8
length: 128
- name: velocity_bin
sequence: int8
length: 128
splits:
- name: train
num_bytes: 89689142
num_examples: 43727
- name: validation
num_bytes: 10114654
num_examples: 4929
- name: test
num_bytes: 11695068
num_examples: 5695
download_size: 0
dataset_size: 111498864
---
# Dataset Card for "maestro-sustain-quantized"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Capsekai/hogans-heroes | ---
license: creativeml-openrail-m
task_categories:
- text-classification
language:
- en
tags:
- art
pretty_name: Hogan's Heroes by Capsekai
size_categories:
- 1K<n<10K
---
# Dataset Card for Hogan's Heroes TV Caps
<!-- Provide a quick summary of the dataset. -->
This dataset consists of screencaps from Hogan's Heroes. It has been generated using [this raw template](https://github.com/huggingface/huggingface_hub/blob/main/src/huggingface_hub/templates/datasetcard_template.md?plain=1).
## Dataset Details
### Dataset Description
These are curated screencaps of episodes from the 1960s TV show Hogan's Heroes.
Hand-picked content from online sources, capped using VLC's scene filter.
- **Curated by:** [https://capsekai.tumblr.com/]
## Uses
Research around text classification and preservation of old media.
### Direct Use
Study of character basis, research around the artistic nature of the episode's set design.
### Out-of-Scope Use
Going against local laws and regulations; re-selling the dataset.
## Dataset Creation
### Curation Rationale
Preservation of old media.
### Source Data
Youtube & Dvd Sources
#### Data Collection and Processing
Collection: unfiltered DVD / YouTube caps.
#### Personal and Sensitive Information
There should be no personal information in this dataset.
## Bias, Risks, and Limitations
Bias/Risks: a warning that this is a show largely based around World War 2. Like Dad's Army, it could contain sensitive topics and images.
The jokes within the Recommendations section are just that: jokes. We feel this TV show and the caps within are fairly safe, but it is understandable if people have trigger issues with WW2 content.
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
Risks? Copyright, DMCA, blinding adoration towards Bob Crane? UWU KLINK?
Social bias: WW2 media.
## Dataset Card Authors
[https://capsekai.tumblr.com/]
## Dataset Card Contact
[https://capsekai.tumblr.com/] |
imvladikon/english_news_weak_ner | ---
language:
- en
size_categories:
- 1M<n<10M
task_categories:
- token-classification
dataset_info:
- config_name: articles
features:
- name: title
dtype: string
- name: author
dtype: string
- name: datetime
dtype: string
- name: url
dtype: string
- name: month
dtype: string
- name: day
dtype: string
- name: doc_id
dtype: string
- name: text
dtype: string
- name: year
dtype: string
- name: doc_title
dtype: string
splits:
- name: train
num_bytes: 1313871812
num_examples: 446809
download_size: 791316510
dataset_size: 1313871812
- config_name: entities
features:
- name: doc_id
dtype: string
- name: sent_num
dtype: int32
- name: sentence
dtype: string
- name: doc_title
dtype: string
- name: score
sequence: float32
- name: entity_type
sequence: string
- name: entity_text
sequence: string
- name: start_char
sequence: int32
- name: end_char
sequence: int32
- name: tokens
sequence: string
- name: raw_tags
sequence: string
- name: ner_tags
sequence:
class_label:
names:
'0': B-DATE
'1': I-DATE
'2': L-DATE
'3': U-DATE
'4': B-DUC
'5': I-DUC
'6': L-DUC
'7': U-DUC
'8': B-EVE
'9': I-EVE
'10': L-EVE
'11': U-EVE
'12': B-LOC
'13': I-LOC
'14': L-LOC
'15': U-LOC
'16': B-MISC
'17': I-MISC
'18': L-MISC
'19': U-MISC
'20': B-ORG
'21': I-ORG
'22': L-ORG
'23': U-ORG
'24': B-PER
'25': I-PER
'26': L-PER
'27': U-PER
'28': B-QTY
'29': I-QTY
'30': L-QTY
'31': U-QTY
'32': B-TTL
'33': I-TTL
'34': L-TTL
'35': U-TTL
'36': O
splits:
- name: train
num_bytes: 3665237140
num_examples: 3515149
download_size: 966462235
dataset_size: 3665237140
configs:
- config_name: articles
data_files:
- split: train
path: articles/train-*
- config_name: entities
data_files:
- split: train
path: entities/train-*
---
# Large Weak Labelled NER corpus
### Dataset Summary
The dataset was generated through weak labelling of a scraped and preprocessed news corpus (Bloomberg news), so it is intended for research purposes only.
For tokenization, articles were split into sentences using `nltk.PunktSentenceTokenizer` (so the sentence splits may occasionally be imperfect).
### Usage
```python
from datasets import load_dataset
articles_ds = load_dataset("imvladikon/english_news_weak_ner", "articles") # just articles with metadata
entities_ds = load_dataset("imvladikon/english_news_weak_ner", "entities")
```
#### NER tags
Tags description:
* O Outside of a named entity
* PER Person
* LOC Location
* ORG Organization
* MISC Miscellaneous
* DATE Date and time expression
* QTY Quantity
* EVE Event
* TTL Title
* DUC Commercial item
Tags:
```json
['B-DATE', 'I-DATE', 'L-DATE', 'U-DATE', 'B-DUC', 'I-DUC', 'L-DUC', 'U-DUC', 'B-EVE', 'I-EVE', 'L-EVE', 'U-EVE', 'B-LOC', 'I-LOC', 'L-LOC', 'U-LOC', 'B-MISC', 'I-MISC', 'L-MISC', 'U-MISC', 'B-ORG', 'I-ORG', 'L-ORG', 'U-ORG', 'B-PER', 'I-PER', 'L-PER', 'U-PER', 'B-QTY', 'I-QTY', 'L-QTY', 'U-QTY', 'B-TTL', 'I-TTL', 'L-TTL', 'U-TTL', 'O']
```
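For reference, the integer `ner_tags` ids index into this list. Below is a minimal decoding sketch in plain Python (when the dataset is loaded, the same mapping should also be available through the `ner_tags` `ClassLabel` feature, e.g. via its `int2str` method); the example ids are illustrative:

```python
# Minimal sketch: decode integer ner_tags ids back to tag strings.
# The label order mirrors the class_label names in this dataset's config.
NER_LABELS = [
    "B-DATE", "I-DATE", "L-DATE", "U-DATE", "B-DUC", "I-DUC", "L-DUC", "U-DUC",
    "B-EVE", "I-EVE", "L-EVE", "U-EVE", "B-LOC", "I-LOC", "L-LOC", "U-LOC",
    "B-MISC", "I-MISC", "L-MISC", "U-MISC", "B-ORG", "I-ORG", "L-ORG", "U-ORG",
    "B-PER", "I-PER", "L-PER", "U-PER", "B-QTY", "I-QTY", "L-QTY", "U-QTY",
    "B-TTL", "I-TTL", "L-TTL", "U-TTL", "O",
]
id2label = dict(enumerate(NER_LABELS))

# Example ids, as they appear in rows of the "entities" config
ner_tags = [23, 36, 36, 23, 36]
decoded = [id2label[t] for t in ner_tags]
print(decoded)  # ['U-ORG', 'O', 'O', 'U-ORG', 'O']
```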
Tags statistics:
```json
{
"O": 281586813,
"B-QTY": 2675754,
"L-QTY": 2675754,
"I-QTY": 2076724,
"U-ORG": 1459628,
"I-ORG": 1407875,
"B-ORG": 1318711,
"L-ORG": 1318711,
"B-PER": 1254037,
"L-PER": 1254037,
"U-MISC": 1195204,
"U-LOC": 1084052,
"U-DATE": 1010118,
"B-DATE": 919815,
"L-DATE": 919815,
"I-DATE": 650064,
"U-PER": 607212,
"U-QTY": 559523,
"B-LOC": 425431,
"L-LOC": 425431,
"I-PER": 262887,
"I-LOC": 201532,
"I-MISC": 190576,
"B-MISC": 162978,
"L-MISC": 162978,
"I-TTL": 64641,
"B-TTL": 53330,
"L-TTL": 53330,
"B-EVE": 43329,
"L-EVE": 43329,
"U-TTL": 41568,
"I-EVE": 35316,
"U-DUC": 33457,
"U-EVE": 19103,
"I-DUC": 15622,
"B-DUC": 15580,
"L-DUC": 15580
}
```
#### Sample:

Articles:
```json
{'title': 'Watson Reports Positive Findings for Prostate Drug',
'author': 'RobertSimison',
'datetime': '2007-01-16T14:16:56Z',
'url': 'http://www.bloomberg.com/news/2007-01-16/watson-reports-positive-findings-for-prostate-drug-update1-.html',
'month': '1',
'day': '16',
'doc_id': 'a5c7c556bd112ac22874492c4cdb18eb46a30905',
'text': 'Watson Pharmaceuticals Inc. (WPI) , the\nlargest U.S. maker of generic drugs, reported positive results\nfor its experimental prostate treatment in two late-state trials. \n The drug, silodosin, was more effective than a placebo in\ntreating enlarged prostates, or benign prostatic hyperplasia, the\nCorona, California-based company said today in a statement on PR\nNewswire. The tests were in the final of three phases of trials\nnormally needed for regulatory approval. \n Non-cancerous enlarged prostate affects more than half of\nAmerican men in their 60s and as many as 90 percent of them by\nage 85, Watson said. Prescription drug sales to treat the\ndisorder total $1.7 billion a year, the company said. \n Watson plans to apply for U.S. approval to market the drug\nin the first half of 2008, after completion later this year of a\none-year safety trial, the company said. The two studies reported\ntoday showed that cardiovascular and blood-pressure side effects\nwere low, Watson said. \n To contact the reporter on this story:\nRobert Simison in Washington at \n rsimison@bloomberg.net . \n To contact the editor responsible for this story:\nRobert Simison at rsimison@bloomberg.net .',
'year': '2007',
'doc_title': 'watson-reports-positive-findings-for-prostate-drug-update1-'}
```
Entities:
```json
{'doc_id': '806fe637ed51e03d9ef7a8889fc84f63f8fc8569',
'sent_num': 9,
'sentence': 'Spain and Portugal together accounted for 45\npercent of group profit in 2010.',
'doc_title': 'bbva-may-post-lower-first-quarter-profit-hurt-by-spain-decline',
'spans': {'Score': [0.7858654856681824,
0.7856822609901428,
0.9990736246109009,
0.999079704284668],
'Type': ['ORGANIZATION', 'ORGANIZATION', 'QUANTITY', 'DATE'],
'Text': ['Spain', 'Portugal', '45\npercent', '2010'],
'BeginOffset': [0, 10, 42, 72],
'EndOffset': [5, 18, 52, 76]},
'tags': {'tokens': ['Spain',
'Spain',
'and',
'Portugal',
'Spain',
'and',
'Portugal',
'together',
'accounted',
'for',
'45',
'\n',
'percent',
'Spain',
'and',
'Portugal',
'together',
'accounted',
'for',
'45',
'\n',
'percent',
'of',
'group',
'profit',
'in',
'2010',
'.'],
'raw_tags': ['U-ORG',
'O',
'O',
'U-ORG',
'O',
'O',
'O',
'O',
'O',
'O',
'B-QTY',
'I-QTY',
'L-QTY',
'O',
'O',
'O',
'O',
'O',
'O',
'O',
'O',
'O',
'O',
'O',
'O',
'O',
'U-DATE',
'O'],
'ner_tags': [23,
36,
36,
23,
36,
36,
36,
36,
36,
36,
28,
29,
30,
36,
36,
36,
36,
36,
36,
36,
36,
36,
36,
36,
36,
36,
3,
36]}}
```
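The character offsets are end-exclusive, so entity surface forms can be recovered by slicing the raw sentence. A minimal sketch using the values from the sample row above (the schema stores these as `entity_text` / `start_char` / `end_char`):

```python
# Sketch: recover entity surface forms from end-exclusive character offsets.
row = {
    "sentence": "Spain and Portugal together accounted for 45\npercent of group profit in 2010.",
    "entity_text": ["Spain", "Portugal", "45\npercent", "2010"],
    "start_char": [0, 10, 42, 72],
    "end_char": [5, 18, 52, 76],
}

for text, start, end in zip(row["entity_text"], row["start_char"], row["end_char"]):
    span = row["sentence"][start:end]
    assert span == text  # offsets point into the raw (newline-containing) sentence
    print(start, end, repr(span))
```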
### Data splits
| name |train|
|---------|----:|
|entities|3515149|
|articles|446809|
### Citation Information
```
@misc{imvladikon2023bb_news_weak_ner,
author = {Gurevich, Vladimir},
title = {Weakly Labelled Large English NER corpus},
year = {2022},
  howpublished = {\url{https://huggingface.co/datasets/imvladikon/english_news_weak_ner}},
}
```
|
mahdibaghbanzadeh/BERTax_non_similar_dataset_phylum | ---
dataset_info:
features:
- name: sequence
dtype: string
- name: phylum
dtype:
class_label:
names:
'0': Actinomycetota
'1': Apicomplexa
'2': Arthropoda
'3': Artverviricota
'4': Ascomycota
'5': Bacillariophyta
'6': Bacillota
'7': Bacteroidota
'8': Basidiomycota
'9': Bdellovibrionota
'10': Campylobacterota
'11': Candidatus Thermoplasmatota
'12': Chloroflexota
'13': Chordata
'14': Cyanobacteriota
'15': Deinococcota
'16': Euryarchaeota
'17': Kitrinoviricota
'18': Mollusca
'19': Mycoplasmatota
'20': Myxococcota
'21': Negarnaviricota
'22': Nitrososphaerota
'23': Peploviricota
'24': Pisuviricota
'25': Planctomycetota
'26': Pseudomonadota
'27': Rhodothermota
'28': Spirochaetota
'29': Streptophyta
'30': Thermodesulfobacteriota
'31': Thermodesulfobiota
'32': Thermoproteota
'33': Thermotogota
'34': Uroviricota
'35': Verrucomicrobiota
splits:
- name: train
num_bytes: 3386883024
num_examples: 2240002
- name: test
num_bytes: 80740800
num_examples: 53400
download_size: 1704006951
dataset_size: 3467623824
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
---
|
Alex123321/english_cefr_dataset | ---
license: apache-2.0
---
|
jtatman/headlines | ---
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 80263469
num_examples: 1662297
download_size: 62717748
dataset_size: 80263469
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "headlines"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
jmukesh99/AIBE-testing | ---
license: apache-2.0
---
|
saibo/bookcorpus_compact_512_test | ---
dataset_info:
features:
- name: text
dtype: string
- name: concept_with_offset
dtype: string
splits:
- name: train
num_bytes: 39735149
num_examples: 6160
download_size: 20545672
dataset_size: 39735149
---
# Dataset Card for "bookcorpus_compact_512_test"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Sheza/Human-Values | ---
task_categories:
- text-classification
pipeline_tag: text-classification
widget:
- text: "we are exploiting the youth purely for entertainment."
- text: "human cloning could aid medical advances and should therefore be allowed."
- text: "people need to grow up and realise the world is a hard place"
language:
- en
---
### Labels
|label|meaning|
|:---|:-----------|
|achievement_P | in favor of achievement |
|achievement_N | against achievement |
|power_dominance_P | in favor of power: dominance |
|power_dominance_N | against power: dominance |
|power_resources_P | in favor of power: resources |
|power_resources_N | against power: resources | |
roclive/github-issues | ---
dataset_info:
features:
- name: url
dtype: string
- name: repository_url
dtype: string
- name: labels_url
dtype: string
- name: comments_url
dtype: string
- name: events_url
dtype: string
- name: html_url
dtype: string
- name: id
dtype: int64
- name: node_id
dtype: string
- name: number
dtype: int64
- name: title
dtype: string
- name: user
struct:
- name: login
dtype: string
- name: id
dtype: int64
- name: node_id
dtype: string
- name: avatar_url
dtype: string
- name: gravatar_id
dtype: string
- name: url
dtype: string
- name: html_url
dtype: string
- name: followers_url
dtype: string
- name: following_url
dtype: string
- name: gists_url
dtype: string
- name: starred_url
dtype: string
- name: subscriptions_url
dtype: string
- name: organizations_url
dtype: string
- name: repos_url
dtype: string
- name: events_url
dtype: string
- name: received_events_url
dtype: string
- name: type
dtype: string
- name: site_admin
dtype: bool
- name: labels
list:
- name: id
dtype: int64
- name: node_id
dtype: string
- name: url
dtype: string
- name: name
dtype: string
- name: color
dtype: string
- name: default
dtype: bool
- name: description
dtype: string
- name: state
dtype: string
- name: locked
dtype: bool
- name: assignee
struct:
- name: login
dtype: string
- name: id
dtype: int64
- name: node_id
dtype: string
- name: avatar_url
dtype: string
- name: gravatar_id
dtype: string
- name: url
dtype: string
- name: html_url
dtype: string
- name: followers_url
dtype: string
- name: following_url
dtype: string
- name: gists_url
dtype: string
- name: starred_url
dtype: string
- name: subscriptions_url
dtype: string
- name: organizations_url
dtype: string
- name: repos_url
dtype: string
- name: events_url
dtype: string
- name: received_events_url
dtype: string
- name: type
dtype: string
- name: site_admin
dtype: bool
- name: assignees
list:
- name: login
dtype: string
- name: id
dtype: int64
- name: node_id
dtype: string
- name: avatar_url
dtype: string
- name: gravatar_id
dtype: string
- name: url
dtype: string
- name: html_url
dtype: string
- name: followers_url
dtype: string
- name: following_url
dtype: string
- name: gists_url
dtype: string
- name: starred_url
dtype: string
- name: subscriptions_url
dtype: string
- name: organizations_url
dtype: string
- name: repos_url
dtype: string
- name: events_url
dtype: string
- name: received_events_url
dtype: string
- name: type
dtype: string
- name: site_admin
dtype: bool
- name: milestone
struct:
- name: url
dtype: string
- name: html_url
dtype: string
- name: labels_url
dtype: string
- name: id
dtype: int64
- name: node_id
dtype: string
- name: number
dtype: int64
- name: title
dtype: string
- name: description
dtype: string
- name: creator
struct:
- name: login
dtype: string
- name: id
dtype: int64
- name: node_id
dtype: string
- name: avatar_url
dtype: string
- name: gravatar_id
dtype: string
- name: url
dtype: string
- name: html_url
dtype: string
- name: followers_url
dtype: string
- name: following_url
dtype: string
- name: gists_url
dtype: string
- name: starred_url
dtype: string
- name: subscriptions_url
dtype: string
- name: organizations_url
dtype: string
- name: repos_url
dtype: string
- name: events_url
dtype: string
- name: received_events_url
dtype: string
- name: type
dtype: string
- name: site_admin
dtype: bool
- name: open_issues
dtype: int64
- name: closed_issues
dtype: int64
- name: state
dtype: string
- name: created_at
dtype: timestamp[s]
- name: updated_at
dtype: timestamp[s]
- name: due_on
dtype: 'null'
- name: closed_at
dtype: 'null'
- name: comments
dtype: int64
- name: created_at
dtype: timestamp[s]
- name: updated_at
dtype: timestamp[s]
- name: closed_at
dtype: timestamp[s]
- name: author_association
dtype: string
- name: active_lock_reason
dtype: 'null'
- name: body
dtype: string
- name: reactions
struct:
- name: url
dtype: string
- name: total_count
dtype: int64
- name: '+1'
dtype: int64
- name: '-1'
dtype: int64
- name: laugh
dtype: int64
- name: hooray
dtype: int64
- name: confused
dtype: int64
- name: heart
dtype: int64
- name: rocket
dtype: int64
- name: eyes
dtype: int64
- name: timeline_url
dtype: string
- name: performed_via_github_app
dtype: 'null'
- name: state_reason
dtype: string
- name: draft
dtype: bool
- name: pull_request
struct:
- name: url
dtype: string
- name: html_url
dtype: string
- name: diff_url
dtype: string
- name: patch_url
dtype: string
- name: merged_at
dtype: timestamp[s]
splits:
- name: train
num_bytes: 290441
num_examples: 100
download_size: 170269
dataset_size: 290441
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "github-issues"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
ibranze/araproje_hellaswag_en_w3 | ---
dataset_info:
features:
- name: ind
dtype: int32
- name: activity_label
dtype: string
- name: ctx_a
dtype: string
- name: ctx_b
dtype: string
- name: ctx
dtype: string
- name: endings
sequence: string
- name: source_id
dtype: string
- name: split
dtype: string
- name: split_type
dtype: string
- name: label
dtype: string
splits:
- name: validation
num_bytes: 149508.65384615384
num_examples: 250
download_size: 82715
dataset_size: 149508.65384615384
configs:
- config_name: default
data_files:
- split: validation
path: data/validation-*
---
# Dataset Card for "araproje_hellaswag_en_w3"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
thiomajid/java_methods_renamed | ---
dataset_info:
features:
- name: commit_sha
dtype: string
- name: new_methods
list:
- name: arguments
sequence: string
- name: filename
dtype: string
- name: implementation
dtype: string
- name: signature
dtype: string
- name: old_methods
list:
- name: arguments
sequence: string
- name: filename
dtype: string
- name: implementation
dtype: string
- name: signature
dtype: string
splits:
- name: train
num_bytes: 794271
num_examples: 74
download_size: 271079
dataset_size: 794271
---
# Dataset Card for "java_renaming_patch"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
hemakumari/g_name | ---
dataset_info:
features:
- name: text
dtype: string
- name: label
dtype:
class_label:
names:
'0': Male
'1': Female
splits:
- name: train
num_bytes: 973672.3668630284
num_examples: 48583
- name: test
num_bytes: 108203.63313697158
num_examples: 5399
download_size: 570160
dataset_size: 1081876.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
---
|
result-muse256-muse512-wuerst-sdv15/18cadc88 | ---
dataset_info:
features:
- name: result
dtype: string
- name: id
dtype: int64
splits:
- name: train
num_bytes: 191
num_examples: 10
download_size: 1352
dataset_size: 191
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "18cadc88"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Dstycoon/distilabel-medical-instructions | ---
dataset_info:
features:
- name: instructions
dtype: string
splits:
- name: train
num_bytes: 13239
num_examples: 160
download_size: 6115
dataset_size: 13239
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
fathyshalab/massive_email-de | ---
dataset_info:
features:
- name: id
dtype: string
- name: label
dtype: int64
- name: text
dtype: string
splits:
- name: train
num_bytes: 61477
num_examples: 953
- name: validation
num_bytes: 10136
num_examples: 157
- name: test
num_bytes: 17478
num_examples: 271
download_size: 46681
dataset_size: 89091
---
# Dataset Card for "massive_email-de"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
ShapeNet/shapenetcore-glb | ---
language:
- en
pretty_name: ShapeNetCore
tags:
- 3D shapes
license: other
extra_gated_heading: Acknowledge license to accept the repository
extra_gated_prompt: >-
To request access to this ShapeNet repo, you will need to provide your **full name** (please provide both your first and last name), the name of your **advisor or the principal investigator (PI)** of your lab (in the PI/Advisor) fields, and the **school or company** that you are affiliated with (the **Affiliation** field).
After requesting access to this ShapeNet repo, you will be considered for access approval.
After access approval, you (the "Researcher") receive permission to use the ShapeNet database (the "Database") at Princeton University and Stanford University. In exchange for being able to join the ShapeNet community and receive such permission, Researcher hereby agrees to the following terms and conditions:
Researcher shall use the Database only for non-commercial research and educational purposes.
Princeton University and Stanford University make no representations or warranties regarding the Database, including but not limited to warranties of non-infringement or fitness for a particular purpose.
Researcher accepts full responsibility for his or her use of the Database and shall defend and indemnify Princeton University and Stanford University, including their employees, Trustees, officers and agents, against any and all claims arising from Researcher's use of the Database, including but not limited to Researcher's use of any copies of copyrighted 3D models that he or she may create from the Database.
Researcher may provide research associates and colleagues with access to the Database provided that they first agree to be bound by these terms and conditions.
Princeton University and Stanford University reserve the right to terminate Researcher's access to the Database at any time.
If Researcher is employed by a for-profit, commercial entity, Researcher's employer shall also be bound by these terms and conditions, and Researcher hereby represents that he or she is fully authorized to enter into this agreement on behalf of such employer.
The law of the State of New Jersey shall apply to all disputes under this agreement.
For access to the data, please fill in your **full name** (both first and last name), the name of your **advisor or principal investigator (PI)**, and the name of the **school or company** you are affiliated with.
Please actually fill out the fields (DO NOT put the word "Advisor" for PI/Advisor and the word "School" for "Affiliation", please specify the name of your advisor and the name of your school).
extra_gated_fields:
Name: text
PI/Advisor: text
Affiliation: text
Purpose: text
Country: text
I agree to use this dataset for non-commercial use ONLY: checkbox
---
This repository contains ShapeNetCore (v2) in [GLB](https://en.wikipedia.org/wiki/GlTF#GLB) format, a subset of [ShapeNet](https://shapenet.org).
ShapeNetCore is a densely annotated subset of ShapeNet covering 55 common object categories with ~51,300 unique 3D models. Each model in ShapeNetCore is linked to an appropriate synset in [WordNet 3.0](https://wordnet.princeton.edu/).
If you use ShapeNet data, you agree to abide by the [ShapeNet terms of use](https://shapenet.org/terms). You are only allowed to redistribute the data to your research associates and colleagues provided that they first agree to be bound by these terms and conditions.
If you use this data, please cite the main ShapeNet technical report.
```
@techreport{shapenet2015,
title = {{ShapeNet: An Information-Rich 3D Model Repository}},
author = {Chang, Angel X. and Funkhouser, Thomas and Guibas, Leonidas and Hanrahan, Pat and Huang, Qixing and Li, Zimo and Savarese, Silvio and Savva, Manolis and Song, Shuran and Su, Hao and Xiao, Jianxiong and Yi, Li and Yu, Fisher},
number = {arXiv:1512.03012 [cs.GR]},
institution = {Stanford University --- Princeton University --- Toyota Technological Institute at Chicago},
year = {2015}
}
```
For more information, please contact us at shapenetwebmaster@gmail.com and indicate ShapeNetCore v2 in the title of your email.
|
Chuckbets47/CarmE | ---
license: afl-3.0
---
|
owanr/r1_coedit | ---
dataset_info:
features:
- name: src
dtype: string
- name: tgt
dtype: string
- name: tag
dtype: string
splits:
- name: train
num_bytes: 20559166.0
num_examples: 71614
- name: val
num_bytes: 2376188.0
num_examples: 8950
- name: test
num_bytes: 2360064.0
num_examples: 8960
download_size: 10222246
dataset_size: 25295418.0
---
# Dataset Card for "r1_coedit"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
alarcon7a/somos-clean-alpaca-es-validations | ---
dataset_info:
features:
- name: text
dtype: 'null'
- name: inputs
struct:
- name: 1-instruction
dtype: string
- name: 2-input
dtype: string
- name: 3-output
dtype: string
- name: prediction
dtype: 'null'
- name: prediction_agent
dtype: 'null'
- name: annotation
dtype: string
- name: annotation_agent
dtype: string
- name: vectors
struct:
- name: input
sequence: float64
- name: instruction
sequence: float64
- name: output
sequence: float64
- name: multi_label
dtype: bool
- name: explanation
dtype: 'null'
- name: id
dtype: string
- name: metadata
dtype: 'null'
- name: status
dtype: string
- name: event_timestamp
dtype: timestamp[us]
- name: metrics
struct:
- name: text_length
dtype: int64
splits:
- name: train
num_bytes: 739721
num_examples: 39
download_size: 0
dataset_size: 739721
---
# Dataset Card for "somos-clean-alpaca-es-validations"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
yyu/wiki_corpus | ---
license: mit
---
|
Revankumar/News_room | ---
license: mit
---
|
PhilSad/celeba-hq-1.5k | ---
dataset_info:
features:
- name: image
dtype: image
- name: label
dtype:
class_label:
names:
'0': female
'1': male
splits:
- name: train
num_bytes: 146276286.0
num_examples: 1500
download_size: 146277189
dataset_size: 146276286.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "celeba-hq-1.5k"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
SickBoy/prueba_dataset_layoutlm | ---
license: openrail
---
|
open-llm-leaderboard/details_KnutJaegersberg__Qwen-1_8B-Chat-llama | ---
pretty_name: Evaluation run of KnutJaegersberg/Qwen-1_8B-Chat-llama
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [KnutJaegersberg/Qwen-1_8B-Chat-llama](https://huggingface.co/KnutJaegersberg/Qwen-1_8B-Chat-llama)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_KnutJaegersberg__Qwen-1_8B-Chat-llama\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-01-20T02:47:07.832828](https://huggingface.co/datasets/open-llm-leaderboard/details_KnutJaegersberg__Qwen-1_8B-Chat-llama/blob/main/results_2024-01-20T02-47-07.832828.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.4417307007712396,\n\
\ \"acc_stderr\": 0.03457643291788475,\n \"acc_norm\": 0.4458531507999814,\n\
\ \"acc_norm_stderr\": 0.03533462860998811,\n \"mc1\": 0.2741738066095471,\n\
\ \"mc1_stderr\": 0.015616518497219371,\n \"mc2\": 0.436959909496514,\n\
\ \"mc2_stderr\": 0.01509621411098862\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.34215017064846415,\n \"acc_stderr\": 0.01386415215917728,\n\
\ \"acc_norm\": 0.36945392491467577,\n \"acc_norm_stderr\": 0.014104578366491899\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.42959569806811393,\n\
\ \"acc_stderr\": 0.004940067402031043,\n \"acc_norm\": 0.5434176458872735,\n\
\ \"acc_norm_stderr\": 0.004970933420231931\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.32,\n \"acc_stderr\": 0.046882617226215034,\n \
\ \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.046882617226215034\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.4074074074074074,\n\
\ \"acc_stderr\": 0.04244633238353229,\n \"acc_norm\": 0.4074074074074074,\n\
\ \"acc_norm_stderr\": 0.04244633238353229\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.506578947368421,\n \"acc_stderr\": 0.04068590050224971,\n\
\ \"acc_norm\": 0.506578947368421,\n \"acc_norm_stderr\": 0.04068590050224971\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.48,\n\
\ \"acc_stderr\": 0.050211673156867795,\n \"acc_norm\": 0.48,\n \
\ \"acc_norm_stderr\": 0.050211673156867795\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.45660377358490567,\n \"acc_stderr\": 0.030656748696739435,\n\
\ \"acc_norm\": 0.45660377358490567,\n \"acc_norm_stderr\": 0.030656748696739435\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.3680555555555556,\n\
\ \"acc_stderr\": 0.04032999053960719,\n \"acc_norm\": 0.3680555555555556,\n\
\ \"acc_norm_stderr\": 0.04032999053960719\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.28,\n \"acc_stderr\": 0.04512608598542127,\n \
\ \"acc_norm\": 0.28,\n \"acc_norm_stderr\": 0.04512608598542127\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.5,\n \"acc_stderr\": 0.050251890762960605,\n \"acc_norm\": 0.5,\n\
\ \"acc_norm_stderr\": 0.050251890762960605\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.38,\n \"acc_stderr\": 0.04878317312145632,\n \
\ \"acc_norm\": 0.38,\n \"acc_norm_stderr\": 0.04878317312145632\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.37572254335260113,\n\
\ \"acc_stderr\": 0.036928207672648664,\n \"acc_norm\": 0.37572254335260113,\n\
\ \"acc_norm_stderr\": 0.036928207672648664\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.20588235294117646,\n \"acc_stderr\": 0.04023382273617747,\n\
\ \"acc_norm\": 0.20588235294117646,\n \"acc_norm_stderr\": 0.04023382273617747\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.58,\n \"acc_stderr\": 0.049604496374885836,\n \"acc_norm\": 0.58,\n\
\ \"acc_norm_stderr\": 0.049604496374885836\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.39574468085106385,\n \"acc_stderr\": 0.03196758697835363,\n\
\ \"acc_norm\": 0.39574468085106385,\n \"acc_norm_stderr\": 0.03196758697835363\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.2894736842105263,\n\
\ \"acc_stderr\": 0.04266339443159394,\n \"acc_norm\": 0.2894736842105263,\n\
\ \"acc_norm_stderr\": 0.04266339443159394\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.4413793103448276,\n \"acc_stderr\": 0.04137931034482758,\n\
\ \"acc_norm\": 0.4413793103448276,\n \"acc_norm_stderr\": 0.04137931034482758\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.2962962962962963,\n \"acc_stderr\": 0.023517294335963286,\n \"\
acc_norm\": 0.2962962962962963,\n \"acc_norm_stderr\": 0.023517294335963286\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.2619047619047619,\n\
\ \"acc_stderr\": 0.039325376803928704,\n \"acc_norm\": 0.2619047619047619,\n\
\ \"acc_norm_stderr\": 0.039325376803928704\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695236,\n \
\ \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695236\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.49032258064516127,\n\
\ \"acc_stderr\": 0.028438677998909558,\n \"acc_norm\": 0.49032258064516127,\n\
\ \"acc_norm_stderr\": 0.028438677998909558\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.3694581280788177,\n \"acc_stderr\": 0.03395970381998574,\n\
\ \"acc_norm\": 0.3694581280788177,\n \"acc_norm_stderr\": 0.03395970381998574\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.46,\n \"acc_stderr\": 0.05009082659620333,\n \"acc_norm\"\
: 0.46,\n \"acc_norm_stderr\": 0.05009082659620333\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.6060606060606061,\n \"acc_stderr\": 0.038154943086889305,\n\
\ \"acc_norm\": 0.6060606060606061,\n \"acc_norm_stderr\": 0.038154943086889305\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.5757575757575758,\n \"acc_stderr\": 0.03521224908841586,\n \"\
acc_norm\": 0.5757575757575758,\n \"acc_norm_stderr\": 0.03521224908841586\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.538860103626943,\n \"acc_stderr\": 0.035975244117345775,\n\
\ \"acc_norm\": 0.538860103626943,\n \"acc_norm_stderr\": 0.035975244117345775\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.36923076923076925,\n \"acc_stderr\": 0.024468615241478905,\n\
\ \"acc_norm\": 0.36923076923076925,\n \"acc_norm_stderr\": 0.024468615241478905\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.2962962962962963,\n \"acc_stderr\": 0.027840811495871927,\n \
\ \"acc_norm\": 0.2962962962962963,\n \"acc_norm_stderr\": 0.027840811495871927\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.40756302521008403,\n \"acc_stderr\": 0.03191863374478466,\n\
\ \"acc_norm\": 0.40756302521008403,\n \"acc_norm_stderr\": 0.03191863374478466\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.2913907284768212,\n \"acc_stderr\": 0.037101857261199946,\n \"\
acc_norm\": 0.2913907284768212,\n \"acc_norm_stderr\": 0.037101857261199946\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.5357798165137615,\n \"acc_stderr\": 0.021382364775701893,\n \"\
acc_norm\": 0.5357798165137615,\n \"acc_norm_stderr\": 0.021382364775701893\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.30092592592592593,\n \"acc_stderr\": 0.031280390843298825,\n \"\
acc_norm\": 0.30092592592592593,\n \"acc_norm_stderr\": 0.031280390843298825\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.4852941176470588,\n \"acc_stderr\": 0.03507793834791324,\n \"\
acc_norm\": 0.4852941176470588,\n \"acc_norm_stderr\": 0.03507793834791324\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.6118143459915611,\n \"acc_stderr\": 0.031722950043323296,\n \
\ \"acc_norm\": 0.6118143459915611,\n \"acc_norm_stderr\": 0.031722950043323296\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.515695067264574,\n\
\ \"acc_stderr\": 0.0335412657542081,\n \"acc_norm\": 0.515695067264574,\n\
\ \"acc_norm_stderr\": 0.0335412657542081\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.5343511450381679,\n \"acc_stderr\": 0.04374928560599738,\n\
\ \"acc_norm\": 0.5343511450381679,\n \"acc_norm_stderr\": 0.04374928560599738\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.5454545454545454,\n \"acc_stderr\": 0.045454545454545484,\n \"\
acc_norm\": 0.5454545454545454,\n \"acc_norm_stderr\": 0.045454545454545484\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.5092592592592593,\n\
\ \"acc_stderr\": 0.04832853553437056,\n \"acc_norm\": 0.5092592592592593,\n\
\ \"acc_norm_stderr\": 0.04832853553437056\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.4785276073619632,\n \"acc_stderr\": 0.0392474687675113,\n\
\ \"acc_norm\": 0.4785276073619632,\n \"acc_norm_stderr\": 0.0392474687675113\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.39285714285714285,\n\
\ \"acc_stderr\": 0.04635550135609976,\n \"acc_norm\": 0.39285714285714285,\n\
\ \"acc_norm_stderr\": 0.04635550135609976\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.6019417475728155,\n \"acc_stderr\": 0.04846748253977238,\n\
\ \"acc_norm\": 0.6019417475728155,\n \"acc_norm_stderr\": 0.04846748253977238\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.7222222222222222,\n\
\ \"acc_stderr\": 0.029343114798094472,\n \"acc_norm\": 0.7222222222222222,\n\
\ \"acc_norm_stderr\": 0.029343114798094472\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.43,\n \"acc_stderr\": 0.04975698519562428,\n \
\ \"acc_norm\": 0.43,\n \"acc_norm_stderr\": 0.04975698519562428\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.5325670498084292,\n\
\ \"acc_stderr\": 0.017841995750520874,\n \"acc_norm\": 0.5325670498084292,\n\
\ \"acc_norm_stderr\": 0.017841995750520874\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.5086705202312138,\n \"acc_stderr\": 0.026915047355369818,\n\
\ \"acc_norm\": 0.5086705202312138,\n \"acc_norm_stderr\": 0.026915047355369818\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.23910614525139665,\n\
\ \"acc_stderr\": 0.014265554192331144,\n \"acc_norm\": 0.23910614525139665,\n\
\ \"acc_norm_stderr\": 0.014265554192331144\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.5392156862745098,\n \"acc_stderr\": 0.028541722692618874,\n\
\ \"acc_norm\": 0.5392156862745098,\n \"acc_norm_stderr\": 0.028541722692618874\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.4662379421221865,\n\
\ \"acc_stderr\": 0.028333277109562793,\n \"acc_norm\": 0.4662379421221865,\n\
\ \"acc_norm_stderr\": 0.028333277109562793\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.4104938271604938,\n \"acc_stderr\": 0.027371350925124768,\n\
\ \"acc_norm\": 0.4104938271604938,\n \"acc_norm_stderr\": 0.027371350925124768\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.3475177304964539,\n \"acc_stderr\": 0.028406627809590947,\n \
\ \"acc_norm\": 0.3475177304964539,\n \"acc_norm_stderr\": 0.028406627809590947\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.3474576271186441,\n\
\ \"acc_stderr\": 0.012161417729749798,\n \"acc_norm\": 0.3474576271186441,\n\
\ \"acc_norm_stderr\": 0.012161417729749798\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.39705882352941174,\n \"acc_stderr\": 0.02972215209928007,\n\
\ \"acc_norm\": 0.39705882352941174,\n \"acc_norm_stderr\": 0.02972215209928007\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.4117647058823529,\n \"acc_stderr\": 0.019910377463105932,\n \
\ \"acc_norm\": 0.4117647058823529,\n \"acc_norm_stderr\": 0.019910377463105932\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6,\n\
\ \"acc_stderr\": 0.0469237132203465,\n \"acc_norm\": 0.6,\n \
\ \"acc_norm_stderr\": 0.0469237132203465\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.4816326530612245,\n \"acc_stderr\": 0.031987615467631264,\n\
\ \"acc_norm\": 0.4816326530612245,\n \"acc_norm_stderr\": 0.031987615467631264\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.572139303482587,\n\
\ \"acc_stderr\": 0.03498541988407795,\n \"acc_norm\": 0.572139303482587,\n\
\ \"acc_norm_stderr\": 0.03498541988407795\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.69,\n \"acc_stderr\": 0.04648231987117316,\n \
\ \"acc_norm\": 0.69,\n \"acc_norm_stderr\": 0.04648231987117316\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.39156626506024095,\n\
\ \"acc_stderr\": 0.03799857454479636,\n \"acc_norm\": 0.39156626506024095,\n\
\ \"acc_norm_stderr\": 0.03799857454479636\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.5029239766081871,\n \"acc_stderr\": 0.03834759370936839,\n\
\ \"acc_norm\": 0.5029239766081871,\n \"acc_norm_stderr\": 0.03834759370936839\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.2741738066095471,\n\
\ \"mc1_stderr\": 0.015616518497219371,\n \"mc2\": 0.436959909496514,\n\
\ \"mc2_stderr\": 0.01509621411098862\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.5887924230465666,\n \"acc_stderr\": 0.013829128358676878\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.19257012888551933,\n \
\ \"acc_stderr\": 0.010861483868509925\n }\n}\n```"
repo_url: https://huggingface.co/KnutJaegersberg/Qwen-1_8B-Chat-llama
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_01_20T02_47_07.832828
path:
- '**/details_harness|arc:challenge|25_2024-01-20T02-47-07.832828.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-01-20T02-47-07.832828.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_01_20T02_47_07.832828
path:
- '**/details_harness|gsm8k|5_2024-01-20T02-47-07.832828.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-01-20T02-47-07.832828.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_01_20T02_47_07.832828
path:
- '**/details_harness|hellaswag|10_2024-01-20T02-47-07.832828.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-01-20T02-47-07.832828.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_01_20T02_47_07.832828
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-20T02-47-07.832828.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-20T02-47-07.832828.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-20T02-47-07.832828.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-20T02-47-07.832828.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-20T02-47-07.832828.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-20T02-47-07.832828.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-20T02-47-07.832828.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-20T02-47-07.832828.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-20T02-47-07.832828.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-20T02-47-07.832828.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-20T02-47-07.832828.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-20T02-47-07.832828.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-20T02-47-07.832828.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-20T02-47-07.832828.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-20T02-47-07.832828.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-20T02-47-07.832828.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-20T02-47-07.832828.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-20T02-47-07.832828.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-20T02-47-07.832828.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-20T02-47-07.832828.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-20T02-47-07.832828.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-20T02-47-07.832828.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-20T02-47-07.832828.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-20T02-47-07.832828.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-20T02-47-07.832828.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-20T02-47-07.832828.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-20T02-47-07.832828.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-20T02-47-07.832828.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-20T02-47-07.832828.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-20T02-47-07.832828.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-20T02-47-07.832828.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-20T02-47-07.832828.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-20T02-47-07.832828.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-20T02-47-07.832828.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-01-20T02-47-07.832828.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-20T02-47-07.832828.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-20T02-47-07.832828.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-20T02-47-07.832828.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-01-20T02-47-07.832828.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-01-20T02-47-07.832828.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-20T02-47-07.832828.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-20T02-47-07.832828.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-20T02-47-07.832828.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-20T02-47-07.832828.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-20T02-47-07.832828.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-20T02-47-07.832828.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-20T02-47-07.832828.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-20T02-47-07.832828.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-20T02-47-07.832828.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-20T02-47-07.832828.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-20T02-47-07.832828.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-20T02-47-07.832828.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-20T02-47-07.832828.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-01-20T02-47-07.832828.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-20T02-47-07.832828.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-01-20T02-47-07.832828.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-20T02-47-07.832828.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-20T02-47-07.832828.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-20T02-47-07.832828.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-20T02-47-07.832828.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-20T02-47-07.832828.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-20T02-47-07.832828.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-20T02-47-07.832828.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-20T02-47-07.832828.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-20T02-47-07.832828.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-20T02-47-07.832828.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-20T02-47-07.832828.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-20T02-47-07.832828.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-20T02-47-07.832828.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-20T02-47-07.832828.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-20T02-47-07.832828.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-20T02-47-07.832828.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-20T02-47-07.832828.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-20T02-47-07.832828.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-20T02-47-07.832828.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-20T02-47-07.832828.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-20T02-47-07.832828.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-20T02-47-07.832828.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-20T02-47-07.832828.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-20T02-47-07.832828.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-20T02-47-07.832828.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-20T02-47-07.832828.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-20T02-47-07.832828.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-20T02-47-07.832828.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-20T02-47-07.832828.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-20T02-47-07.832828.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-20T02-47-07.832828.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-20T02-47-07.832828.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-20T02-47-07.832828.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-20T02-47-07.832828.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-20T02-47-07.832828.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-01-20T02-47-07.832828.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-20T02-47-07.832828.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-20T02-47-07.832828.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-20T02-47-07.832828.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-01-20T02-47-07.832828.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-01-20T02-47-07.832828.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-20T02-47-07.832828.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-20T02-47-07.832828.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-20T02-47-07.832828.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-20T02-47-07.832828.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-20T02-47-07.832828.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-20T02-47-07.832828.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-20T02-47-07.832828.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-20T02-47-07.832828.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-20T02-47-07.832828.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-20T02-47-07.832828.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-20T02-47-07.832828.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-20T02-47-07.832828.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-20T02-47-07.832828.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-01-20T02-47-07.832828.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-20T02-47-07.832828.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-01-20T02-47-07.832828.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-20T02-47-07.832828.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_01_20T02_47_07.832828
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-20T02-47-07.832828.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-20T02-47-07.832828.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_01_20T02_47_07.832828
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-20T02-47-07.832828.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-20T02-47-07.832828.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_01_20T02_47_07.832828
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-20T02-47-07.832828.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-20T02-47-07.832828.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_01_20T02_47_07.832828
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-20T02-47-07.832828.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-20T02-47-07.832828.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_01_20T02_47_07.832828
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-20T02-47-07.832828.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-20T02-47-07.832828.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_01_20T02_47_07.832828
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-20T02-47-07.832828.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-20T02-47-07.832828.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_01_20T02_47_07.832828
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-20T02-47-07.832828.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-20T02-47-07.832828.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_01_20T02_47_07.832828
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-20T02-47-07.832828.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-20T02-47-07.832828.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_01_20T02_47_07.832828
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-20T02-47-07.832828.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-20T02-47-07.832828.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_01_20T02_47_07.832828
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-20T02-47-07.832828.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-20T02-47-07.832828.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_01_20T02_47_07.832828
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-20T02-47-07.832828.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-20T02-47-07.832828.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_01_20T02_47_07.832828
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-20T02-47-07.832828.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-20T02-47-07.832828.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_01_20T02_47_07.832828
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-20T02-47-07.832828.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-20T02-47-07.832828.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_01_20T02_47_07.832828
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-20T02-47-07.832828.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-20T02-47-07.832828.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_01_20T02_47_07.832828
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-20T02-47-07.832828.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-20T02-47-07.832828.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_01_20T02_47_07.832828
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-20T02-47-07.832828.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-20T02-47-07.832828.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_01_20T02_47_07.832828
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-20T02-47-07.832828.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-20T02-47-07.832828.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_01_20T02_47_07.832828
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-20T02-47-07.832828.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-20T02-47-07.832828.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_01_20T02_47_07.832828
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-20T02-47-07.832828.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-20T02-47-07.832828.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_01_20T02_47_07.832828
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-20T02-47-07.832828.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-20T02-47-07.832828.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_01_20T02_47_07.832828
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-20T02-47-07.832828.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-20T02-47-07.832828.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_01_20T02_47_07.832828
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-20T02-47-07.832828.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-20T02-47-07.832828.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_01_20T02_47_07.832828
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-20T02-47-07.832828.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-20T02-47-07.832828.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_01_20T02_47_07.832828
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-20T02-47-07.832828.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-20T02-47-07.832828.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_01_20T02_47_07.832828
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-20T02-47-07.832828.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-20T02-47-07.832828.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_01_20T02_47_07.832828
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-20T02-47-07.832828.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-20T02-47-07.832828.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_01_20T02_47_07.832828
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-20T02-47-07.832828.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-20T02-47-07.832828.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_01_20T02_47_07.832828
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-20T02-47-07.832828.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-20T02-47-07.832828.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_01_20T02_47_07.832828
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-20T02-47-07.832828.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-20T02-47-07.832828.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_01_20T02_47_07.832828
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-20T02-47-07.832828.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-20T02-47-07.832828.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_01_20T02_47_07.832828
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-20T02-47-07.832828.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-20T02-47-07.832828.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_01_20T02_47_07.832828
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-20T02-47-07.832828.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-20T02-47-07.832828.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_01_20T02_47_07.832828
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-20T02-47-07.832828.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-20T02-47-07.832828.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_01_20T02_47_07.832828
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-20T02-47-07.832828.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-20T02-47-07.832828.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_01_20T02_47_07.832828
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-01-20T02-47-07.832828.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-01-20T02-47-07.832828.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_01_20T02_47_07.832828
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-20T02-47-07.832828.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-20T02-47-07.832828.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_01_20T02_47_07.832828
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-20T02-47-07.832828.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-20T02-47-07.832828.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_01_20T02_47_07.832828
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-20T02-47-07.832828.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-20T02-47-07.832828.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_01_20T02_47_07.832828
path:
- '**/details_harness|hendrycksTest-management|5_2024-01-20T02-47-07.832828.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-01-20T02-47-07.832828.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_01_20T02_47_07.832828
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-01-20T02-47-07.832828.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-01-20T02-47-07.832828.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_01_20T02_47_07.832828
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-20T02-47-07.832828.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-20T02-47-07.832828.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_01_20T02_47_07.832828
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-20T02-47-07.832828.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-20T02-47-07.832828.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_01_20T02_47_07.832828
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-20T02-47-07.832828.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-20T02-47-07.832828.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_01_20T02_47_07.832828
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-20T02-47-07.832828.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-20T02-47-07.832828.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_01_20T02_47_07.832828
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-20T02-47-07.832828.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-20T02-47-07.832828.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_01_20T02_47_07.832828
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-20T02-47-07.832828.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-20T02-47-07.832828.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_01_20T02_47_07.832828
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-20T02-47-07.832828.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-20T02-47-07.832828.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_01_20T02_47_07.832828
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-20T02-47-07.832828.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-20T02-47-07.832828.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_01_20T02_47_07.832828
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-20T02-47-07.832828.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-20T02-47-07.832828.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_01_20T02_47_07.832828
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-20T02-47-07.832828.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-20T02-47-07.832828.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_01_20T02_47_07.832828
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-20T02-47-07.832828.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-20T02-47-07.832828.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_01_20T02_47_07.832828
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-20T02-47-07.832828.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-20T02-47-07.832828.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_01_20T02_47_07.832828
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-20T02-47-07.832828.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-20T02-47-07.832828.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_01_20T02_47_07.832828
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-01-20T02-47-07.832828.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-01-20T02-47-07.832828.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_01_20T02_47_07.832828
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-20T02-47-07.832828.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-20T02-47-07.832828.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_01_20T02_47_07.832828
path:
- '**/details_harness|hendrycksTest-virology|5_2024-01-20T02-47-07.832828.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-01-20T02-47-07.832828.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_01_20T02_47_07.832828
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-20T02-47-07.832828.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-20T02-47-07.832828.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_01_20T02_47_07.832828
path:
- '**/details_harness|truthfulqa:mc|0_2024-01-20T02-47-07.832828.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-01-20T02-47-07.832828.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_01_20T02_47_07.832828
path:
- '**/details_harness|winogrande|5_2024-01-20T02-47-07.832828.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-01-20T02-47-07.832828.parquet'
- config_name: results
data_files:
- split: 2024_01_20T02_47_07.832828
path:
- results_2024-01-20T02-47-07.832828.parquet
- split: latest
path:
- results_2024-01-20T02-47-07.832828.parquet
---
# Dataset Card for Evaluation run of KnutJaegersberg/Qwen-1_8B-Chat-llama
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [KnutJaegersberg/Qwen-1_8B-Chat-llama](https://huggingface.co/KnutJaegersberg/Qwen-1_8B-Chat-llama) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_KnutJaegersberg__Qwen-1_8B-Chat-llama",
"harness_winogrande_5",
	split="latest")
```
## Latest results
These are the [latest results from run 2024-01-20T02:47:07.832828](https://huggingface.co/datasets/open-llm-leaderboard/details_KnutJaegersberg__Qwen-1_8B-Chat-llama/blob/main/results_2024-01-20T02-47-07.832828.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each in the "results" configuration and in the "latest" split for each eval):
```json
{
"all": {
"acc": 0.4417307007712396,
"acc_stderr": 0.03457643291788475,
"acc_norm": 0.4458531507999814,
"acc_norm_stderr": 0.03533462860998811,
"mc1": 0.2741738066095471,
"mc1_stderr": 0.015616518497219371,
"mc2": 0.436959909496514,
"mc2_stderr": 0.01509621411098862
},
"harness|arc:challenge|25": {
"acc": 0.34215017064846415,
"acc_stderr": 0.01386415215917728,
"acc_norm": 0.36945392491467577,
"acc_norm_stderr": 0.014104578366491899
},
"harness|hellaswag|10": {
"acc": 0.42959569806811393,
"acc_stderr": 0.004940067402031043,
"acc_norm": 0.5434176458872735,
"acc_norm_stderr": 0.004970933420231931
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.32,
"acc_stderr": 0.046882617226215034,
"acc_norm": 0.32,
"acc_norm_stderr": 0.046882617226215034
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.4074074074074074,
"acc_stderr": 0.04244633238353229,
"acc_norm": 0.4074074074074074,
"acc_norm_stderr": 0.04244633238353229
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.506578947368421,
"acc_stderr": 0.04068590050224971,
"acc_norm": 0.506578947368421,
"acc_norm_stderr": 0.04068590050224971
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.48,
"acc_stderr": 0.050211673156867795,
"acc_norm": 0.48,
"acc_norm_stderr": 0.050211673156867795
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.45660377358490567,
"acc_stderr": 0.030656748696739435,
"acc_norm": 0.45660377358490567,
"acc_norm_stderr": 0.030656748696739435
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.3680555555555556,
"acc_stderr": 0.04032999053960719,
"acc_norm": 0.3680555555555556,
"acc_norm_stderr": 0.04032999053960719
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.28,
"acc_stderr": 0.04512608598542127,
"acc_norm": 0.28,
"acc_norm_stderr": 0.04512608598542127
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.5,
"acc_stderr": 0.050251890762960605,
"acc_norm": 0.5,
"acc_norm_stderr": 0.050251890762960605
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.38,
"acc_stderr": 0.04878317312145632,
"acc_norm": 0.38,
"acc_norm_stderr": 0.04878317312145632
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.37572254335260113,
"acc_stderr": 0.036928207672648664,
"acc_norm": 0.37572254335260113,
"acc_norm_stderr": 0.036928207672648664
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.20588235294117646,
"acc_stderr": 0.04023382273617747,
"acc_norm": 0.20588235294117646,
"acc_norm_stderr": 0.04023382273617747
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.58,
"acc_stderr": 0.049604496374885836,
"acc_norm": 0.58,
"acc_norm_stderr": 0.049604496374885836
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.39574468085106385,
"acc_stderr": 0.03196758697835363,
"acc_norm": 0.39574468085106385,
"acc_norm_stderr": 0.03196758697835363
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.2894736842105263,
"acc_stderr": 0.04266339443159394,
"acc_norm": 0.2894736842105263,
"acc_norm_stderr": 0.04266339443159394
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.4413793103448276,
"acc_stderr": 0.04137931034482758,
"acc_norm": 0.4413793103448276,
"acc_norm_stderr": 0.04137931034482758
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.2962962962962963,
"acc_stderr": 0.023517294335963286,
"acc_norm": 0.2962962962962963,
"acc_norm_stderr": 0.023517294335963286
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.2619047619047619,
"acc_stderr": 0.039325376803928704,
"acc_norm": 0.2619047619047619,
"acc_norm_stderr": 0.039325376803928704
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695236,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695236
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.49032258064516127,
"acc_stderr": 0.028438677998909558,
"acc_norm": 0.49032258064516127,
"acc_norm_stderr": 0.028438677998909558
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.3694581280788177,
"acc_stderr": 0.03395970381998574,
"acc_norm": 0.3694581280788177,
"acc_norm_stderr": 0.03395970381998574
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.46,
"acc_stderr": 0.05009082659620333,
"acc_norm": 0.46,
"acc_norm_stderr": 0.05009082659620333
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.6060606060606061,
"acc_stderr": 0.038154943086889305,
"acc_norm": 0.6060606060606061,
"acc_norm_stderr": 0.038154943086889305
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.5757575757575758,
"acc_stderr": 0.03521224908841586,
"acc_norm": 0.5757575757575758,
"acc_norm_stderr": 0.03521224908841586
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.538860103626943,
"acc_stderr": 0.035975244117345775,
"acc_norm": 0.538860103626943,
"acc_norm_stderr": 0.035975244117345775
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.36923076923076925,
"acc_stderr": 0.024468615241478905,
"acc_norm": 0.36923076923076925,
"acc_norm_stderr": 0.024468615241478905
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.2962962962962963,
"acc_stderr": 0.027840811495871927,
"acc_norm": 0.2962962962962963,
"acc_norm_stderr": 0.027840811495871927
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.40756302521008403,
"acc_stderr": 0.03191863374478466,
"acc_norm": 0.40756302521008403,
"acc_norm_stderr": 0.03191863374478466
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.2913907284768212,
"acc_stderr": 0.037101857261199946,
"acc_norm": 0.2913907284768212,
"acc_norm_stderr": 0.037101857261199946
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.5357798165137615,
"acc_stderr": 0.021382364775701893,
"acc_norm": 0.5357798165137615,
"acc_norm_stderr": 0.021382364775701893
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.30092592592592593,
"acc_stderr": 0.031280390843298825,
"acc_norm": 0.30092592592592593,
"acc_norm_stderr": 0.031280390843298825
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.4852941176470588,
"acc_stderr": 0.03507793834791324,
"acc_norm": 0.4852941176470588,
"acc_norm_stderr": 0.03507793834791324
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.6118143459915611,
"acc_stderr": 0.031722950043323296,
"acc_norm": 0.6118143459915611,
"acc_norm_stderr": 0.031722950043323296
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.515695067264574,
"acc_stderr": 0.0335412657542081,
"acc_norm": 0.515695067264574,
"acc_norm_stderr": 0.0335412657542081
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.5343511450381679,
"acc_stderr": 0.04374928560599738,
"acc_norm": 0.5343511450381679,
"acc_norm_stderr": 0.04374928560599738
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.5454545454545454,
"acc_stderr": 0.045454545454545484,
"acc_norm": 0.5454545454545454,
"acc_norm_stderr": 0.045454545454545484
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.5092592592592593,
"acc_stderr": 0.04832853553437056,
"acc_norm": 0.5092592592592593,
"acc_norm_stderr": 0.04832853553437056
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.4785276073619632,
"acc_stderr": 0.0392474687675113,
"acc_norm": 0.4785276073619632,
"acc_norm_stderr": 0.0392474687675113
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.39285714285714285,
"acc_stderr": 0.04635550135609976,
"acc_norm": 0.39285714285714285,
"acc_norm_stderr": 0.04635550135609976
},
"harness|hendrycksTest-management|5": {
"acc": 0.6019417475728155,
"acc_stderr": 0.04846748253977238,
"acc_norm": 0.6019417475728155,
"acc_norm_stderr": 0.04846748253977238
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.7222222222222222,
"acc_stderr": 0.029343114798094472,
"acc_norm": 0.7222222222222222,
"acc_norm_stderr": 0.029343114798094472
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.43,
"acc_stderr": 0.04975698519562428,
"acc_norm": 0.43,
"acc_norm_stderr": 0.04975698519562428
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.5325670498084292,
"acc_stderr": 0.017841995750520874,
"acc_norm": 0.5325670498084292,
"acc_norm_stderr": 0.017841995750520874
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.5086705202312138,
"acc_stderr": 0.026915047355369818,
"acc_norm": 0.5086705202312138,
"acc_norm_stderr": 0.026915047355369818
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.23910614525139665,
"acc_stderr": 0.014265554192331144,
"acc_norm": 0.23910614525139665,
"acc_norm_stderr": 0.014265554192331144
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.5392156862745098,
"acc_stderr": 0.028541722692618874,
"acc_norm": 0.5392156862745098,
"acc_norm_stderr": 0.028541722692618874
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.4662379421221865,
"acc_stderr": 0.028333277109562793,
"acc_norm": 0.4662379421221865,
"acc_norm_stderr": 0.028333277109562793
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.4104938271604938,
"acc_stderr": 0.027371350925124768,
"acc_norm": 0.4104938271604938,
"acc_norm_stderr": 0.027371350925124768
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.3475177304964539,
"acc_stderr": 0.028406627809590947,
"acc_norm": 0.3475177304964539,
"acc_norm_stderr": 0.028406627809590947
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.3474576271186441,
"acc_stderr": 0.012161417729749798,
"acc_norm": 0.3474576271186441,
"acc_norm_stderr": 0.012161417729749798
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.39705882352941174,
"acc_stderr": 0.02972215209928007,
"acc_norm": 0.39705882352941174,
"acc_norm_stderr": 0.02972215209928007
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.4117647058823529,
"acc_stderr": 0.019910377463105932,
"acc_norm": 0.4117647058823529,
"acc_norm_stderr": 0.019910377463105932
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6,
"acc_stderr": 0.0469237132203465,
"acc_norm": 0.6,
"acc_norm_stderr": 0.0469237132203465
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.4816326530612245,
"acc_stderr": 0.031987615467631264,
"acc_norm": 0.4816326530612245,
"acc_norm_stderr": 0.031987615467631264
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.572139303482587,
"acc_stderr": 0.03498541988407795,
"acc_norm": 0.572139303482587,
"acc_norm_stderr": 0.03498541988407795
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.69,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.69,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-virology|5": {
"acc": 0.39156626506024095,
"acc_stderr": 0.03799857454479636,
"acc_norm": 0.39156626506024095,
"acc_norm_stderr": 0.03799857454479636
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.5029239766081871,
"acc_stderr": 0.03834759370936839,
"acc_norm": 0.5029239766081871,
"acc_norm_stderr": 0.03834759370936839
},
"harness|truthfulqa:mc|0": {
"mc1": 0.2741738066095471,
"mc1_stderr": 0.015616518497219371,
"mc2": 0.436959909496514,
"mc2_stderr": 0.01509621411098862
},
"harness|winogrande|5": {
"acc": 0.5887924230465666,
"acc_stderr": 0.013829128358676878
},
"harness|gsm8k|5": {
"acc": 0.19257012888551933,
"acc_stderr": 0.010861483868509925
}
}
```
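If you only need an aggregate number rather than the full per-task breakdown, the JSON above can be post-processed directly. A minimal sketch (the dictionary below is an illustrative two-task subset copied from the entries above, not the full results):

```python
# Compute the mean normalized accuracy over the tasks that report it.
# In practice you would load the full results JSON first; the numbers
# here are a small excerpt for illustration.
results = {
    "harness|arc:challenge|25": {"acc": 0.3422, "acc_norm": 0.3695},
    "harness|hellaswag|10": {"acc": 0.4296, "acc_norm": 0.5434},
    "harness|winogrande|5": {"acc": 0.5888},  # no acc_norm reported
}

acc_norms = [m["acc_norm"] for m in results.values() if "acc_norm" in m]
mean_acc_norm = sum(acc_norms) / len(acc_norms)
print(f"{mean_acc_norm:.4f}")
```

Tasks without an `acc_norm` field (such as winogrande above) are simply skipped by the comprehension.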
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed]

liuyanchen1015/MULTI_VALUE_stsb_medial_object_perfect
---
dataset_info:
features:
- name: sentence1
dtype: string
- name: sentence2
dtype: string
- name: score
dtype: float64
- name: idx
dtype: int64
- name: value_score
dtype: int64
splits:
- name: dev
num_bytes: 6461
num_examples: 31
- name: test
num_bytes: 5303
num_examples: 25
- name: train
num_bytes: 21565
num_examples: 82
download_size: 33167
dataset_size: 33329
---
# Dataset Card for "MULTI_VALUE_stsb_medial_object_perfect"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)

AdapterOcean/python3-standardized_cluster_12
---
dataset_info:
features:
- name: text
dtype: string
- name: conversation_id
dtype: int64
- name: embedding
sequence: float64
- name: cluster
dtype: int64
splits:
- name: train
num_bytes: 109821390
num_examples: 9814
download_size: 0
dataset_size: 109821390
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "python3-standardized_cluster_12"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)

najju/sign-psl-13b_d1
---
dataset_info:
features:
- name: Text
dtype: string
- name: Gloss
dtype: string
splits:
- name: train
num_bytes: 256552
num_examples: 4014
download_size: 158938
dataset_size: 256552
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
nblinh63/twitter_dataset_1712689800
---
dataset_info:
features:
- name: id
dtype: string
- name: tweet_content
dtype: string
- name: user_name
dtype: string
- name: user_id
dtype: string
- name: created_at
dtype: string
- name: url
dtype: string
- name: favourite_count
dtype: int64
- name: scraped_at
dtype: string
- name: image_urls
dtype: string
splits:
- name: train
num_bytes: 80461
num_examples: 201
download_size: 38420
dataset_size: 80461
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
Prajapat/fittess_test
---
dataset_info:
features:
- name: Human
dtype: string
- name: Assistant
dtype: string
splits:
- name: train
num_bytes: 61636.0
num_examples: 186
download_size: 29313
dataset_size: 61636.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
Macropodus/MWP-Instruct
---
license: apache-2.0
---
breno30/Barreto
---
license: openrail
---
chuyin0321/short-interest-stocks
---
dataset_info:
features:
- name: symbol
dtype: string
- name: date
dtype: string
- name: id
dtype: int64
- name: settlement_date
dtype: timestamp[ns]
- name: interest
dtype: float64
- name: avg_daily_share_volume
dtype: float64
- name: days_to_cover
dtype: float64
splits:
- name: train
num_bytes: 1100954
num_examples: 17724
download_size: 437394
dataset_size: 1100954
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "short-interest-stocks"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)

BAAI/OPI
---
extra_gated_heading: Acknowledge license to accept the repository
extra_gated_prompt: >
The Beijing Academy of Artificial Intelligence (hereinafter referred to as
"we" or "BAAI") provides you with an open-source dataset (hereinafter referred
to as "dataset") through the OPI HuggingFace repository
(https://huggingface.co/datasets/BAAI/OPI). You can download the dataset you
need and use it for purposes such as learning and research while abiding by
the usage rules of each original dataset.
Before you acquire the open-source dataset (including but not limited to
accessing, downloading, copying, distributing, using, or any other handling of
the dataset), you should read and understand this "OPI Open-Source Dataset
Usage Notice and Disclaimer" (hereinafter referred to as "this statement").
Once you acquire the open-source dataset, regardless of your method of
acquisition, your actions will be regarded as acknowledgment of the full
content of this statement.
1. Ownership and Operation Rights
You should fully understand that the ownership and operation rights of the OPI
HuggingFace repository (including the current and all previous versions)
belong to BAAI. BAAI has the final interpretation and decision rights over
this platform/tool and the open-source dataset plan.
You acknowledge and understand that due to updates and improvements in
relevant laws and regulations and the need to fulfill our legal compliance
obligations, we reserve the right to update, maintain, or even suspend or
permanently terminate the services of this platform/tool from time to time. We
will notify you of possible situations mentioned above reasonably such as
through an announcement or email within a reasonable time. You should make
corresponding adjustments and arrangements in a timely manner. However, we do
not bear any responsibility for any losses caused to you by any of the
aforementioned situations.
2. Claim of Rights to Open-Source Datasets
For the purpose of facilitating your dataset acquisition and use for learning,
and research, we have performed necessary steps such as format integration,
data cleaning, labeling, categorizing, annotating, and other related
processing on the third-party original datasets to form the open-source
datasets for this platform/tool's users.
You understand and acknowledge that we do not claim the proprietary rights of
intellectual property to the open-source datasets. Therefore, we have no
obligation to actively recognize and protect the potential intellectual
property of the open-source datasets. However, this does not mean that we
renounce the personal rights to claim credit, publication, modification, and
protection of the integrity of the work (if any) of the open-source datasets.
The potential intellectual property and corresponding legal rights of the
original datasets belong to the original rights holders.
In addition, providing you with open-source datasets that have been reasonably
arranged, processed, and handled does not mean that we acknowledge the
authenticity, accuracy, or indisputability of the intellectual property and
information content of the original datasets. You should filter and carefully
discern the open-source datasets you choose to use. You understand and agree
that BAAI does not undertake any obligation or warranty responsibility for any
defects or flaws in the original datasets you choose to use.
3. Usage Restrictions for Open-Source Datasets
Your use of the dataset must not infringe on our or any third party's legal
rights and interests (including but not limited to copyrights, patent rights,
trademark rights, and other intellectual property and other rights).
After obtaining the open-source dataset, you should ensure that your use of
the open-source dataset does not exceed the usage rules explicitly stipulated
by the rights holders of the original dataset in the form of a public notice
or agreement, including the range, purpose, and lawful purposes of the use of
the original data. We kindly remind you here that if your use of the
open-source dataset exceeds the predetermined range and purpose of the
original dataset, you may face the risk of infringing on the legal rights and
interests of the rights holders of the original dataset, such as intellectual
property, and may bear corresponding legal responsibilities.
4. Personal Information Protection
Due to technical limitations and the public welfare nature of the open-source
datasets, we cannot guarantee that the open-source datasets do not contain any
personal information, and we do not bear any legal responsibility for any
personal information that may be involved in the open-source datasets.
If the open-source dataset involves personal information, we do not bear any
legal responsibility for any personal information processing activities you
may involve when using the open-source dataset. We kindly remind you here that
you should handle personal information in accordance with the provisions of
the "Personal Information Protection Law" and other relevant laws and
regulations.
To protect the legal rights and interests of the information subject and to
fulfill possible applicable laws and administrative regulations, if you find
content that involves or may involve personal information during the use of
the open-source dataset, you should immediately stop using the part of the
dataset that involves personal information and contact us as indicated in "6.
Complaints and Notices."
5. Information Content Management
We do not bear any legal responsibility for any illegal and bad information
that may be involved in the open-source dataset.
If you find that the open-source dataset involves or may involve any illegal
and bad information during your use, you should immediately stop using the
part of the dataset that involves illegal and bad information and contact us
in a timely manner as indicated in "6. Complaints and Notices."
6. Complaints and Notices
If you believe that the open-source dataset has infringed on your legal rights
and interests, you can contact us at 010-50955974, and we will handle your
claims and complaints in accordance with the law in a timely manner.
To handle your claims and complaints, we may need you to provide contact
information, infringement proof materials, and identity proof materials.
Please note that if you maliciously complain or make false statements, you
will bear all legal responsibilities caused thereby (including but not limited
to reasonable compensation costs).
7. Disclaimer
You understand and agree that due to the nature of the open-source dataset,
the dataset may contain data from different sources and contributors, and the
authenticity, accuracy, and objectivity of the data may vary, and we cannot
make any promises about the availability and reliability of any dataset.
In any case, we bear no legal responsibility for any risks that may exist in the open-source dataset, such as infringement of personal information, dissemination of illegal or harmful information, or infringement of intellectual property.
In any case, we bear no legal responsibility for any loss (including but not limited to direct loss, indirect loss, and loss of potential benefits) that you suffer from, or that is related to, the open-source dataset.
8. Others
The open-source dataset is in a constant state of development and change. We may update or adjust the scope of the open-source dataset we provide, or suspend, pause, or terminate the open-source dataset service, due to business development, third-party cooperation, changes in laws and regulations, or other reasons.
extra_gated_fields:
Name: text
Affiliation: text
Country: text
I agree to accept the license: checkbox
extra_gated_button_content: Acknowledge license
license: cc-by-nc-4.0
language:
- en
tags:
- biology
- protein
- instruction dataset
- instruction tuning
pretty_name: Open Protein Instructions(OPI)
size_categories:
- 1M<n<10M
task_categories:
- text-generation
---

# Dataset Card for Open Protein Instructions (OPI)
## Dataset Update
The previous version of the OPI dataset is based on the **release 2022_01** of the UniProtKB/Swiss-Prot protein knowledgebase. OPI has now been updated to the latest **release 2023_05**, which can be accessed via the dataset file [OPI_updated_160k.json](./OPI_DATA/OPI_updated_160k.json).
References:
- https://ftp.uniprot.org/pub/databases/uniprot/previous_releases/release-2022_01/knowledgebase/UniProtKB_SwissProt-relstat.html
- https://ftp.uniprot.org/pub/databases/uniprot/previous_releases/release-2023_05/knowledgebase/UniProtKB_SwissProt-relstat.html
## Dataset Description
- **Homepage:**
- **Repository:**
- **Paper:**
- **Leaderboard:**
- **Point of Contact:**
### Dataset Summary
Open Protein Instructions (OPI) is the initial part of the Open Biology Instructions (OBI) project, together with the subsequent Open Molecule Instructions (OMI), Open DNA Instructions (ODI), Open RNA Instructions (ORI), and Open Single-cell Instructions (OSCI). OBI aims to fully leverage the potential of Large Language Models (LLMs), especially scientific LLMs such as Galactica, to facilitate research in the AI for Life Science community. While OBI is still at an early stage, we hope it provides a starting point for the community to bridge LLMs and biological domain knowledge.
## Dataset Structure
### Data Instances
```
instruction:
What is the EC classification of the input protein sequence based on its biological function?
input:
MGLVSSKKPDKEKPIKEKDKGQWSPLKVSAQDKDAPPLPPLVVFNHLTPPPPDEHLDEDKHFVVALYDYTAMNDRDLQMLKGEKLQVLKGTGDWWLARS
LVTGREGYVPSNFVARVESLEMERWFFRSQGRKEAERQLLAPINKAGSFLIRESETNKGAFSLSVKDVTTQGELIKHYKIRCLDEGGYYISPRITFPSL
QALVQHYSKKGDGLCQRLTLPCVRPAPQNPWAQDEWEIPRQSLRLVRKLGSGQFGEVWMGYYKNNMKVAIKTLKEGTMSPEAFLGEANVMKALQHERLV
RLYAVVTKEPIYIVTEYMARGCLLDFLKTDEGSRLSLPRLIDMSAQIAEGMAYIERMNSIHRDLRAANILVSEALCCKIADFGLARIIDSEYTAQEGAK
FPIKWTAPEAIHFGVFTIKADVWSFGVLLMEVVTYGRVPYPGMSNPEVIRNLERGYRMPRPDTCPPELYRGVIAECWRSRPEERPTFEFLQSVLEDFYT
ATERQYELQP
output:
2.7.10.2
```
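Since each record carries an `instruction`, an `input`, and an `output` field, a supervised fine-tuning prompt can be assembled directly from the three fields. A minimal sketch in Python (the Alpaca-style template below is an illustrative assumption, not a format prescribed by this card):

```python
# Build a fine-tuning prompt from one OPI record.
# The "### Instruction / ### Input / ### Response" template is an
# assumption for illustration; adapt it to your tuning framework.

def format_prompt(record: dict) -> str:
    """Assemble an instruction-tuning prompt from an OPI record."""
    return (
        f"### Instruction:\n{record['instruction']}\n\n"
        f"### Input:\n{record['input']}\n\n"
        f"### Response:\n{record['output']}"
    )

example = {
    "instruction": "What is the EC classification of the input protein "
                   "sequence based on its biological function?",
    # Protein sequence truncated here for brevity.
    "input": "MGLVSSKKPDKEKPIKEKDKGQWSPLKVSAQDKDAPPLPPLVVFNHLTPPPPDEHLDEDK",
    "output": "2.7.10.2",
}

print(format_prompt(example))
```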
### Data Splits
The OPI dataset folder structure is as follows:
```
./OPI_DATA/
├── AP
│   ├── Function
│   │   ├── test
│   │   │   ├── CASPSimilarSeq_function_test.jsonl
│   │   │   ├── IDFilterSeq_function_test.jsonl
│   │   │   └── UniProtSeq_function_test.jsonl
│   │   └── train
│   │       ├── function_description_train.json
│   │       └── function_description_train_0.01.json
│   ├── GO
│   │   ├── test
│   │   │   ├── CASPSimilarSeq_go_test.jsonl
│   │   │   ├── IDFilterSeq_go_test.jsonl
│   │   │   └── UniProtSeq_go_test.jsonl
│   │   └── train
│   │       ├── go_terms_train.json
│   │       └── go_terms_train_0.01.json
│   └── Keywords
│       ├── test
│       │   ├── CASPSimilarSeq_keywords_test.jsonl
│       │   ├── IDFilterSeq_keywords_test.jsonl
│       │   └── UniProtSeq_keywords_test.jsonl
│       └── train
│           ├── keywords_train.json
│           └── keywords_train_0.01.json
├── KM
│   ├── gSymbol2Cancer
│   │   ├── test
│   │   │   └── gene_symbol_to_cancer_test.jsonl
│   │   └── train
│   │       └── gene_symbol_to_cancer_train.json
│   ├── gName2Cancer
│   │   ├── test
│   │   │   └── gene_name_to_cancer_test.jsonl
│   │   └── train
│   │       └── gene_name_to_cancer_train.json
│   └── gSymbol2Tissue
│       ├── test
│       │   └── gene_symbol_to_tissue_test.jsonl
│       └── train
│           └── gene_symbol_to_tissue_train.json
└── SU
    ├── EC_number
    │   ├── test
    │   │   ├── CLEAN_EC_number_new_test.jsonl
    │   │   └── CLEAN_EC_number_price_test.jsonl
    │   └── train
    │       └── CLEAN_EC_number_train.json
    ├── Fold_type-Remote
    │   ├── test
    │   │   └── Remote_test.jsonl
    │   └── train
    │       └── Remote_train.json
    └── Subcellular_location
        ├── test
        │   └── location_test.jsonl
        └── train
            └── location_train.json
```
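The test splits are stored as JSON Lines files (`*.jsonl`, one record per line) and the train splits as single JSON files. A minimal loader sketch in Python, assuming the train files are JSON arrays of records (the example paths in the comments are files from the tree above that you would need to download locally first):

```python
import json

def load_jsonl(path):
    """Read a *.jsonl test split: one JSON record per line."""
    with open(path, encoding="utf-8") as f:
        return [json.loads(line) for line in f if line.strip()]

def load_json(path):
    """Read a *.json train split: assumed to be a JSON array of records."""
    with open(path, encoding="utf-8") as f:
        return json.load(f)

# Illustrative usage -- substitute files you have downloaded locally:
# test_records = load_jsonl("OPI_DATA/SU/EC_number/test/CLEAN_EC_number_new_test.jsonl")
# train_records = load_json("OPI_DATA/SU/EC_number/train/CLEAN_EC_number_train.json")
```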
## Dataset Creation
The OPI dataset is curated in-house by extracting key information from the [Swiss-Prot](https://www.uniprot.org/uniprotkb?facets=reviewed%3Atrue&query=%2A) database. The detailed construction pipeline is described in the supplementary material of our manuscript, which has been submitted to NeurIPS 2023 Datasets and Benchmarks. The following figure shows the overall construction process.

## License
The dataset is licensed under a Creative Commons Attribution-NonCommercial 4.0 License. Use of this dataset should also abide by the original [License & Disclaimer](https://www.uniprot.org/help/license) and [Privacy Notice](https://www.uniprot.org/help/privacy) of UniProt.