| datasetId | card |
|---|---|
mirfan899/uner-ner | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
- split: test
path: data/test-*
dataset_info:
features:
- name: id
dtype: string
- name: tokens
sequence: string
- name: ner_tags
sequence:
class_label:
names:
'0': DATE
'1': DESIGNATION
'2': LOCATION
'3': NUMBER
'4': O
'5': ORGANIZATION
'6': PERSON
'7': TIME
splits:
- name: train
num_bytes: 682695
num_examples: 1145
- name: validation
num_bytes: 302036
num_examples: 491
- name: test
num_bytes: 302036
num_examples: 491
download_size: 0
dataset_size: 1286767
---
# Dataset Card for "uner-ner"
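The `class_label` block above fixes the integer-to-name mapping for `ner_tags`. As a minimal sketch (plain Python, with the label order taken directly from the YAML above and a hypothetical tag sequence), decoding a row's tags looks like:

```python
# Label order as declared in this card's YAML `class_label` block.
LABEL_NAMES = ["DATE", "DESIGNATION", "LOCATION", "NUMBER",
               "O", "ORGANIZATION", "PERSON", "TIME"]

def decode_tags(tag_ids):
    """Map the integer `ner_tags` of one example back to string labels."""
    return [LABEL_NAMES[t] for t in tag_ids]

# A hypothetical row tagged [6, 4, 4, 2] decodes to:
print(decode_tags([6, 4, 4, 2]))  # ['PERSON', 'O', 'O', 'LOCATION']
```

With the `datasets` library, the same mapping is available without hard-coding it, via `ds.features["ner_tags"].feature.names` after `load_dataset("mirfan899/uner-ner")`.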
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
distilled-from-one-sec-cv12/chunk_29 | ---
dataset_info:
features:
- name: logits
sequence: float32
- name: mfcc
sequence:
sequence: float64
splits:
- name: train
num_bytes: 872830032
num_examples: 170076
download_size: 892997828
dataset_size: 872830032
---
# Dataset Card for "chunk_29"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
AdapterOcean/python-code-instructions-18k-alpaca-standardized_cluster_3_std | ---
dataset_info:
features:
- name: message
dtype: string
- name: message_type
dtype: string
- name: message_id
dtype: int64
- name: conversation_id
dtype: int64
- name: cluster
dtype: float64
- name: __index_level_0__
dtype: int64
splits:
- name: train
num_bytes: 4218713
num_examples: 7053
download_size: 1835077
dataset_size: 4218713
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "python-code-instructions-18k-alpaca-standardized_cluster_3_std"
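Given the schema above (one row per message, keyed by `conversation_id` and ordered by `message_id`), conversations can be rebuilt by grouping rows. A hedged sketch, using hypothetical in-memory rows as a stand-in for the actual `load_dataset` output:

```python
from collections import defaultdict

# Hypothetical rows mirroring this card's schema; real rows would come from
# load_dataset("AdapterOcean/python-code-instructions-18k-alpaca-standardized_cluster_3_std").
rows = [
    {"message": "Write a sort function.", "message_type": "instruction",
     "message_id": 0, "conversation_id": 7},
    {"message": "def sort(xs): return sorted(xs)", "message_type": "output",
     "message_id": 1, "conversation_id": 7},
]

# Rebuild conversations: group on conversation_id, keep message_id order.
conversations = defaultdict(list)
for r in sorted(rows, key=lambda r: r["message_id"]):
    conversations[r["conversation_id"]].append((r["message_type"], r["message"]))

print(conversations[7][0])  # ('instruction', 'Write a sort function.')
```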
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
CyberHarem/degenbrecher_arknights | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of degenbrecher/ι (Arknights)
This is the dataset of degenbrecher/ι (Arknights), containing 141 images and their tags.
The core tags of this character are `long_hair, animal_ears, blonde_hair, horns, goat_horns, goat_ears, hair_between_eyes, goat_girl, breasts, yellow_eyes, very_long_hair, large_breasts, long_bangs, animal_ear_fluff, brown_horns, sidelocks, asymmetrical_sidelocks`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan, ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:------------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 141 | 264.74 MiB | [Download](https://huggingface.co/datasets/CyberHarem/degenbrecher_arknights/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 1200 | 141 | 222.57 MiB | [Download](https://huggingface.co/datasets/CyberHarem/degenbrecher_arknights/resolve/main/dataset-1200.zip) | IMG+TXT | Dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 356 | 422.36 MiB | [Download](https://huggingface.co/datasets/CyberHarem/degenbrecher_arknights/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need it, just run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/degenbrecher_arknights',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering results; some outfits may be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 12 |  |  |  |  |  | 1girl, black_gloves, black_necktie, collared_shirt, dress_shirt, holding_sword, long_sleeves, military_jacket, military_uniform, orange_shirt, buttons, looking_at_viewer, notched_lapels, solo, standing, wing_collar, cowboy_shot, armband, closed_mouth, medal, green_jacket, sheath, green_pants, simple_background, single_pauldron, black_jacket, crossed_bangs, floating_hair, medium_breasts, thigh_strap, white_background, hair_flowing_over, v-shaped_eyebrows, white_pupils |
| 1 | 5 |  |  |  |  |  | 1girl, armband, black_gloves, black_necktie, collared_shirt, dress_shirt, green_jacket, long_sleeves, medal, military_jacket, military_uniform, notched_lapels, orange_shirt, single_epaulette, single_pauldron, solo, upper_body, wing_collar, closed_mouth, hand_up, looking_at_viewer, simple_background, breast_pocket, buttons, crossed_bangs, looking_to_the_side, medium_breasts, adjusting_clothes, ahoge, black_background, brown_eyes, hair_flowing_over, hand_on_own_chest, white_background, white_pupils |
| 2 | 5 |  |  |  |  |  | 1girl, black_necktie, collared_shirt, dress_shirt, long_sleeves, looking_at_viewer, military_jacket, military_uniform, notched_lapels, simple_background, single_pauldron, solo, upper_body, closed_mouth, green_jacket, medal, orange_shirt, armband, white_background, white_pupils, wing_collar, ahoge, buttons, crossed_bangs, parted_lips, pocket |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | black_gloves | black_necktie | collared_shirt | dress_shirt | holding_sword | long_sleeves | military_jacket | military_uniform | orange_shirt | buttons | looking_at_viewer | notched_lapels | solo | standing | wing_collar | cowboy_shot | armband | closed_mouth | medal | green_jacket | sheath | green_pants | simple_background | single_pauldron | black_jacket | crossed_bangs | floating_hair | medium_breasts | thigh_strap | white_background | hair_flowing_over | v-shaped_eyebrows | white_pupils | single_epaulette | upper_body | hand_up | breast_pocket | looking_to_the_side | adjusting_clothes | ahoge | black_background | brown_eyes | hand_on_own_chest | parted_lips | pocket |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:---------------|:----------------|:-----------------|:--------------|:----------------|:---------------|:------------------|:-------------------|:---------------|:----------|:--------------------|:-----------------|:-------|:-----------|:--------------|:--------------|:----------|:---------------|:--------|:---------------|:---------|:--------------|:--------------------|:------------------|:---------------|:----------------|:----------------|:-----------------|:--------------|:-------------------|:--------------------|:--------------------|:---------------|:-------------------|:-------------|:----------|:----------------|:----------------------|:--------------------|:--------|:-------------------|:-------------|:--------------------|:--------------|:---------|
| 0 | 12 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | |
| 1 | 5 |  |  |  |  |  | X | X | X | X | X | | X | X | X | X | X | X | X | X | | X | | X | X | X | X | | | X | X | | X | | X | | X | X | | X | X | X | X | X | X | X | X | X | X | X | | |
| 2 | 5 |  |  |  |  |  | X | | X | X | X | | X | X | X | X | X | X | X | X | | X | | X | X | X | X | | | X | X | | X | | | | X | | | X | | X | | | | | X | | | | X | X |
|
Ediudo/tildo | ---
license: openrail
---
|
p1atdev/dart-tokenized-pretrain-20240219 | ---
dataset_info:
features:
- name: tag_text
dtype: string
- name: input_ids
sequence: int32
- name: token_type_ids
sequence: int8
- name: attention_mask
sequence: int8
splits:
- name: train
num_bytes: 4086741056
num_examples: 5293004
download_size: 1355055068
dataset_size: 4086741056
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
drag88/snitch_image_binary_with_proddesc | ---
dataset_info:
features:
- name: Price
dtype: float64
- name: Product Description
dtype: string
- name: Product ID
dtype: float64
- name: Product Name
dtype: string
- name: Store
dtype: string
- name: Tags
dtype: string
- name: Vendor
dtype: string
- name: Size
sequence: string
- name: Product Image Link
dtype: string
- name: image_bytes
dtype: binary
- name: enhanced_description
dtype: string
splits:
- name: train
num_bytes: 1524446448
num_examples: 8832
download_size: 1335673004
dataset_size: 1524446448
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
Qdrant/ColBERT-TREC-COVID | ---
dataset_info:
features:
- name: documents
sequence:
sequence: float16
splits:
- name: train
num_bytes: 8019022928
num_examples: 171332
download_size: 5775769873
dataset_size: 8019022928
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
license: mit
task_categories:
- feature-extraction
language:
- en
tags:
- medical
pretty_name: ColBERT TREC COVID
size_categories:
- 100K<n<1M
---
This dataset consists of ColBERTv2.0 document vectors for the entire TREC-COVID dataset from BeIR: 128 dimensions per token, with 180 tokens for each of the 171,332 documents.
The dataset was created using A100-40GB sponsored by Qdrant. The code to create these vectors is here: https://colab.research.google.com/drive/1hEhyleSrBz_mPyQJnRc0MwBenDuX1ahY?usp=sharing
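Each entry in `documents` is therefore a 180×128 matrix of per-token vectors. The sketch below is not from this repo: the `maxsim` scorer is the standard ColBERT late-interaction formula, and the all-zero document is a hypothetical stand-in for a real row, shown only to illustrate the expected shapes:

```python
import numpy as np

# Hypothetical single document entry: 180 token vectors, 128 dims each,
# matching the shapes described above (stored as float16 on disk).
doc = np.zeros((180, 128), dtype=np.float16)

def maxsim(query_vecs, doc_vecs):
    """ColBERT late-interaction score: for each query token, take the best
    dot product against any document token, then sum over query tokens."""
    sims = query_vecs @ doc_vecs.T          # shape: (q_tokens, d_tokens)
    return sims.max(axis=1).sum()

q = np.ones((4, 128), dtype=np.float32)
print(maxsim(q, doc.astype(np.float32)))  # 0.0 for the all-zero document
```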
This dataset was created for indexing experiments by Qdrant. |
deepapaikar/KatzBot_QnA_Test | ---
license: apache-2.0
---
|
Jacksparrowvk/my | ---
license: mit
---
|
Nexdata/Indonesian_Conversational_Speech_Data_by_Telephone | ---
task_categories:
- automatic-speech-recognition
language:
- id
---
# Dataset Card for Nexdata/Indonesian_Conversational_Speech_Data_by_Telephone
## Description
This 89-hour Indonesian conversational speech dataset, collected by telephone, involved 124 native speakers and was developed with a proper gender balance. Speakers chose a few familiar topics from a given list and started conversations, to ensure the dialogues' fluency and naturalness. The recording devices were various mobile phones. The audio format is 8kHz, 8bit, u-law PCM, and all the speech data was recorded in quiet indoor environments. All the speech audio was manually transcribed with the text content, the start and end time of each effective sentence, and speaker identification.
For more details, please refer to the link: https://www.nexdata.ai/datasets/1311?source=Huggingface
# Specifications
## Format
8kHz, 8bit, u-law PCM, mono channel;
## Environment
quiet indoor environment, without echo;
## Recording content
dozens of topics are specified, and the speakers make dialogue under those topics while the recording is performed;
## Demographics
140 speakers in total, 54% male and 46% female
## Annotation
annotation of the transcription text, speaker identification, and gender
## Device
Android mobile phone, iPhone;
## Language
Indonesian;
## Application scenarios
speech recognition; voiceprint recognition;
## Accuracy rate
the word accuracy rate is not less than 98%
# Licensing Information
Commercial License |
open-llm-leaderboard/details_ParasiticRogue__Merged-RP-Stew-V2-34B | ---
pretty_name: Evaluation run of ParasiticRogue/Merged-RP-Stew-V2-34B
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [ParasiticRogue/Merged-RP-Stew-V2-34B](https://huggingface.co/ParasiticRogue/Merged-RP-Stew-V2-34B)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_ParasiticRogue__Merged-RP-Stew-V2-34B\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-04-15T23:07:10.295080](https://huggingface.co/datasets/open-llm-leaderboard/details_ParasiticRogue__Merged-RP-Stew-V2-34B/blob/main/results_2024-04-15T23-07-10.295080.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.7711704868828757,\n\
\ \"acc_stderr\": 0.0276721684770019,\n \"acc_norm\": 0.7758442981381146,\n\
\ \"acc_norm_stderr\": 0.028183094757783765,\n \"mc1\": 0.4283965728274174,\n\
\ \"mc1_stderr\": 0.017323088597314754,\n \"mc2\": 0.5792788550440546,\n\
\ \"mc2_stderr\": 0.015335521477635526\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.6390784982935154,\n \"acc_stderr\": 0.014034761386175452,\n\
\ \"acc_norm\": 0.6706484641638225,\n \"acc_norm_stderr\": 0.013734057652635476\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6704839673371839,\n\
\ \"acc_stderr\": 0.004690768393854473,\n \"acc_norm\": 0.8605855407289384,\n\
\ \"acc_norm_stderr\": 0.0034567060380547555\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.48,\n \"acc_stderr\": 0.050211673156867795,\n \
\ \"acc_norm\": 0.48,\n \"acc_norm_stderr\": 0.050211673156867795\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.725925925925926,\n\
\ \"acc_stderr\": 0.03853254836552003,\n \"acc_norm\": 0.725925925925926,\n\
\ \"acc_norm_stderr\": 0.03853254836552003\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.9078947368421053,\n \"acc_stderr\": 0.02353268597044349,\n\
\ \"acc_norm\": 0.9078947368421053,\n \"acc_norm_stderr\": 0.02353268597044349\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.8,\n\
\ \"acc_stderr\": 0.040201512610368445,\n \"acc_norm\": 0.8,\n \
\ \"acc_norm_stderr\": 0.040201512610368445\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.8113207547169812,\n \"acc_stderr\": 0.024079995130062253,\n\
\ \"acc_norm\": 0.8113207547169812,\n \"acc_norm_stderr\": 0.024079995130062253\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.9166666666666666,\n\
\ \"acc_stderr\": 0.023112508176051236,\n \"acc_norm\": 0.9166666666666666,\n\
\ \"acc_norm_stderr\": 0.023112508176051236\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.51,\n \"acc_stderr\": 0.05024183937956912,\n \
\ \"acc_norm\": 0.51,\n \"acc_norm_stderr\": 0.05024183937956912\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.62,\n \"acc_stderr\": 0.04878317312145633,\n \"acc_norm\": 0.62,\n\
\ \"acc_norm_stderr\": 0.04878317312145633\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.41,\n \"acc_stderr\": 0.049431107042371025,\n \
\ \"acc_norm\": 0.41,\n \"acc_norm_stderr\": 0.049431107042371025\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.7398843930635838,\n\
\ \"acc_stderr\": 0.03345036916788992,\n \"acc_norm\": 0.7398843930635838,\n\
\ \"acc_norm_stderr\": 0.03345036916788992\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.5490196078431373,\n \"acc_stderr\": 0.04951218252396262,\n\
\ \"acc_norm\": 0.5490196078431373,\n \"acc_norm_stderr\": 0.04951218252396262\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.83,\n \"acc_stderr\": 0.03775251680686371,\n \"acc_norm\": 0.83,\n\
\ \"acc_norm_stderr\": 0.03775251680686371\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.825531914893617,\n \"acc_stderr\": 0.024809442335503976,\n\
\ \"acc_norm\": 0.825531914893617,\n \"acc_norm_stderr\": 0.024809442335503976\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.631578947368421,\n\
\ \"acc_stderr\": 0.04537815354939391,\n \"acc_norm\": 0.631578947368421,\n\
\ \"acc_norm_stderr\": 0.04537815354939391\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.8,\n \"acc_stderr\": 0.033333333333333284,\n \
\ \"acc_norm\": 0.8,\n \"acc_norm_stderr\": 0.033333333333333284\n \
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.7248677248677249,\n \"acc_stderr\": 0.02300008685906865,\n \"\
acc_norm\": 0.7248677248677249,\n \"acc_norm_stderr\": 0.02300008685906865\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.5476190476190477,\n\
\ \"acc_stderr\": 0.044518079590553275,\n \"acc_norm\": 0.5476190476190477,\n\
\ \"acc_norm_stderr\": 0.044518079590553275\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.59,\n \"acc_stderr\": 0.04943110704237102,\n \
\ \"acc_norm\": 0.59,\n \"acc_norm_stderr\": 0.04943110704237102\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.9193548387096774,\n\
\ \"acc_stderr\": 0.015490002961591028,\n \"acc_norm\": 0.9193548387096774,\n\
\ \"acc_norm_stderr\": 0.015490002961591028\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.6699507389162561,\n \"acc_stderr\": 0.03308530426228258,\n\
\ \"acc_norm\": 0.6699507389162561,\n \"acc_norm_stderr\": 0.03308530426228258\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.82,\n \"acc_stderr\": 0.038612291966536955,\n \"acc_norm\"\
: 0.82,\n \"acc_norm_stderr\": 0.038612291966536955\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.8787878787878788,\n \"acc_stderr\": 0.025485498373343237,\n\
\ \"acc_norm\": 0.8787878787878788,\n \"acc_norm_stderr\": 0.025485498373343237\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.9242424242424242,\n \"acc_stderr\": 0.018852670234993086,\n \"\
acc_norm\": 0.9242424242424242,\n \"acc_norm_stderr\": 0.018852670234993086\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.9792746113989638,\n \"acc_stderr\": 0.010281417011909039,\n\
\ \"acc_norm\": 0.9792746113989638,\n \"acc_norm_stderr\": 0.010281417011909039\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.8307692307692308,\n \"acc_stderr\": 0.01901100452365105,\n \
\ \"acc_norm\": 0.8307692307692308,\n \"acc_norm_stderr\": 0.01901100452365105\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.42592592592592593,\n \"acc_stderr\": 0.03014913560136595,\n \
\ \"acc_norm\": 0.42592592592592593,\n \"acc_norm_stderr\": 0.03014913560136595\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.8571428571428571,\n \"acc_stderr\": 0.02273020811930654,\n \
\ \"acc_norm\": 0.8571428571428571,\n \"acc_norm_stderr\": 0.02273020811930654\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.4966887417218543,\n \"acc_stderr\": 0.04082393379449654,\n \"\
acc_norm\": 0.4966887417218543,\n \"acc_norm_stderr\": 0.04082393379449654\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.9211009174311927,\n \"acc_stderr\": 0.011558198113769572,\n \"\
acc_norm\": 0.9211009174311927,\n \"acc_norm_stderr\": 0.011558198113769572\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.6666666666666666,\n \"acc_stderr\": 0.03214952147802749,\n \"\
acc_norm\": 0.6666666666666666,\n \"acc_norm_stderr\": 0.03214952147802749\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.9215686274509803,\n \"acc_stderr\": 0.018869514646658928,\n \"\
acc_norm\": 0.9215686274509803,\n \"acc_norm_stderr\": 0.018869514646658928\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.8987341772151899,\n \"acc_stderr\": 0.019637720526065508,\n \
\ \"acc_norm\": 0.8987341772151899,\n \"acc_norm_stderr\": 0.019637720526065508\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.8026905829596412,\n\
\ \"acc_stderr\": 0.02670985334496796,\n \"acc_norm\": 0.8026905829596412,\n\
\ \"acc_norm_stderr\": 0.02670985334496796\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.8931297709923665,\n \"acc_stderr\": 0.027096548624883733,\n\
\ \"acc_norm\": 0.8931297709923665,\n \"acc_norm_stderr\": 0.027096548624883733\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.9090909090909091,\n \"acc_stderr\": 0.02624319405407388,\n \"\
acc_norm\": 0.9090909090909091,\n \"acc_norm_stderr\": 0.02624319405407388\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.8981481481481481,\n\
\ \"acc_stderr\": 0.029239272675632748,\n \"acc_norm\": 0.8981481481481481,\n\
\ \"acc_norm_stderr\": 0.029239272675632748\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.8650306748466258,\n \"acc_stderr\": 0.02684576505455385,\n\
\ \"acc_norm\": 0.8650306748466258,\n \"acc_norm_stderr\": 0.02684576505455385\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.6607142857142857,\n\
\ \"acc_stderr\": 0.044939490686135404,\n \"acc_norm\": 0.6607142857142857,\n\
\ \"acc_norm_stderr\": 0.044939490686135404\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.8737864077669902,\n \"acc_stderr\": 0.03288180278808628,\n\
\ \"acc_norm\": 0.8737864077669902,\n \"acc_norm_stderr\": 0.03288180278808628\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.9316239316239316,\n\
\ \"acc_stderr\": 0.016534627684311357,\n \"acc_norm\": 0.9316239316239316,\n\
\ \"acc_norm_stderr\": 0.016534627684311357\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.88,\n \"acc_stderr\": 0.03265986323710906,\n \
\ \"acc_norm\": 0.88,\n \"acc_norm_stderr\": 0.03265986323710906\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.9080459770114943,\n\
\ \"acc_stderr\": 0.010333225570778516,\n \"acc_norm\": 0.9080459770114943,\n\
\ \"acc_norm_stderr\": 0.010333225570778516\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.8236994219653179,\n \"acc_stderr\": 0.020516425672490717,\n\
\ \"acc_norm\": 0.8236994219653179,\n \"acc_norm_stderr\": 0.020516425672490717\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.7776536312849162,\n\
\ \"acc_stderr\": 0.013907189208156881,\n \"acc_norm\": 0.7776536312849162,\n\
\ \"acc_norm_stderr\": 0.013907189208156881\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.8496732026143791,\n \"acc_stderr\": 0.020464175124332618,\n\
\ \"acc_norm\": 0.8496732026143791,\n \"acc_norm_stderr\": 0.020464175124332618\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.8263665594855305,\n\
\ \"acc_stderr\": 0.021514051585970397,\n \"acc_norm\": 0.8263665594855305,\n\
\ \"acc_norm_stderr\": 0.021514051585970397\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.8765432098765432,\n \"acc_stderr\": 0.01830386880689179,\n\
\ \"acc_norm\": 0.8765432098765432,\n \"acc_norm_stderr\": 0.01830386880689179\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.6666666666666666,\n \"acc_stderr\": 0.02812163604063989,\n \
\ \"acc_norm\": 0.6666666666666666,\n \"acc_norm_stderr\": 0.02812163604063989\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.6088657105606258,\n\
\ \"acc_stderr\": 0.012463861839982058,\n \"acc_norm\": 0.6088657105606258,\n\
\ \"acc_norm_stderr\": 0.012463861839982058\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.8382352941176471,\n \"acc_stderr\": 0.022368672562886747,\n\
\ \"acc_norm\": 0.8382352941176471,\n \"acc_norm_stderr\": 0.022368672562886747\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.8284313725490197,\n \"acc_stderr\": 0.01525199316349162,\n \
\ \"acc_norm\": 0.8284313725490197,\n \"acc_norm_stderr\": 0.01525199316349162\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.7454545454545455,\n\
\ \"acc_stderr\": 0.041723430387053825,\n \"acc_norm\": 0.7454545454545455,\n\
\ \"acc_norm_stderr\": 0.041723430387053825\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.8653061224489796,\n \"acc_stderr\": 0.021855658840811615,\n\
\ \"acc_norm\": 0.8653061224489796,\n \"acc_norm_stderr\": 0.021855658840811615\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.9104477611940298,\n\
\ \"acc_stderr\": 0.020190670535027915,\n \"acc_norm\": 0.9104477611940298,\n\
\ \"acc_norm_stderr\": 0.020190670535027915\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.93,\n \"acc_stderr\": 0.0256432399976243,\n \
\ \"acc_norm\": 0.93,\n \"acc_norm_stderr\": 0.0256432399976243\n },\n\
\ \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5662650602409639,\n\
\ \"acc_stderr\": 0.03858158940685516,\n \"acc_norm\": 0.5662650602409639,\n\
\ \"acc_norm_stderr\": 0.03858158940685516\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8771929824561403,\n \"acc_stderr\": 0.02517298435015577,\n\
\ \"acc_norm\": 0.8771929824561403,\n \"acc_norm_stderr\": 0.02517298435015577\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.4283965728274174,\n\
\ \"mc1_stderr\": 0.017323088597314754,\n \"mc2\": 0.5792788550440546,\n\
\ \"mc2_stderr\": 0.015335521477635526\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.8271507498026835,\n \"acc_stderr\": 0.010626964529971859\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.6611068991660348,\n \
\ \"acc_stderr\": 0.01303795576856251\n }\n}\n```"
repo_url: https://huggingface.co/ParasiticRogue/Merged-RP-Stew-V2-34B
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_04_15T23_07_10.295080
path:
- '**/details_harness|arc:challenge|25_2024-04-15T23-07-10.295080.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-04-15T23-07-10.295080.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_04_15T23_07_10.295080
path:
- '**/details_harness|gsm8k|5_2024-04-15T23-07-10.295080.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-04-15T23-07-10.295080.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_04_15T23_07_10.295080
path:
- '**/details_harness|hellaswag|10_2024-04-15T23-07-10.295080.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-04-15T23-07-10.295080.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_04_15T23_07_10.295080
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-15T23-07-10.295080.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-04-15T23-07-10.295080.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-04-15T23-07-10.295080.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-04-15T23-07-10.295080.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-15T23-07-10.295080.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-04-15T23-07-10.295080.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-15T23-07-10.295080.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-15T23-07-10.295080.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-15T23-07-10.295080.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-04-15T23-07-10.295080.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-04-15T23-07-10.295080.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-04-15T23-07-10.295080.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-15T23-07-10.295080.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-04-15T23-07-10.295080.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-15T23-07-10.295080.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-15T23-07-10.295080.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-04-15T23-07-10.295080.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-04-15T23-07-10.295080.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-15T23-07-10.295080.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-15T23-07-10.295080.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-15T23-07-10.295080.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-15T23-07-10.295080.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-15T23-07-10.295080.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-15T23-07-10.295080.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-15T23-07-10.295080.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-15T23-07-10.295080.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-15T23-07-10.295080.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-15T23-07-10.295080.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-15T23-07-10.295080.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-15T23-07-10.295080.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-15T23-07-10.295080.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-15T23-07-10.295080.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-04-15T23-07-10.295080.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-15T23-07-10.295080.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-04-15T23-07-10.295080.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-15T23-07-10.295080.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-15T23-07-10.295080.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-04-15T23-07-10.295080.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-04-15T23-07-10.295080.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-04-15T23-07-10.295080.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-15T23-07-10.295080.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-15T23-07-10.295080.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-15T23-07-10.295080.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-15T23-07-10.295080.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-04-15T23-07-10.295080.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-04-15T23-07-10.295080.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-04-15T23-07-10.295080.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-15T23-07-10.295080.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-04-15T23-07-10.295080.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-15T23-07-10.295080.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-15T23-07-10.295080.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-04-15T23-07-10.295080.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-04-15T23-07-10.295080.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-04-15T23-07-10.295080.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-15T23-07-10.295080.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-04-15T23-07-10.295080.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-04-15T23-07-10.295080.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-15T23-07-10.295080.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-04-15T23-07-10.295080.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-04-15T23-07-10.295080.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-04-15T23-07-10.295080.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-15T23-07-10.295080.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-04-15T23-07-10.295080.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-15T23-07-10.295080.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-15T23-07-10.295080.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-15T23-07-10.295080.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-04-15T23-07-10.295080.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-04-15T23-07-10.295080.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-04-15T23-07-10.295080.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-15T23-07-10.295080.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-04-15T23-07-10.295080.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-15T23-07-10.295080.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-15T23-07-10.295080.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-04-15T23-07-10.295080.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-04-15T23-07-10.295080.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-15T23-07-10.295080.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-15T23-07-10.295080.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-15T23-07-10.295080.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-15T23-07-10.295080.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-15T23-07-10.295080.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-15T23-07-10.295080.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-15T23-07-10.295080.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-15T23-07-10.295080.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-15T23-07-10.295080.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-15T23-07-10.295080.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-15T23-07-10.295080.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-15T23-07-10.295080.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-15T23-07-10.295080.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-15T23-07-10.295080.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-04-15T23-07-10.295080.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-15T23-07-10.295080.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-04-15T23-07-10.295080.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-15T23-07-10.295080.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-15T23-07-10.295080.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-04-15T23-07-10.295080.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-04-15T23-07-10.295080.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-04-15T23-07-10.295080.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-15T23-07-10.295080.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-15T23-07-10.295080.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-15T23-07-10.295080.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-15T23-07-10.295080.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-04-15T23-07-10.295080.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-04-15T23-07-10.295080.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-04-15T23-07-10.295080.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-15T23-07-10.295080.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-04-15T23-07-10.295080.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-15T23-07-10.295080.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-15T23-07-10.295080.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-04-15T23-07-10.295080.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-04-15T23-07-10.295080.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-04-15T23-07-10.295080.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-15T23-07-10.295080.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-04-15T23-07-10.295080.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-04-15T23-07-10.295080.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_04_15T23_07_10.295080
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-15T23-07-10.295080.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-15T23-07-10.295080.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_04_15T23_07_10.295080
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-04-15T23-07-10.295080.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-04-15T23-07-10.295080.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_04_15T23_07_10.295080
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-04-15T23-07-10.295080.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-04-15T23-07-10.295080.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_04_15T23_07_10.295080
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-04-15T23-07-10.295080.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-04-15T23-07-10.295080.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_04_15T23_07_10.295080
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-15T23-07-10.295080.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-15T23-07-10.295080.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_04_15T23_07_10.295080
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-04-15T23-07-10.295080.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-04-15T23-07-10.295080.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_04_15T23_07_10.295080
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-15T23-07-10.295080.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-15T23-07-10.295080.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_04_15T23_07_10.295080
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-15T23-07-10.295080.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-15T23-07-10.295080.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_04_15T23_07_10.295080
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-15T23-07-10.295080.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-15T23-07-10.295080.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_04_15T23_07_10.295080
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-04-15T23-07-10.295080.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-04-15T23-07-10.295080.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_04_15T23_07_10.295080
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-04-15T23-07-10.295080.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-04-15T23-07-10.295080.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_04_15T23_07_10.295080
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-04-15T23-07-10.295080.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-04-15T23-07-10.295080.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_04_15T23_07_10.295080
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-15T23-07-10.295080.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-15T23-07-10.295080.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_04_15T23_07_10.295080
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-04-15T23-07-10.295080.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-04-15T23-07-10.295080.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_04_15T23_07_10.295080
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-15T23-07-10.295080.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-15T23-07-10.295080.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_04_15T23_07_10.295080
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-15T23-07-10.295080.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-15T23-07-10.295080.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_04_15T23_07_10.295080
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-04-15T23-07-10.295080.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-04-15T23-07-10.295080.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_04_15T23_07_10.295080
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-04-15T23-07-10.295080.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-04-15T23-07-10.295080.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_04_15T23_07_10.295080
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-15T23-07-10.295080.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-15T23-07-10.295080.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_04_15T23_07_10.295080
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-15T23-07-10.295080.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-15T23-07-10.295080.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_04_15T23_07_10.295080
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-15T23-07-10.295080.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-15T23-07-10.295080.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_04_15T23_07_10.295080
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-15T23-07-10.295080.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-15T23-07-10.295080.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_04_15T23_07_10.295080
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-15T23-07-10.295080.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-15T23-07-10.295080.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_04_15T23_07_10.295080
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-15T23-07-10.295080.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-15T23-07-10.295080.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_04_15T23_07_10.295080
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-15T23-07-10.295080.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-15T23-07-10.295080.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_04_15T23_07_10.295080
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-15T23-07-10.295080.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-15T23-07-10.295080.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_04_15T23_07_10.295080
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-15T23-07-10.295080.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-15T23-07-10.295080.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_04_15T23_07_10.295080
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-15T23-07-10.295080.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-15T23-07-10.295080.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_04_15T23_07_10.295080
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-15T23-07-10.295080.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-15T23-07-10.295080.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_04_15T23_07_10.295080
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-15T23-07-10.295080.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-15T23-07-10.295080.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_04_15T23_07_10.295080
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-15T23-07-10.295080.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-15T23-07-10.295080.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_04_15T23_07_10.295080
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-15T23-07-10.295080.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-15T23-07-10.295080.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_04_15T23_07_10.295080
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-04-15T23-07-10.295080.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-04-15T23-07-10.295080.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_04_15T23_07_10.295080
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-15T23-07-10.295080.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-15T23-07-10.295080.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_04_15T23_07_10.295080
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-04-15T23-07-10.295080.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-04-15T23-07-10.295080.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_04_15T23_07_10.295080
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-15T23-07-10.295080.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-15T23-07-10.295080.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_04_15T23_07_10.295080
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-15T23-07-10.295080.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-15T23-07-10.295080.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_04_15T23_07_10.295080
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-04-15T23-07-10.295080.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-04-15T23-07-10.295080.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_04_15T23_07_10.295080
path:
- '**/details_harness|hendrycksTest-management|5_2024-04-15T23-07-10.295080.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-04-15T23-07-10.295080.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_04_15T23_07_10.295080
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-04-15T23-07-10.295080.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-04-15T23-07-10.295080.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_04_15T23_07_10.295080
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-15T23-07-10.295080.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-15T23-07-10.295080.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_04_15T23_07_10.295080
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-15T23-07-10.295080.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-15T23-07-10.295080.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_04_15T23_07_10.295080
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-15T23-07-10.295080.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-15T23-07-10.295080.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_04_15T23_07_10.295080
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-15T23-07-10.295080.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-15T23-07-10.295080.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_04_15T23_07_10.295080
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-04-15T23-07-10.295080.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-04-15T23-07-10.295080.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_04_15T23_07_10.295080
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-04-15T23-07-10.295080.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-04-15T23-07-10.295080.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_04_15T23_07_10.295080
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-04-15T23-07-10.295080.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-04-15T23-07-10.295080.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_04_15T23_07_10.295080
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-15T23-07-10.295080.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-15T23-07-10.295080.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_04_15T23_07_10.295080
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-04-15T23-07-10.295080.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-04-15T23-07-10.295080.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_04_15T23_07_10.295080
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-15T23-07-10.295080.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-15T23-07-10.295080.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_04_15T23_07_10.295080
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-15T23-07-10.295080.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-15T23-07-10.295080.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_04_15T23_07_10.295080
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-04-15T23-07-10.295080.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-04-15T23-07-10.295080.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_04_15T23_07_10.295080
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-04-15T23-07-10.295080.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-04-15T23-07-10.295080.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_04_15T23_07_10.295080
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-04-15T23-07-10.295080.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-04-15T23-07-10.295080.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_04_15T23_07_10.295080
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-15T23-07-10.295080.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-15T23-07-10.295080.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_04_15T23_07_10.295080
path:
- '**/details_harness|hendrycksTest-virology|5_2024-04-15T23-07-10.295080.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-04-15T23-07-10.295080.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_04_15T23_07_10.295080
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-04-15T23-07-10.295080.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-04-15T23-07-10.295080.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_04_15T23_07_10.295080
path:
- '**/details_harness|truthfulqa:mc|0_2024-04-15T23-07-10.295080.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-04-15T23-07-10.295080.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_04_15T23_07_10.295080
path:
- '**/details_harness|winogrande|5_2024-04-15T23-07-10.295080.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-04-15T23-07-10.295080.parquet'
- config_name: results
data_files:
- split: 2024_04_15T23_07_10.295080
path:
- results_2024-04-15T23-07-10.295080.parquet
- split: latest
path:
- results_2024-04-15T23-07-10.295080.parquet
---
# Dataset Card for Evaluation run of ParasiticRogue/Merged-RP-Stew-V2-34B
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [ParasiticRogue/Merged-RP-Stew-V2-34B](https://huggingface.co/ParasiticRogue/Merged-RP-Stew-V2-34B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.

The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the latest results.

An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_ParasiticRogue__Merged-RP-Stew-V2-34B",
"harness_winogrande_5",
	split="latest")
```
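Each per-task configuration follows the same naming scheme as the entries listed above (`harness_hendrycksTest_<task>_<n_shot>`). As a small sketch, a helper like the following (a hypothetical convenience, not part of this dataset) can build the config name for any MMLU task:

```python
def harness_config(task: str, n_shot: int = 5) -> str:
    """Build a config name matching this repo's naming scheme,
    e.g. 'harness_hendrycksTest_abstract_algebra_5'."""
    return f"harness_hendrycksTest_{task}_{n_shot}"

# Example (requires network access):
# from datasets import load_dataset
# data = load_dataset(
#     "open-llm-leaderboard/details_ParasiticRogue__Merged-RP-Stew-V2-34B",
#     harness_config("abstract_algebra"),
#     split="latest",
# )
```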
## Latest results
These are the [latest results from run 2024-04-15T23:07:10.295080](https://huggingface.co/datasets/open-llm-leaderboard/details_ParasiticRogue__Merged-RP-Stew-V2-34B/blob/main/results_2024-04-15T23-07-10.295080.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.7711704868828757,
"acc_stderr": 0.0276721684770019,
"acc_norm": 0.7758442981381146,
"acc_norm_stderr": 0.028183094757783765,
"mc1": 0.4283965728274174,
"mc1_stderr": 0.017323088597314754,
"mc2": 0.5792788550440546,
"mc2_stderr": 0.015335521477635526
},
"harness|arc:challenge|25": {
"acc": 0.6390784982935154,
"acc_stderr": 0.014034761386175452,
"acc_norm": 0.6706484641638225,
"acc_norm_stderr": 0.013734057652635476
},
"harness|hellaswag|10": {
"acc": 0.6704839673371839,
"acc_stderr": 0.004690768393854473,
"acc_norm": 0.8605855407289384,
"acc_norm_stderr": 0.0034567060380547555
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.48,
"acc_stderr": 0.050211673156867795,
"acc_norm": 0.48,
"acc_norm_stderr": 0.050211673156867795
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.725925925925926,
"acc_stderr": 0.03853254836552003,
"acc_norm": 0.725925925925926,
"acc_norm_stderr": 0.03853254836552003
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.9078947368421053,
"acc_stderr": 0.02353268597044349,
"acc_norm": 0.9078947368421053,
"acc_norm_stderr": 0.02353268597044349
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.8,
"acc_stderr": 0.040201512610368445,
"acc_norm": 0.8,
"acc_norm_stderr": 0.040201512610368445
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.8113207547169812,
"acc_stderr": 0.024079995130062253,
"acc_norm": 0.8113207547169812,
"acc_norm_stderr": 0.024079995130062253
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.9166666666666666,
"acc_stderr": 0.023112508176051236,
"acc_norm": 0.9166666666666666,
"acc_norm_stderr": 0.023112508176051236
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.51,
"acc_stderr": 0.05024183937956912,
"acc_norm": 0.51,
"acc_norm_stderr": 0.05024183937956912
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.62,
"acc_stderr": 0.04878317312145633,
"acc_norm": 0.62,
"acc_norm_stderr": 0.04878317312145633
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.41,
"acc_stderr": 0.049431107042371025,
"acc_norm": 0.41,
"acc_norm_stderr": 0.049431107042371025
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.7398843930635838,
"acc_stderr": 0.03345036916788992,
"acc_norm": 0.7398843930635838,
"acc_norm_stderr": 0.03345036916788992
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.5490196078431373,
"acc_stderr": 0.04951218252396262,
"acc_norm": 0.5490196078431373,
"acc_norm_stderr": 0.04951218252396262
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.83,
"acc_stderr": 0.03775251680686371,
"acc_norm": 0.83,
"acc_norm_stderr": 0.03775251680686371
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.825531914893617,
"acc_stderr": 0.024809442335503976,
"acc_norm": 0.825531914893617,
"acc_norm_stderr": 0.024809442335503976
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.631578947368421,
"acc_stderr": 0.04537815354939391,
"acc_norm": 0.631578947368421,
"acc_norm_stderr": 0.04537815354939391
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.8,
"acc_stderr": 0.033333333333333284,
"acc_norm": 0.8,
"acc_norm_stderr": 0.033333333333333284
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.7248677248677249,
"acc_stderr": 0.02300008685906865,
"acc_norm": 0.7248677248677249,
"acc_norm_stderr": 0.02300008685906865
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.5476190476190477,
"acc_stderr": 0.044518079590553275,
"acc_norm": 0.5476190476190477,
"acc_norm_stderr": 0.044518079590553275
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.59,
"acc_stderr": 0.04943110704237102,
"acc_norm": 0.59,
"acc_norm_stderr": 0.04943110704237102
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.9193548387096774,
"acc_stderr": 0.015490002961591028,
"acc_norm": 0.9193548387096774,
"acc_norm_stderr": 0.015490002961591028
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.6699507389162561,
"acc_stderr": 0.03308530426228258,
"acc_norm": 0.6699507389162561,
"acc_norm_stderr": 0.03308530426228258
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.82,
"acc_stderr": 0.038612291966536955,
"acc_norm": 0.82,
"acc_norm_stderr": 0.038612291966536955
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.8787878787878788,
"acc_stderr": 0.025485498373343237,
"acc_norm": 0.8787878787878788,
"acc_norm_stderr": 0.025485498373343237
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.9242424242424242,
"acc_stderr": 0.018852670234993086,
"acc_norm": 0.9242424242424242,
"acc_norm_stderr": 0.018852670234993086
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.9792746113989638,
"acc_stderr": 0.010281417011909039,
"acc_norm": 0.9792746113989638,
"acc_norm_stderr": 0.010281417011909039
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.8307692307692308,
"acc_stderr": 0.01901100452365105,
"acc_norm": 0.8307692307692308,
"acc_norm_stderr": 0.01901100452365105
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.42592592592592593,
"acc_stderr": 0.03014913560136595,
"acc_norm": 0.42592592592592593,
"acc_norm_stderr": 0.03014913560136595
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.8571428571428571,
"acc_stderr": 0.02273020811930654,
"acc_norm": 0.8571428571428571,
"acc_norm_stderr": 0.02273020811930654
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.4966887417218543,
"acc_stderr": 0.04082393379449654,
"acc_norm": 0.4966887417218543,
"acc_norm_stderr": 0.04082393379449654
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.9211009174311927,
"acc_stderr": 0.011558198113769572,
"acc_norm": 0.9211009174311927,
"acc_norm_stderr": 0.011558198113769572
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.6666666666666666,
"acc_stderr": 0.03214952147802749,
"acc_norm": 0.6666666666666666,
"acc_norm_stderr": 0.03214952147802749
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.9215686274509803,
"acc_stderr": 0.018869514646658928,
"acc_norm": 0.9215686274509803,
"acc_norm_stderr": 0.018869514646658928
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.8987341772151899,
"acc_stderr": 0.019637720526065508,
"acc_norm": 0.8987341772151899,
"acc_norm_stderr": 0.019637720526065508
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.8026905829596412,
"acc_stderr": 0.02670985334496796,
"acc_norm": 0.8026905829596412,
"acc_norm_stderr": 0.02670985334496796
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.8931297709923665,
"acc_stderr": 0.027096548624883733,
"acc_norm": 0.8931297709923665,
"acc_norm_stderr": 0.027096548624883733
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.9090909090909091,
"acc_stderr": 0.02624319405407388,
"acc_norm": 0.9090909090909091,
"acc_norm_stderr": 0.02624319405407388
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.8981481481481481,
"acc_stderr": 0.029239272675632748,
"acc_norm": 0.8981481481481481,
"acc_norm_stderr": 0.029239272675632748
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.8650306748466258,
"acc_stderr": 0.02684576505455385,
"acc_norm": 0.8650306748466258,
"acc_norm_stderr": 0.02684576505455385
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.6607142857142857,
"acc_stderr": 0.044939490686135404,
"acc_norm": 0.6607142857142857,
"acc_norm_stderr": 0.044939490686135404
},
"harness|hendrycksTest-management|5": {
"acc": 0.8737864077669902,
"acc_stderr": 0.03288180278808628,
"acc_norm": 0.8737864077669902,
"acc_norm_stderr": 0.03288180278808628
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.9316239316239316,
"acc_stderr": 0.016534627684311357,
"acc_norm": 0.9316239316239316,
"acc_norm_stderr": 0.016534627684311357
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.88,
"acc_stderr": 0.03265986323710906,
"acc_norm": 0.88,
"acc_norm_stderr": 0.03265986323710906
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.9080459770114943,
"acc_stderr": 0.010333225570778516,
"acc_norm": 0.9080459770114943,
"acc_norm_stderr": 0.010333225570778516
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.8236994219653179,
"acc_stderr": 0.020516425672490717,
"acc_norm": 0.8236994219653179,
"acc_norm_stderr": 0.020516425672490717
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.7776536312849162,
"acc_stderr": 0.013907189208156881,
"acc_norm": 0.7776536312849162,
"acc_norm_stderr": 0.013907189208156881
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.8496732026143791,
"acc_stderr": 0.020464175124332618,
"acc_norm": 0.8496732026143791,
"acc_norm_stderr": 0.020464175124332618
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.8263665594855305,
"acc_stderr": 0.021514051585970397,
"acc_norm": 0.8263665594855305,
"acc_norm_stderr": 0.021514051585970397
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.8765432098765432,
"acc_stderr": 0.01830386880689179,
"acc_norm": 0.8765432098765432,
"acc_norm_stderr": 0.01830386880689179
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.6666666666666666,
"acc_stderr": 0.02812163604063989,
"acc_norm": 0.6666666666666666,
"acc_norm_stderr": 0.02812163604063989
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.6088657105606258,
"acc_stderr": 0.012463861839982058,
"acc_norm": 0.6088657105606258,
"acc_norm_stderr": 0.012463861839982058
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.8382352941176471,
"acc_stderr": 0.022368672562886747,
"acc_norm": 0.8382352941176471,
"acc_norm_stderr": 0.022368672562886747
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.8284313725490197,
"acc_stderr": 0.01525199316349162,
"acc_norm": 0.8284313725490197,
"acc_norm_stderr": 0.01525199316349162
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.7454545454545455,
"acc_stderr": 0.041723430387053825,
"acc_norm": 0.7454545454545455,
"acc_norm_stderr": 0.041723430387053825
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.8653061224489796,
"acc_stderr": 0.021855658840811615,
"acc_norm": 0.8653061224489796,
"acc_norm_stderr": 0.021855658840811615
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.9104477611940298,
"acc_stderr": 0.020190670535027915,
"acc_norm": 0.9104477611940298,
"acc_norm_stderr": 0.020190670535027915
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.93,
"acc_stderr": 0.0256432399976243,
"acc_norm": 0.93,
"acc_norm_stderr": 0.0256432399976243
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5662650602409639,
"acc_stderr": 0.03858158940685516,
"acc_norm": 0.5662650602409639,
"acc_norm_stderr": 0.03858158940685516
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8771929824561403,
"acc_stderr": 0.02517298435015577,
"acc_norm": 0.8771929824561403,
"acc_norm_stderr": 0.02517298435015577
},
"harness|truthfulqa:mc|0": {
"mc1": 0.4283965728274174,
"mc1_stderr": 0.017323088597314754,
"mc2": 0.5792788550440546,
"mc2_stderr": 0.015335521477635526
},
"harness|winogrande|5": {
"acc": 0.8271507498026835,
"acc_stderr": 0.010626964529971859
},
"harness|gsm8k|5": {
"acc": 0.6611068991660348,
"acc_stderr": 0.01303795576856251
}
}
```
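The per-task `acc` values above are combined into a single MMLU-style score by macro-averaging, i.e. an unweighted mean over the subtask accuracies. A minimal sketch using a handful of the values copied from the JSON above:

```python
# Macro-average accuracy over a few hendrycksTest subtasks
# (acc values copied from the results JSON above; the aggregate
# score averages all 57 subtasks the same way).
results = {
    "harness|hendrycksTest-computer_security|5": 0.83,
    "harness|hendrycksTest-econometrics|5": 0.631578947368421,
    "harness|hendrycksTest-electrical_engineering|5": 0.8,
    "harness|hendrycksTest-formal_logic|5": 0.5476190476190477,
}
macro_avg = sum(results.values()) / len(results)
print(f"macro-averaged acc: {macro_avg:.4f}")
```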
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
YidaM4396/Test2 | ---
license: mit
---
|
saibo/bookcorpus_small_compact_1024_meta | ---
dataset_info:
features:
- name: text
dtype: string
- name: concept_with_offset
dtype: string
- name: cid_arrangement
sequence: int32
- name: schema_lengths
sequence: int64
- name: topic_entity_mask
sequence: int64
- name: text_lengths
sequence: int64
splits:
- name: train
num_bytes: 192026469
num_examples: 1571
download_size: 0
dataset_size: 192026469
---
# Dataset Card for "bookcorpus_small_compact_1024_meta"
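The `dataset_info` header above implies a fairly heavy per-example footprint (the compact-1024 examples are long concatenated chunks); a quick back-of-the-envelope check, with the numbers taken from the metadata:

```python
# Average on-disk size per example, computed from the
# dataset_info header above (train split).
dataset_size = 192_026_469  # num_bytes
num_examples = 1_571        # num_examples
avg_bytes = dataset_size / num_examples
print(f"~{avg_bytes / 1024:.0f} KiB per example")
```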
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
open-llm-leaderboard/details_gmonsoon__Qwenchana-0.5B-restart | ---
pretty_name: Evaluation run of gmonsoon/Qwenchana-0.5B-restart
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [gmonsoon/Qwenchana-0.5B-restart](https://huggingface.co/gmonsoon/Qwenchana-0.5B-restart)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 2 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_gmonsoon__Qwenchana-0.5B-restart\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-03-03T08:24:22.530704](https://huggingface.co/datasets/open-llm-leaderboard/details_gmonsoon__Qwenchana-0.5B-restart/blob/main/results_2024-03-03T08-24-22.530704.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each one in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.25895229357807475,\n\
\ \"acc_stderr\": 0.03102625874189923,\n \"acc_norm\": 0.2602863804038217,\n\
\ \"acc_norm_stderr\": 0.03178781024016605,\n \"mc1\": 0.24969400244798043,\n\
\ \"mc1_stderr\": 0.015152286907148128,\n \"mc2\": 0.404780510761619,\n\
\ \"mc2_stderr\": 0.014503353767789265\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.2627986348122867,\n \"acc_stderr\": 0.012862523175351333,\n\
\ \"acc_norm\": 0.3003412969283277,\n \"acc_norm_stderr\": 0.01339590930995701\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.3679545907189803,\n\
\ \"acc_stderr\": 0.004812633280078256,\n \"acc_norm\": 0.45947022505477,\n\
\ \"acc_norm_stderr\": 0.004973361339169648\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.36,\n \"acc_stderr\": 0.04824181513244218,\n \
\ \"acc_norm\": 0.36,\n \"acc_norm_stderr\": 0.04824181513244218\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.34074074074074073,\n\
\ \"acc_stderr\": 0.04094376269996794,\n \"acc_norm\": 0.34074074074074073,\n\
\ \"acc_norm_stderr\": 0.04094376269996794\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.3092105263157895,\n \"acc_stderr\": 0.03761070869867479,\n\
\ \"acc_norm\": 0.3092105263157895,\n \"acc_norm_stderr\": 0.03761070869867479\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.29,\n\
\ \"acc_stderr\": 0.04560480215720684,\n \"acc_norm\": 0.29,\n \
\ \"acc_norm_stderr\": 0.04560480215720684\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.2641509433962264,\n \"acc_stderr\": 0.02713429162874171,\n\
\ \"acc_norm\": 0.2641509433962264,\n \"acc_norm_stderr\": 0.02713429162874171\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.2986111111111111,\n\
\ \"acc_stderr\": 0.03827052357950756,\n \"acc_norm\": 0.2986111111111111,\n\
\ \"acc_norm_stderr\": 0.03827052357950756\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.21,\n \"acc_stderr\": 0.04093601807403326,\n \
\ \"acc_norm\": 0.21,\n \"acc_norm_stderr\": 0.04093601807403326\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.3,\n\
\ \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.24,\n \"acc_stderr\": 0.04292346959909283,\n \
\ \"acc_norm\": 0.24,\n \"acc_norm_stderr\": 0.04292346959909283\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.2138728323699422,\n\
\ \"acc_stderr\": 0.031265112061730424,\n \"acc_norm\": 0.2138728323699422,\n\
\ \"acc_norm_stderr\": 0.031265112061730424\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.22549019607843138,\n \"acc_stderr\": 0.041583075330832865,\n\
\ \"acc_norm\": 0.22549019607843138,\n \"acc_norm_stderr\": 0.041583075330832865\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.3,\n\
\ \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.25957446808510637,\n \"acc_stderr\": 0.028659179374292323,\n\
\ \"acc_norm\": 0.25957446808510637,\n \"acc_norm_stderr\": 0.028659179374292323\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.22807017543859648,\n\
\ \"acc_stderr\": 0.03947152782669415,\n \"acc_norm\": 0.22807017543859648,\n\
\ \"acc_norm_stderr\": 0.03947152782669415\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.2896551724137931,\n \"acc_stderr\": 0.03780019230438014,\n\
\ \"acc_norm\": 0.2896551724137931,\n \"acc_norm_stderr\": 0.03780019230438014\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.26455026455026454,\n \"acc_stderr\": 0.022717467897708617,\n \"\
acc_norm\": 0.26455026455026454,\n \"acc_norm_stderr\": 0.022717467897708617\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.15079365079365079,\n\
\ \"acc_stderr\": 0.03200686497287392,\n \"acc_norm\": 0.15079365079365079,\n\
\ \"acc_norm_stderr\": 0.03200686497287392\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.27,\n \"acc_stderr\": 0.044619604333847394,\n \
\ \"acc_norm\": 0.27,\n \"acc_norm_stderr\": 0.044619604333847394\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\"\
: 0.18064516129032257,\n \"acc_stderr\": 0.021886178567172534,\n \"\
acc_norm\": 0.18064516129032257,\n \"acc_norm_stderr\": 0.021886178567172534\n\
\ },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\"\
: 0.2512315270935961,\n \"acc_stderr\": 0.030516530732694433,\n \"\
acc_norm\": 0.2512315270935961,\n \"acc_norm_stderr\": 0.030516530732694433\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.24,\n \"acc_stderr\": 0.04292346959909282,\n \"acc_norm\"\
: 0.24,\n \"acc_norm_stderr\": 0.04292346959909282\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.21212121212121213,\n \"acc_stderr\": 0.031922715695483,\n\
\ \"acc_norm\": 0.21212121212121213,\n \"acc_norm_stderr\": 0.031922715695483\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.18181818181818182,\n \"acc_stderr\": 0.027479603010538797,\n \"\
acc_norm\": 0.18181818181818182,\n \"acc_norm_stderr\": 0.027479603010538797\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.19689119170984457,\n \"acc_stderr\": 0.028697873971860664,\n\
\ \"acc_norm\": 0.19689119170984457,\n \"acc_norm_stderr\": 0.028697873971860664\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.21794871794871795,\n \"acc_stderr\": 0.020932445774463182,\n\
\ \"acc_norm\": 0.21794871794871795,\n \"acc_norm_stderr\": 0.020932445774463182\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.28888888888888886,\n \"acc_stderr\": 0.027634907264178544,\n \
\ \"acc_norm\": 0.28888888888888886,\n \"acc_norm_stderr\": 0.027634907264178544\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.21008403361344538,\n \"acc_stderr\": 0.026461398717471874,\n\
\ \"acc_norm\": 0.21008403361344538,\n \"acc_norm_stderr\": 0.026461398717471874\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.24503311258278146,\n \"acc_stderr\": 0.035118075718047245,\n \"\
acc_norm\": 0.24503311258278146,\n \"acc_norm_stderr\": 0.035118075718047245\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.1871559633027523,\n \"acc_stderr\": 0.01672268452620016,\n \"\
acc_norm\": 0.1871559633027523,\n \"acc_norm_stderr\": 0.01672268452620016\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.24074074074074073,\n \"acc_stderr\": 0.029157522184605593,\n \"\
acc_norm\": 0.24074074074074073,\n \"acc_norm_stderr\": 0.029157522184605593\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.2647058823529412,\n \"acc_stderr\": 0.030964517926923393,\n \"\
acc_norm\": 0.2647058823529412,\n \"acc_norm_stderr\": 0.030964517926923393\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.3037974683544304,\n \"acc_stderr\": 0.029936696387138608,\n \
\ \"acc_norm\": 0.3037974683544304,\n \"acc_norm_stderr\": 0.029936696387138608\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.31390134529147984,\n\
\ \"acc_stderr\": 0.03114679648297246,\n \"acc_norm\": 0.31390134529147984,\n\
\ \"acc_norm_stderr\": 0.03114679648297246\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.24427480916030533,\n \"acc_stderr\": 0.037683359597287434,\n\
\ \"acc_norm\": 0.24427480916030533,\n \"acc_norm_stderr\": 0.037683359597287434\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.371900826446281,\n \"acc_stderr\": 0.044120158066245044,\n \"\
acc_norm\": 0.371900826446281,\n \"acc_norm_stderr\": 0.044120158066245044\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.25,\n\
\ \"acc_stderr\": 0.04186091791394607,\n \"acc_norm\": 0.25,\n \
\ \"acc_norm_stderr\": 0.04186091791394607\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.2822085889570552,\n \"acc_stderr\": 0.03536117886664743,\n\
\ \"acc_norm\": 0.2822085889570552,\n \"acc_norm_stderr\": 0.03536117886664743\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.25892857142857145,\n\
\ \"acc_stderr\": 0.041577515398656284,\n \"acc_norm\": 0.25892857142857145,\n\
\ \"acc_norm_stderr\": 0.041577515398656284\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.1941747572815534,\n \"acc_stderr\": 0.03916667762822585,\n\
\ \"acc_norm\": 0.1941747572815534,\n \"acc_norm_stderr\": 0.03916667762822585\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.3076923076923077,\n\
\ \"acc_stderr\": 0.030236389942173095,\n \"acc_norm\": 0.3076923076923077,\n\
\ \"acc_norm_stderr\": 0.030236389942173095\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.29,\n \"acc_stderr\": 0.045604802157206845,\n \
\ \"acc_norm\": 0.29,\n \"acc_norm_stderr\": 0.045604802157206845\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.2784163473818646,\n\
\ \"acc_stderr\": 0.01602829518899247,\n \"acc_norm\": 0.2784163473818646,\n\
\ \"acc_norm_stderr\": 0.01602829518899247\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.2658959537572254,\n \"acc_stderr\": 0.023786203255508283,\n\
\ \"acc_norm\": 0.2658959537572254,\n \"acc_norm_stderr\": 0.023786203255508283\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.23798882681564246,\n\
\ \"acc_stderr\": 0.014242630070574885,\n \"acc_norm\": 0.23798882681564246,\n\
\ \"acc_norm_stderr\": 0.014242630070574885\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.23202614379084968,\n \"acc_stderr\": 0.024170840879341005,\n\
\ \"acc_norm\": 0.23202614379084968,\n \"acc_norm_stderr\": 0.024170840879341005\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.3054662379421222,\n\
\ \"acc_stderr\": 0.026160584450140485,\n \"acc_norm\": 0.3054662379421222,\n\
\ \"acc_norm_stderr\": 0.026160584450140485\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.22839506172839505,\n \"acc_stderr\": 0.023358211840626267,\n\
\ \"acc_norm\": 0.22839506172839505,\n \"acc_norm_stderr\": 0.023358211840626267\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.2198581560283688,\n \"acc_stderr\": 0.024706141070705474,\n \
\ \"acc_norm\": 0.2198581560283688,\n \"acc_norm_stderr\": 0.024706141070705474\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.26401564537157757,\n\
\ \"acc_stderr\": 0.011258435537723845,\n \"acc_norm\": 0.26401564537157757,\n\
\ \"acc_norm_stderr\": 0.011258435537723845\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.17647058823529413,\n \"acc_stderr\": 0.02315746830855934,\n\
\ \"acc_norm\": 0.17647058823529413,\n \"acc_norm_stderr\": 0.02315746830855934\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.2777777777777778,\n \"acc_stderr\": 0.018120224251484587,\n \
\ \"acc_norm\": 0.2777777777777778,\n \"acc_norm_stderr\": 0.018120224251484587\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.2636363636363636,\n\
\ \"acc_stderr\": 0.04220224692971987,\n \"acc_norm\": 0.2636363636363636,\n\
\ \"acc_norm_stderr\": 0.04220224692971987\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.20408163265306123,\n \"acc_stderr\": 0.025801283475090506,\n\
\ \"acc_norm\": 0.20408163265306123,\n \"acc_norm_stderr\": 0.025801283475090506\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.27860696517412936,\n\
\ \"acc_stderr\": 0.031700561834973086,\n \"acc_norm\": 0.27860696517412936,\n\
\ \"acc_norm_stderr\": 0.031700561834973086\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.27,\n \"acc_stderr\": 0.044619604333847394,\n \
\ \"acc_norm\": 0.27,\n \"acc_norm_stderr\": 0.044619604333847394\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.25903614457831325,\n\
\ \"acc_stderr\": 0.034106466140718564,\n \"acc_norm\": 0.25903614457831325,\n\
\ \"acc_norm_stderr\": 0.034106466140718564\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.28654970760233917,\n \"acc_stderr\": 0.03467826685703826,\n\
\ \"acc_norm\": 0.28654970760233917,\n \"acc_norm_stderr\": 0.03467826685703826\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.24969400244798043,\n\
\ \"mc1_stderr\": 0.015152286907148128,\n \"mc2\": 0.404780510761619,\n\
\ \"mc2_stderr\": 0.014503353767789265\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.5485398579321231,\n \"acc_stderr\": 0.013986110301017759\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.019711902956785442,\n \
\ \"acc_stderr\": 0.0038289829787356905\n }\n}\n```"
repo_url: https://huggingface.co/gmonsoon/Qwenchana-0.5B-restart
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_03_03T08_17_31.289579
path:
- '**/details_harness|arc:challenge|25_2024-03-03T08-17-31.289579.parquet'
- split: 2024_03_03T08_24_22.530704
path:
- '**/details_harness|arc:challenge|25_2024-03-03T08-24-22.530704.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-03-03T08-24-22.530704.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_03_03T08_17_31.289579
path:
- '**/details_harness|gsm8k|5_2024-03-03T08-17-31.289579.parquet'
- split: 2024_03_03T08_24_22.530704
path:
- '**/details_harness|gsm8k|5_2024-03-03T08-24-22.530704.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-03-03T08-24-22.530704.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_03_03T08_17_31.289579
path:
- '**/details_harness|hellaswag|10_2024-03-03T08-17-31.289579.parquet'
- split: 2024_03_03T08_24_22.530704
path:
- '**/details_harness|hellaswag|10_2024-03-03T08-24-22.530704.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-03-03T08-24-22.530704.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_03_03T08_17_31.289579
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-03T08-17-31.289579.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-03T08-17-31.289579.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-03T08-17-31.289579.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-03T08-17-31.289579.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-03T08-17-31.289579.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-03T08-17-31.289579.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-03T08-17-31.289579.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-03T08-17-31.289579.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-03T08-17-31.289579.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-03T08-17-31.289579.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-03T08-17-31.289579.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-03T08-17-31.289579.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-03T08-17-31.289579.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-03T08-17-31.289579.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-03T08-17-31.289579.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-03T08-17-31.289579.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-03T08-17-31.289579.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-03T08-17-31.289579.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-03T08-17-31.289579.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-03T08-17-31.289579.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-03T08-17-31.289579.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-03T08-17-31.289579.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-03T08-17-31.289579.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-03T08-17-31.289579.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-03T08-17-31.289579.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-03T08-17-31.289579.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-03T08-17-31.289579.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-03T08-17-31.289579.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-03T08-17-31.289579.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-03T08-17-31.289579.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-03T08-17-31.289579.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-03T08-17-31.289579.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-03T08-17-31.289579.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-03T08-17-31.289579.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-03T08-17-31.289579.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-03T08-17-31.289579.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-03T08-17-31.289579.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-03T08-17-31.289579.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-03T08-17-31.289579.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-03T08-17-31.289579.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-03T08-17-31.289579.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-03T08-17-31.289579.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-03T08-17-31.289579.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-03T08-17-31.289579.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-03T08-17-31.289579.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-03T08-17-31.289579.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-03T08-17-31.289579.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-03T08-17-31.289579.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-03T08-17-31.289579.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-03T08-17-31.289579.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-03T08-17-31.289579.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-03T08-17-31.289579.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-03T08-17-31.289579.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-03T08-17-31.289579.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-03T08-17-31.289579.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-03T08-17-31.289579.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-03T08-17-31.289579.parquet'
- split: 2024_03_03T08_24_22.530704
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-03T08-24-22.530704.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-03T08-24-22.530704.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-03T08-24-22.530704.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-03T08-24-22.530704.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-03T08-24-22.530704.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-03T08-24-22.530704.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-03T08-24-22.530704.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-03T08-24-22.530704.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-03T08-24-22.530704.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-03T08-24-22.530704.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-03T08-24-22.530704.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-03T08-24-22.530704.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-03T08-24-22.530704.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-03T08-24-22.530704.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-03T08-24-22.530704.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-03T08-24-22.530704.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-03T08-24-22.530704.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-03T08-24-22.530704.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-03T08-24-22.530704.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-03T08-24-22.530704.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-03T08-24-22.530704.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-03T08-24-22.530704.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-03T08-24-22.530704.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-03T08-24-22.530704.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-03T08-24-22.530704.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-03T08-24-22.530704.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-03T08-24-22.530704.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-03T08-24-22.530704.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-03T08-24-22.530704.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-03T08-24-22.530704.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-03T08-24-22.530704.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-03T08-24-22.530704.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-03T08-24-22.530704.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-03T08-24-22.530704.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-03T08-24-22.530704.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-03T08-24-22.530704.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-03T08-24-22.530704.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-03T08-24-22.530704.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-03T08-24-22.530704.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-03T08-24-22.530704.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-03T08-24-22.530704.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-03T08-24-22.530704.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-03T08-24-22.530704.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-03T08-24-22.530704.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-03T08-24-22.530704.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-03T08-24-22.530704.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-03T08-24-22.530704.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-03T08-24-22.530704.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-03T08-24-22.530704.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-03T08-24-22.530704.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-03T08-24-22.530704.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-03T08-24-22.530704.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-03T08-24-22.530704.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-03T08-24-22.530704.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-03T08-24-22.530704.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-03T08-24-22.530704.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-03T08-24-22.530704.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-03T08-24-22.530704.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-03T08-24-22.530704.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-03T08-24-22.530704.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-03T08-24-22.530704.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-03T08-24-22.530704.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-03T08-24-22.530704.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-03T08-24-22.530704.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-03T08-24-22.530704.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-03T08-24-22.530704.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-03T08-24-22.530704.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-03T08-24-22.530704.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-03T08-24-22.530704.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-03T08-24-22.530704.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-03T08-24-22.530704.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-03T08-24-22.530704.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-03T08-24-22.530704.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-03T08-24-22.530704.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-03T08-24-22.530704.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-03T08-24-22.530704.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-03T08-24-22.530704.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-03T08-24-22.530704.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-03T08-24-22.530704.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-03T08-24-22.530704.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-03T08-24-22.530704.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-03T08-24-22.530704.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-03T08-24-22.530704.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-03T08-24-22.530704.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-03T08-24-22.530704.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-03T08-24-22.530704.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-03T08-24-22.530704.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-03T08-24-22.530704.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-03T08-24-22.530704.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-03T08-24-22.530704.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-03T08-24-22.530704.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-03T08-24-22.530704.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-03T08-24-22.530704.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-03T08-24-22.530704.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-03T08-24-22.530704.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-03T08-24-22.530704.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-03T08-24-22.530704.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-03T08-24-22.530704.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-03T08-24-22.530704.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-03T08-24-22.530704.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-03T08-24-22.530704.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-03T08-24-22.530704.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-03T08-24-22.530704.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-03T08-24-22.530704.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-03T08-24-22.530704.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-03T08-24-22.530704.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-03T08-24-22.530704.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-03T08-24-22.530704.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-03T08-24-22.530704.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-03T08-24-22.530704.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-03T08-24-22.530704.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-03T08-24-22.530704.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-03T08-24-22.530704.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-03T08-24-22.530704.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_03_03T08_17_31.289579
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-03T08-17-31.289579.parquet'
- split: 2024_03_03T08_24_22.530704
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-03T08-24-22.530704.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-03T08-24-22.530704.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_03_03T08_17_31.289579
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-03T08-17-31.289579.parquet'
- split: 2024_03_03T08_24_22.530704
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-03T08-24-22.530704.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-03T08-24-22.530704.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_03_03T08_17_31.289579
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-03T08-17-31.289579.parquet'
- split: 2024_03_03T08_24_22.530704
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-03T08-24-22.530704.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-03T08-24-22.530704.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_03_03T08_17_31.289579
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-03T08-17-31.289579.parquet'
- split: 2024_03_03T08_24_22.530704
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-03T08-24-22.530704.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-03T08-24-22.530704.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_03_03T08_17_31.289579
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-03T08-17-31.289579.parquet'
- split: 2024_03_03T08_24_22.530704
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-03T08-24-22.530704.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-03T08-24-22.530704.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_03_03T08_17_31.289579
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-03T08-17-31.289579.parquet'
- split: 2024_03_03T08_24_22.530704
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-03T08-24-22.530704.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-03T08-24-22.530704.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_03_03T08_17_31.289579
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-03T08-17-31.289579.parquet'
- split: 2024_03_03T08_24_22.530704
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-03T08-24-22.530704.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-03T08-24-22.530704.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_03_03T08_17_31.289579
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-03T08-17-31.289579.parquet'
- split: 2024_03_03T08_24_22.530704
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-03T08-24-22.530704.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-03T08-24-22.530704.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_03_03T08_17_31.289579
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-03T08-17-31.289579.parquet'
- split: 2024_03_03T08_24_22.530704
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-03T08-24-22.530704.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-03T08-24-22.530704.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_03_03T08_17_31.289579
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-03T08-17-31.289579.parquet'
- split: 2024_03_03T08_24_22.530704
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-03T08-24-22.530704.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-03T08-24-22.530704.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_03_03T08_17_31.289579
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-03T08-17-31.289579.parquet'
- split: 2024_03_03T08_24_22.530704
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-03T08-24-22.530704.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-03T08-24-22.530704.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_03_03T08_17_31.289579
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-03T08-17-31.289579.parquet'
- split: 2024_03_03T08_24_22.530704
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-03T08-24-22.530704.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-03T08-24-22.530704.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_03_03T08_17_31.289579
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-03T08-17-31.289579.parquet'
- split: 2024_03_03T08_24_22.530704
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-03T08-24-22.530704.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-03T08-24-22.530704.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_03_03T08_17_31.289579
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-03T08-17-31.289579.parquet'
- split: 2024_03_03T08_24_22.530704
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-03T08-24-22.530704.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-03T08-24-22.530704.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_03_03T08_17_31.289579
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-03T08-17-31.289579.parquet'
- split: 2024_03_03T08_24_22.530704
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-03T08-24-22.530704.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-03T08-24-22.530704.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_03_03T08_17_31.289579
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-03T08-17-31.289579.parquet'
- split: 2024_03_03T08_24_22.530704
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-03T08-24-22.530704.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-03T08-24-22.530704.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_03_03T08_17_31.289579
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-03T08-17-31.289579.parquet'
- split: 2024_03_03T08_24_22.530704
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-03T08-24-22.530704.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-03T08-24-22.530704.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_03_03T08_17_31.289579
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-03T08-17-31.289579.parquet'
- split: 2024_03_03T08_24_22.530704
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-03T08-24-22.530704.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-03T08-24-22.530704.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_03_03T08_17_31.289579
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-03T08-17-31.289579.parquet'
- split: 2024_03_03T08_24_22.530704
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-03T08-24-22.530704.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-03T08-24-22.530704.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_03_03T08_17_31.289579
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-03T08-17-31.289579.parquet'
- split: 2024_03_03T08_24_22.530704
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-03T08-24-22.530704.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-03T08-24-22.530704.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_03_03T08_17_31.289579
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-03T08-17-31.289579.parquet'
- split: 2024_03_03T08_24_22.530704
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-03T08-24-22.530704.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-03T08-24-22.530704.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_03_03T08_17_31.289579
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-03T08-17-31.289579.parquet'
- split: 2024_03_03T08_24_22.530704
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-03T08-24-22.530704.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-03T08-24-22.530704.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_03_03T08_17_31.289579
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-03T08-17-31.289579.parquet'
- split: 2024_03_03T08_24_22.530704
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-03T08-24-22.530704.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-03T08-24-22.530704.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_03_03T08_17_31.289579
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-03T08-17-31.289579.parquet'
- split: 2024_03_03T08_24_22.530704
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-03T08-24-22.530704.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-03T08-24-22.530704.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_03_03T08_17_31.289579
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-03T08-17-31.289579.parquet'
- split: 2024_03_03T08_24_22.530704
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-03T08-24-22.530704.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-03T08-24-22.530704.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_03_03T08_17_31.289579
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-03T08-17-31.289579.parquet'
- split: 2024_03_03T08_24_22.530704
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-03T08-24-22.530704.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-03T08-24-22.530704.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_03_03T08_17_31.289579
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-03T08-17-31.289579.parquet'
- split: 2024_03_03T08_24_22.530704
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-03T08-24-22.530704.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-03T08-24-22.530704.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_03_03T08_17_31.289579
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-03T08-17-31.289579.parquet'
- split: 2024_03_03T08_24_22.530704
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-03T08-24-22.530704.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-03T08-24-22.530704.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_03_03T08_17_31.289579
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-03T08-17-31.289579.parquet'
- split: 2024_03_03T08_24_22.530704
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-03T08-24-22.530704.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-03T08-24-22.530704.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_03_03T08_17_31.289579
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-03T08-17-31.289579.parquet'
- split: 2024_03_03T08_24_22.530704
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-03T08-24-22.530704.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-03T08-24-22.530704.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_03_03T08_17_31.289579
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-03T08-17-31.289579.parquet'
- split: 2024_03_03T08_24_22.530704
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-03T08-24-22.530704.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-03T08-24-22.530704.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_03_03T08_17_31.289579
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-03T08-17-31.289579.parquet'
- split: 2024_03_03T08_24_22.530704
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-03T08-24-22.530704.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-03T08-24-22.530704.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_03_03T08_17_31.289579
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-03T08-17-31.289579.parquet'
- split: 2024_03_03T08_24_22.530704
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-03T08-24-22.530704.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-03T08-24-22.530704.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_03_03T08_17_31.289579
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-03T08-17-31.289579.parquet'
- split: 2024_03_03T08_24_22.530704
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-03T08-24-22.530704.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-03T08-24-22.530704.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_03_03T08_17_31.289579
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-03T08-17-31.289579.parquet'
- split: 2024_03_03T08_24_22.530704
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-03T08-24-22.530704.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-03T08-24-22.530704.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_03_03T08_17_31.289579
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-03T08-17-31.289579.parquet'
- split: 2024_03_03T08_24_22.530704
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-03T08-24-22.530704.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-03T08-24-22.530704.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_03_03T08_17_31.289579
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-03T08-17-31.289579.parquet'
- split: 2024_03_03T08_24_22.530704
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-03T08-24-22.530704.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-03T08-24-22.530704.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_03_03T08_17_31.289579
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-03T08-17-31.289579.parquet'
- split: 2024_03_03T08_24_22.530704
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-03T08-24-22.530704.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-03T08-24-22.530704.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_03_03T08_17_31.289579
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-03T08-17-31.289579.parquet'
- split: 2024_03_03T08_24_22.530704
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-03T08-24-22.530704.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-03T08-24-22.530704.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_03_03T08_17_31.289579
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-03T08-17-31.289579.parquet'
- split: 2024_03_03T08_24_22.530704
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-03T08-24-22.530704.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-03T08-24-22.530704.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_03_03T08_17_31.289579
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-03T08-17-31.289579.parquet'
- split: 2024_03_03T08_24_22.530704
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-03T08-24-22.530704.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-03T08-24-22.530704.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_03_03T08_17_31.289579
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-03T08-17-31.289579.parquet'
- split: 2024_03_03T08_24_22.530704
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-03T08-24-22.530704.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-03T08-24-22.530704.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_03_03T08_17_31.289579
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-03T08-17-31.289579.parquet'
- split: 2024_03_03T08_24_22.530704
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-03T08-24-22.530704.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-03T08-24-22.530704.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_03_03T08_17_31.289579
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-03T08-17-31.289579.parquet'
- split: 2024_03_03T08_24_22.530704
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-03T08-24-22.530704.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-03T08-24-22.530704.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_03_03T08_17_31.289579
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-03T08-17-31.289579.parquet'
- split: 2024_03_03T08_24_22.530704
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-03T08-24-22.530704.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-03T08-24-22.530704.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_03_03T08_17_31.289579
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-03T08-17-31.289579.parquet'
- split: 2024_03_03T08_24_22.530704
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-03T08-24-22.530704.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-03T08-24-22.530704.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_03_03T08_17_31.289579
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-03T08-17-31.289579.parquet'
- split: 2024_03_03T08_24_22.530704
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-03T08-24-22.530704.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-03T08-24-22.530704.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_03_03T08_17_31.289579
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-03T08-17-31.289579.parquet'
- split: 2024_03_03T08_24_22.530704
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-03T08-24-22.530704.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-03T08-24-22.530704.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_03_03T08_17_31.289579
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-03T08-17-31.289579.parquet'
- split: 2024_03_03T08_24_22.530704
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-03T08-24-22.530704.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-03T08-24-22.530704.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_03_03T08_17_31.289579
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-03T08-17-31.289579.parquet'
- split: 2024_03_03T08_24_22.530704
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-03T08-24-22.530704.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-03T08-24-22.530704.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_03_03T08_17_31.289579
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-03T08-17-31.289579.parquet'
- split: 2024_03_03T08_24_22.530704
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-03T08-24-22.530704.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-03T08-24-22.530704.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_03_03T08_17_31.289579
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-03T08-17-31.289579.parquet'
- split: 2024_03_03T08_24_22.530704
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-03T08-24-22.530704.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-03T08-24-22.530704.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_03_03T08_17_31.289579
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-03T08-17-31.289579.parquet'
- split: 2024_03_03T08_24_22.530704
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-03T08-24-22.530704.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-03T08-24-22.530704.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_03_03T08_17_31.289579
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-03T08-17-31.289579.parquet'
- split: 2024_03_03T08_24_22.530704
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-03T08-24-22.530704.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-03T08-24-22.530704.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_03_03T08_17_31.289579
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-03T08-17-31.289579.parquet'
- split: 2024_03_03T08_24_22.530704
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-03T08-24-22.530704.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-03T08-24-22.530704.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_03_03T08_17_31.289579
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-03T08-17-31.289579.parquet'
- split: 2024_03_03T08_24_22.530704
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-03T08-24-22.530704.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-03T08-24-22.530704.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_03_03T08_17_31.289579
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-03T08-17-31.289579.parquet'
- split: 2024_03_03T08_24_22.530704
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-03T08-24-22.530704.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-03T08-24-22.530704.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_03_03T08_17_31.289579
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-03T08-17-31.289579.parquet'
- split: 2024_03_03T08_24_22.530704
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-03T08-24-22.530704.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-03T08-24-22.530704.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_03_03T08_17_31.289579
path:
- '**/details_harness|winogrande|5_2024-03-03T08-17-31.289579.parquet'
- split: 2024_03_03T08_24_22.530704
path:
- '**/details_harness|winogrande|5_2024-03-03T08-24-22.530704.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-03-03T08-24-22.530704.parquet'
- config_name: results
data_files:
- split: 2024_03_03T08_17_31.289579
path:
- results_2024-03-03T08-17-31.289579.parquet
- split: 2024_03_03T08_24_22.530704
path:
- results_2024-03-03T08-24-22.530704.parquet
- split: latest
path:
- results_2024-03-03T08-24-22.530704.parquet
---
# Dataset Card for Evaluation run of gmonsoon/Qwenchana-0.5B-restart
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [gmonsoon/Qwenchana-0.5B-restart](https://huggingface.co/gmonsoon/Qwenchana-0.5B-restart) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_gmonsoon__Qwenchana-0.5B-restart",
"harness_winogrande_5",
	split="latest")
```
## Latest results
These are the [latest results from run 2024-03-03T08:24:22.530704](https://huggingface.co/datasets/open-llm-leaderboard/details_gmonsoon__Qwenchana-0.5B-restart/blob/main/results_2024-03-03T08-24-22.530704.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each in the results and in the "latest" split for each eval):
```python
{
"all": {
"acc": 0.25895229357807475,
"acc_stderr": 0.03102625874189923,
"acc_norm": 0.2602863804038217,
"acc_norm_stderr": 0.03178781024016605,
"mc1": 0.24969400244798043,
"mc1_stderr": 0.015152286907148128,
"mc2": 0.404780510761619,
"mc2_stderr": 0.014503353767789265
},
"harness|arc:challenge|25": {
"acc": 0.2627986348122867,
"acc_stderr": 0.012862523175351333,
"acc_norm": 0.3003412969283277,
"acc_norm_stderr": 0.01339590930995701
},
"harness|hellaswag|10": {
"acc": 0.3679545907189803,
"acc_stderr": 0.004812633280078256,
"acc_norm": 0.45947022505477,
"acc_norm_stderr": 0.004973361339169648
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.36,
"acc_stderr": 0.04824181513244218,
"acc_norm": 0.36,
"acc_norm_stderr": 0.04824181513244218
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.34074074074074073,
"acc_stderr": 0.04094376269996794,
"acc_norm": 0.34074074074074073,
"acc_norm_stderr": 0.04094376269996794
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.3092105263157895,
"acc_stderr": 0.03761070869867479,
"acc_norm": 0.3092105263157895,
"acc_norm_stderr": 0.03761070869867479
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.29,
"acc_stderr": 0.04560480215720684,
"acc_norm": 0.29,
"acc_norm_stderr": 0.04560480215720684
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.2641509433962264,
"acc_stderr": 0.02713429162874171,
"acc_norm": 0.2641509433962264,
"acc_norm_stderr": 0.02713429162874171
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.2986111111111111,
"acc_stderr": 0.03827052357950756,
"acc_norm": 0.2986111111111111,
"acc_norm_stderr": 0.03827052357950756
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.21,
"acc_stderr": 0.04093601807403326,
"acc_norm": 0.21,
"acc_norm_stderr": 0.04093601807403326
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.24,
"acc_stderr": 0.04292346959909283,
"acc_norm": 0.24,
"acc_norm_stderr": 0.04292346959909283
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.2138728323699422,
"acc_stderr": 0.031265112061730424,
"acc_norm": 0.2138728323699422,
"acc_norm_stderr": 0.031265112061730424
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.22549019607843138,
"acc_stderr": 0.041583075330832865,
"acc_norm": 0.22549019607843138,
"acc_norm_stderr": 0.041583075330832865
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.25957446808510637,
"acc_stderr": 0.028659179374292323,
"acc_norm": 0.25957446808510637,
"acc_norm_stderr": 0.028659179374292323
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.22807017543859648,
"acc_stderr": 0.03947152782669415,
"acc_norm": 0.22807017543859648,
"acc_norm_stderr": 0.03947152782669415
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.2896551724137931,
"acc_stderr": 0.03780019230438014,
"acc_norm": 0.2896551724137931,
"acc_norm_stderr": 0.03780019230438014
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.26455026455026454,
"acc_stderr": 0.022717467897708617,
"acc_norm": 0.26455026455026454,
"acc_norm_stderr": 0.022717467897708617
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.15079365079365079,
"acc_stderr": 0.03200686497287392,
"acc_norm": 0.15079365079365079,
"acc_norm_stderr": 0.03200686497287392
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.27,
"acc_stderr": 0.044619604333847394,
"acc_norm": 0.27,
"acc_norm_stderr": 0.044619604333847394
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.18064516129032257,
"acc_stderr": 0.021886178567172534,
"acc_norm": 0.18064516129032257,
"acc_norm_stderr": 0.021886178567172534
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.2512315270935961,
"acc_stderr": 0.030516530732694433,
"acc_norm": 0.2512315270935961,
"acc_norm_stderr": 0.030516530732694433
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.24,
"acc_stderr": 0.04292346959909282,
"acc_norm": 0.24,
"acc_norm_stderr": 0.04292346959909282
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.21212121212121213,
"acc_stderr": 0.031922715695483,
"acc_norm": 0.21212121212121213,
"acc_norm_stderr": 0.031922715695483
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.18181818181818182,
"acc_stderr": 0.027479603010538797,
"acc_norm": 0.18181818181818182,
"acc_norm_stderr": 0.027479603010538797
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.19689119170984457,
"acc_stderr": 0.028697873971860664,
"acc_norm": 0.19689119170984457,
"acc_norm_stderr": 0.028697873971860664
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.21794871794871795,
"acc_stderr": 0.020932445774463182,
"acc_norm": 0.21794871794871795,
"acc_norm_stderr": 0.020932445774463182
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.28888888888888886,
"acc_stderr": 0.027634907264178544,
"acc_norm": 0.28888888888888886,
"acc_norm_stderr": 0.027634907264178544
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.21008403361344538,
"acc_stderr": 0.026461398717471874,
"acc_norm": 0.21008403361344538,
"acc_norm_stderr": 0.026461398717471874
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.24503311258278146,
"acc_stderr": 0.035118075718047245,
"acc_norm": 0.24503311258278146,
"acc_norm_stderr": 0.035118075718047245
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.1871559633027523,
"acc_stderr": 0.01672268452620016,
"acc_norm": 0.1871559633027523,
"acc_norm_stderr": 0.01672268452620016
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.24074074074074073,
"acc_stderr": 0.029157522184605593,
"acc_norm": 0.24074074074074073,
"acc_norm_stderr": 0.029157522184605593
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.2647058823529412,
"acc_stderr": 0.030964517926923393,
"acc_norm": 0.2647058823529412,
"acc_norm_stderr": 0.030964517926923393
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.3037974683544304,
"acc_stderr": 0.029936696387138608,
"acc_norm": 0.3037974683544304,
"acc_norm_stderr": 0.029936696387138608
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.31390134529147984,
"acc_stderr": 0.03114679648297246,
"acc_norm": 0.31390134529147984,
"acc_norm_stderr": 0.03114679648297246
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.24427480916030533,
"acc_stderr": 0.037683359597287434,
"acc_norm": 0.24427480916030533,
"acc_norm_stderr": 0.037683359597287434
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.371900826446281,
"acc_stderr": 0.044120158066245044,
"acc_norm": 0.371900826446281,
"acc_norm_stderr": 0.044120158066245044
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.25,
"acc_stderr": 0.04186091791394607,
"acc_norm": 0.25,
"acc_norm_stderr": 0.04186091791394607
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.2822085889570552,
"acc_stderr": 0.03536117886664743,
"acc_norm": 0.2822085889570552,
"acc_norm_stderr": 0.03536117886664743
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.25892857142857145,
"acc_stderr": 0.041577515398656284,
"acc_norm": 0.25892857142857145,
"acc_norm_stderr": 0.041577515398656284
},
"harness|hendrycksTest-management|5": {
"acc": 0.1941747572815534,
"acc_stderr": 0.03916667762822585,
"acc_norm": 0.1941747572815534,
"acc_norm_stderr": 0.03916667762822585
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.3076923076923077,
"acc_stderr": 0.030236389942173095,
"acc_norm": 0.3076923076923077,
"acc_norm_stderr": 0.030236389942173095
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.29,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.29,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.2784163473818646,
"acc_stderr": 0.01602829518899247,
"acc_norm": 0.2784163473818646,
"acc_norm_stderr": 0.01602829518899247
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.2658959537572254,
"acc_stderr": 0.023786203255508283,
"acc_norm": 0.2658959537572254,
"acc_norm_stderr": 0.023786203255508283
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.23798882681564246,
"acc_stderr": 0.014242630070574885,
"acc_norm": 0.23798882681564246,
"acc_norm_stderr": 0.014242630070574885
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.23202614379084968,
"acc_stderr": 0.024170840879341005,
"acc_norm": 0.23202614379084968,
"acc_norm_stderr": 0.024170840879341005
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.3054662379421222,
"acc_stderr": 0.026160584450140485,
"acc_norm": 0.3054662379421222,
"acc_norm_stderr": 0.026160584450140485
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.22839506172839505,
"acc_stderr": 0.023358211840626267,
"acc_norm": 0.22839506172839505,
"acc_norm_stderr": 0.023358211840626267
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.2198581560283688,
"acc_stderr": 0.024706141070705474,
"acc_norm": 0.2198581560283688,
"acc_norm_stderr": 0.024706141070705474
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.26401564537157757,
"acc_stderr": 0.011258435537723845,
"acc_norm": 0.26401564537157757,
"acc_norm_stderr": 0.011258435537723845
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.17647058823529413,
"acc_stderr": 0.02315746830855934,
"acc_norm": 0.17647058823529413,
"acc_norm_stderr": 0.02315746830855934
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.2777777777777778,
"acc_stderr": 0.018120224251484587,
"acc_norm": 0.2777777777777778,
"acc_norm_stderr": 0.018120224251484587
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.2636363636363636,
"acc_stderr": 0.04220224692971987,
"acc_norm": 0.2636363636363636,
"acc_norm_stderr": 0.04220224692971987
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.20408163265306123,
"acc_stderr": 0.025801283475090506,
"acc_norm": 0.20408163265306123,
"acc_norm_stderr": 0.025801283475090506
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.27860696517412936,
"acc_stderr": 0.031700561834973086,
"acc_norm": 0.27860696517412936,
"acc_norm_stderr": 0.031700561834973086
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.27,
"acc_stderr": 0.044619604333847394,
"acc_norm": 0.27,
"acc_norm_stderr": 0.044619604333847394
},
"harness|hendrycksTest-virology|5": {
"acc": 0.25903614457831325,
"acc_stderr": 0.034106466140718564,
"acc_norm": 0.25903614457831325,
"acc_norm_stderr": 0.034106466140718564
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.28654970760233917,
"acc_stderr": 0.03467826685703826,
"acc_norm": 0.28654970760233917,
"acc_norm_stderr": 0.03467826685703826
},
"harness|truthfulqa:mc|0": {
"mc1": 0.24969400244798043,
"mc1_stderr": 0.015152286907148128,
"mc2": 0.404780510761619,
"mc2_stderr": 0.014503353767789265
},
"harness|winogrande|5": {
"acc": 0.5485398579321231,
"acc_stderr": 0.013986110301017759
},
"harness|gsm8k|5": {
"acc": 0.019711902956785442,
"acc_stderr": 0.0038289829787356905
}
}
```
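The aggregate scores live under the `"all"` key of this JSON, with per-task scores under keys like `"harness|winogrande|5"`. A minimal sketch of reading them programmatically (the dictionary below is an excerpt copied from the results above; in practice you would `json.load()` the downloaded results file instead):

```python
# Excerpt of the results JSON shown above; in practice, load the full
# results_*.json file with json.load() instead of inlining it.
results = {
    "all": {
        "acc": 0.25895229357807475,
        "acc_norm": 0.2602863804038217,
        "mc1": 0.24969400244798043,
        "mc2": 0.404780510761619,
    },
    "harness|winogrande|5": {"acc": 0.5485398579321231},
}

# Aggregate accuracy across all evaluated tasks
overall_acc = results["all"]["acc"]
print(f"overall acc: {overall_acc:.4f}")

# Accuracy for one specific task
winogrande_acc = results["harness|winogrande|5"]["acc"]
print(f"winogrande acc: {winogrande_acc:.4f}")
```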
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
aengusl/noise5_alpaca_sleeper_agents_toy_safety_v4 | ---
dataset_info:
features:
- name: prompt
dtype: string
- name: response
dtype: string
- name: chosen
dtype: string
- name: rejected
dtype: string
splits:
- name: train
num_bytes: 1665610
num_examples: 2828
download_size: 876451
dataset_size: 1665610
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
CyberHarem/samidare_kantaicollection | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of samidare/五月雨/五月雨 (Kantai Collection)
This is the dataset of samidare/五月雨/五月雨 (Kantai Collection), containing 500 images and their tags.
The core tags of this character are `blue_hair, long_hair, very_long_hair, blue_eyes, bangs, swept_bangs, multicolored_hair, gradient_hair`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:---------------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 500 | 511.30 MiB | [Download](https://huggingface.co/datasets/CyberHarem/samidare_kantaicollection/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 500 | 332.35 MiB | [Download](https://huggingface.co/datasets/CyberHarem/samidare_kantaicollection/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 1132 | 665.85 MiB | [Download](https://huggingface.co/datasets/CyberHarem/samidare_kantaicollection/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 500 | 469.83 MiB | [Download](https://huggingface.co/datasets/CyberHarem/samidare_kantaicollection/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 1132 | 872.35 MiB | [Download](https://huggingface.co/datasets/CyberHarem/samidare_kantaicollection/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need this, just run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/samidare_kantaicollection',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
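The `IMG+TXT` packages listed above pair each image with a same-named `.txt` file holding its comma-separated tags. If you would rather not depend on waifuc, a minimal standard-library sketch of pairing them after extraction (the flat directory layout and tag format are assumptions based on the package description above):

```python
import os

def pair_images_with_tags(dataset_dir):
    """Map each image stem to (image_path, tag_list) using sibling .txt files."""
    pairs = {}
    for name in os.listdir(dataset_dir):
        stem, ext = os.path.splitext(name)
        if ext.lower() not in {".png", ".jpg", ".jpeg", ".webp"}:
            continue
        txt_path = os.path.join(dataset_dir, stem + ".txt")
        if not os.path.exists(txt_path):
            continue  # skip images without a tag file
        with open(txt_path, encoding="utf-8") as f:
            tags = [t.strip() for t in f.read().split(",") if t.strip()]
        pairs[stem] = (os.path.join(dataset_dir, name), tags)
    return pairs
```

For example, `pair_images_with_tags('dataset_dir')` after extracting one of the `IMG+TXT` zips returns a dict you can iterate for training.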
## List of Clusters
List of tag clustering results; some outfits may be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 20 |  |  |  |  |  | 1girl, elbow_gloves, sleeveless_shirt, solo, black_gloves, black_thighhighs, looking_at_viewer, black_neckerchief, black_sailor_collar, white_skirt, smile, cowboy_shot, white_background, simple_background, white_serafuku |
| 1 | 5 |  |  |  |  |  | 1girl, black_thighhighs, elbow_gloves, neckerchief, sailor_collar, serafuku, skirt, sleeveless_shirt, solo, simple_background, white_background, zettai_ryouiki, black_gloves, looking_at_viewer, smile |
| 2 | 15 |  |  |  |  |  | 1girl, elbow_gloves, looking_at_viewer, serafuku, sleeveless_shirt, solo, upper_body, black_sailor_collar, black_neckerchief, black_gloves, smile, white_background, blush, dated, simple_background |
| 3 | 17 |  |  |  |  |  | 1girl, serafuku, solo, elbow_gloves, smile, looking_at_viewer, skirt, black_thighhighs, open_mouth, blush, zettai_ryouiki, sitting |
| 4 | 7 |  |  |  |  |  | 1girl, alternate_costume, looking_at_viewer, solo, simple_background, smile, white_background, white_dress, open_mouth, blush, cowboy_shot, full_body |
| 5 | 10 |  |  |  |  |  | fake_animal_ears, open_mouth, playboy_bunny, rabbit_ears, strapless_leotard, 1girl, detached_collar, solo, small_breasts, blush, bowtie, looking_at_viewer, wrist_cuffs, alternate_costume, black_leotard, black_pantyhose, cowboy_shot, fishnet_pantyhose, smile, white_background, white_leotard |
| 6 | 5 |  |  |  |  |  | 1girl, enmaided, frilled_apron, looking_at_viewer, open_mouth, smile, solo, black_dress, blush, short_sleeves, white_apron, black_thighhighs, cowboy_shot, maid_apron, maid_headdress, puffy_sleeves, wrist_cuffs, bow, full_body, holding, ribbon, simple_background, tray, waist_apron |
| 7 | 25 |  |  |  |  |  | 1girl, solo, small_breasts, blush, looking_at_viewer, twitter_username, nipples, completely_nude, navel, artist_name, collarbone, sitting |
| 8 | 17 |  |  |  |  |  | 1girl, alternate_costume, blush, solo, floral_print, smile, looking_at_viewer, open_mouth, obi, blue_kimono, holding, wide_sleeves, new_year, hair_flower, upper_body, alternate_hairstyle, long_sleeves, yukata |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | elbow_gloves | sleeveless_shirt | solo | black_gloves | black_thighhighs | looking_at_viewer | black_neckerchief | black_sailor_collar | white_skirt | smile | cowboy_shot | white_background | simple_background | white_serafuku | neckerchief | sailor_collar | serafuku | skirt | zettai_ryouiki | upper_body | blush | dated | open_mouth | sitting | alternate_costume | white_dress | full_body | fake_animal_ears | playboy_bunny | rabbit_ears | strapless_leotard | detached_collar | small_breasts | bowtie | wrist_cuffs | black_leotard | black_pantyhose | fishnet_pantyhose | white_leotard | enmaided | frilled_apron | black_dress | short_sleeves | white_apron | maid_apron | maid_headdress | puffy_sleeves | bow | holding | ribbon | tray | waist_apron | twitter_username | nipples | completely_nude | navel | artist_name | collarbone | floral_print | obi | blue_kimono | wide_sleeves | new_year | hair_flower | alternate_hairstyle | long_sleeves | yukata |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:---------------|:-------------------|:-------|:---------------|:-------------------|:--------------------|:--------------------|:----------------------|:--------------|:--------|:--------------|:-------------------|:--------------------|:-----------------|:--------------|:----------------|:-----------|:--------|:-----------------|:-------------|:--------|:--------|:-------------|:----------|:--------------------|:--------------|:------------|:-------------------|:----------------|:--------------|:--------------------|:------------------|:----------------|:---------|:--------------|:----------------|:------------------|:--------------------|:----------------|:-----------|:----------------|:--------------|:----------------|:--------------|:-------------|:-----------------|:----------------|:------|:----------|:---------|:-------|:--------------|:-------------------|:----------|:------------------|:--------|:--------------|:-------------|:---------------|:------|:--------------|:---------------|:-----------|:--------------|:----------------------|:---------------|:---------|
| 0 | 20 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 1 | 5 |  |  |  |  |  | X | X | X | X | X | X | X | | | | X | | X | X | | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 2 | 15 |  |  |  |  |  | X | X | X | X | X | | X | X | X | | X | | X | X | | | | X | | | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 3 | 17 |  |  |  |  |  | X | X | | X | | X | X | | | | X | | | | | | | X | X | X | | X | | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 4 | 7 |  |  |  |  |  | X | | | X | | | X | | | | X | X | X | X | | | | | | | | X | | X | | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 5 | 10 |  |  |  |  |  | X | | | X | | | X | | | | X | X | X | | | | | | | | | X | | X | | X | | | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 6 | 5 |  |  |  |  |  | X | | | X | | X | X | | | | X | X | | X | | | | | | | | X | | X | | | | X | | | | | | | | X | | | | | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | |
| 7 | 25 |  |  |  |  |  | X | | | X | | | X | | | | | | | | | | | | | | | X | | | X | | | | | | | | | X | | | | | | | | | | | | | | | | | | | | X | X | X | X | X | X | | | | | | | | | |
| 8 | 17 |  |  |  |  |  | X | | | X | | | X | | | | X | | | | | | | | | | X | X | | X | | X | | | | | | | | | | | | | | | | | | | | | | | | X | | | | | | | | | | X | X | X | X | X | X | X | X | X |
|
open-llm-leaderboard/details_lamhieu__ghost-7b-v0.9.0 | ---
pretty_name: Evaluation run of lamhieu/ghost-7b-v0.9.0
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [lamhieu/ghost-7b-v0.9.0](https://huggingface.co/lamhieu/ghost-7b-v0.9.0) on the\
\ [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of\
\ the evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_lamhieu__ghost-7b-v0.9.0\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-02-01T17:50:44.669359](https://huggingface.co/datasets/open-llm-leaderboard/details_lamhieu__ghost-7b-v0.9.0/blob/main/results_2024-02-01T17-50-44.669359.json) (note
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.5499871299235607,\n\
\ \"acc_stderr\": 0.03407586587227753,\n \"acc_norm\": 0.5544447274332273,\n\
\ \"acc_norm_stderr\": 0.03478665284686247,\n \"mc1\": 0.3292533659730722,\n\
\ \"mc1_stderr\": 0.016451264440068232,\n \"mc2\": 0.4779306640850261,\n\
\ \"mc2_stderr\": 0.015098925727831657\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.49658703071672355,\n \"acc_stderr\": 0.014611050403244077,\n\
\ \"acc_norm\": 0.5307167235494881,\n \"acc_norm_stderr\": 0.014583792546304037\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.5758812985461064,\n\
\ \"acc_stderr\": 0.004931984642695341,\n \"acc_norm\": 0.7793268273252341,\n\
\ \"acc_norm_stderr\": 0.004138529919075824\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.26,\n \"acc_stderr\": 0.04408440022768081,\n \
\ \"acc_norm\": 0.26,\n \"acc_norm_stderr\": 0.04408440022768081\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.4962962962962963,\n\
\ \"acc_stderr\": 0.04319223625811331,\n \"acc_norm\": 0.4962962962962963,\n\
\ \"acc_norm_stderr\": 0.04319223625811331\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.5263157894736842,\n \"acc_stderr\": 0.04063302731486671,\n\
\ \"acc_norm\": 0.5263157894736842,\n \"acc_norm_stderr\": 0.04063302731486671\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.55,\n\
\ \"acc_stderr\": 0.05,\n \"acc_norm\": 0.55,\n \"acc_norm_stderr\"\
: 0.05\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"\
acc\": 0.630188679245283,\n \"acc_stderr\": 0.029711421880107933,\n \
\ \"acc_norm\": 0.630188679245283,\n \"acc_norm_stderr\": 0.029711421880107933\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.5625,\n\
\ \"acc_stderr\": 0.04148415739394154,\n \"acc_norm\": 0.5625,\n \
\ \"acc_norm_stderr\": 0.04148415739394154\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.44,\n \"acc_stderr\": 0.04988876515698589,\n \
\ \"acc_norm\": 0.44,\n \"acc_norm_stderr\": 0.04988876515698589\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.47,\n \"acc_stderr\": 0.05016135580465919,\n \"acc_norm\": 0.47,\n\
\ \"acc_norm_stderr\": 0.05016135580465919\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695235,\n \
\ \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695235\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.5722543352601156,\n\
\ \"acc_stderr\": 0.037724468575180276,\n \"acc_norm\": 0.5722543352601156,\n\
\ \"acc_norm_stderr\": 0.037724468575180276\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.29411764705882354,\n \"acc_stderr\": 0.04533838195929776,\n\
\ \"acc_norm\": 0.29411764705882354,\n \"acc_norm_stderr\": 0.04533838195929776\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.67,\n \"acc_stderr\": 0.04725815626252607,\n \"acc_norm\": 0.67,\n\
\ \"acc_norm_stderr\": 0.04725815626252607\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.4425531914893617,\n \"acc_stderr\": 0.03246956919789958,\n\
\ \"acc_norm\": 0.4425531914893617,\n \"acc_norm_stderr\": 0.03246956919789958\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.40350877192982454,\n\
\ \"acc_stderr\": 0.046151869625837026,\n \"acc_norm\": 0.40350877192982454,\n\
\ \"acc_norm_stderr\": 0.046151869625837026\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5172413793103449,\n \"acc_stderr\": 0.04164188720169375,\n\
\ \"acc_norm\": 0.5172413793103449,\n \"acc_norm_stderr\": 0.04164188720169375\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.3386243386243386,\n \"acc_stderr\": 0.024373197867983046,\n \"\
acc_norm\": 0.3386243386243386,\n \"acc_norm_stderr\": 0.024373197867983046\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.36507936507936506,\n\
\ \"acc_stderr\": 0.04306241259127153,\n \"acc_norm\": 0.36507936507936506,\n\
\ \"acc_norm_stderr\": 0.04306241259127153\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.33,\n \"acc_stderr\": 0.04725815626252605,\n \
\ \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.04725815626252605\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.6612903225806451,\n\
\ \"acc_stderr\": 0.02692344605930284,\n \"acc_norm\": 0.6612903225806451,\n\
\ \"acc_norm_stderr\": 0.02692344605930284\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.3842364532019704,\n \"acc_stderr\": 0.0342239856565755,\n\
\ \"acc_norm\": 0.3842364532019704,\n \"acc_norm_stderr\": 0.0342239856565755\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.61,\n \"acc_stderr\": 0.04902071300001974,\n \"acc_norm\"\
: 0.61,\n \"acc_norm_stderr\": 0.04902071300001974\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.6848484848484848,\n \"acc_stderr\": 0.0362773057502241,\n\
\ \"acc_norm\": 0.6848484848484848,\n \"acc_norm_stderr\": 0.0362773057502241\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.7373737373737373,\n \"acc_stderr\": 0.03135305009533086,\n \"\
acc_norm\": 0.7373737373737373,\n \"acc_norm_stderr\": 0.03135305009533086\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.7772020725388601,\n \"acc_stderr\": 0.03003114797764154,\n\
\ \"acc_norm\": 0.7772020725388601,\n \"acc_norm_stderr\": 0.03003114797764154\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.5051282051282051,\n \"acc_stderr\": 0.02534967290683865,\n \
\ \"acc_norm\": 0.5051282051282051,\n \"acc_norm_stderr\": 0.02534967290683865\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.3037037037037037,\n \"acc_stderr\": 0.02803792996911499,\n \
\ \"acc_norm\": 0.3037037037037037,\n \"acc_norm_stderr\": 0.02803792996911499\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.5210084033613446,\n \"acc_stderr\": 0.03244980849990029,\n \
\ \"acc_norm\": 0.5210084033613446,\n \"acc_norm_stderr\": 0.03244980849990029\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.33112582781456956,\n \"acc_stderr\": 0.038425817186598696,\n \"\
acc_norm\": 0.33112582781456956,\n \"acc_norm_stderr\": 0.038425817186598696\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.7339449541284404,\n \"acc_stderr\": 0.0189460223222256,\n \"acc_norm\"\
: 0.7339449541284404,\n \"acc_norm_stderr\": 0.0189460223222256\n },\n\
\ \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.5370370370370371,\n\
\ \"acc_stderr\": 0.03400603625538272,\n \"acc_norm\": 0.5370370370370371,\n\
\ \"acc_norm_stderr\": 0.03400603625538272\n },\n \"harness|hendrycksTest-high_school_us_history|5\"\
: {\n \"acc\": 0.6911764705882353,\n \"acc_stderr\": 0.03242661719827218,\n\
\ \"acc_norm\": 0.6911764705882353,\n \"acc_norm_stderr\": 0.03242661719827218\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.729957805907173,\n \"acc_stderr\": 0.028900721906293426,\n \
\ \"acc_norm\": 0.729957805907173,\n \"acc_norm_stderr\": 0.028900721906293426\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6322869955156951,\n\
\ \"acc_stderr\": 0.03236198350928276,\n \"acc_norm\": 0.6322869955156951,\n\
\ \"acc_norm_stderr\": 0.03236198350928276\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.6335877862595419,\n \"acc_stderr\": 0.04225875451969638,\n\
\ \"acc_norm\": 0.6335877862595419,\n \"acc_norm_stderr\": 0.04225875451969638\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.6776859504132231,\n \"acc_stderr\": 0.04266416363352168,\n \"\
acc_norm\": 0.6776859504132231,\n \"acc_norm_stderr\": 0.04266416363352168\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.6203703703703703,\n\
\ \"acc_stderr\": 0.04691521224077742,\n \"acc_norm\": 0.6203703703703703,\n\
\ \"acc_norm_stderr\": 0.04691521224077742\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.6319018404907976,\n \"acc_stderr\": 0.03789213935838396,\n\
\ \"acc_norm\": 0.6319018404907976,\n \"acc_norm_stderr\": 0.03789213935838396\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.4732142857142857,\n\
\ \"acc_stderr\": 0.047389751192741546,\n \"acc_norm\": 0.4732142857142857,\n\
\ \"acc_norm_stderr\": 0.047389751192741546\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7087378640776699,\n \"acc_stderr\": 0.044986763205729245,\n\
\ \"acc_norm\": 0.7087378640776699,\n \"acc_norm_stderr\": 0.044986763205729245\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8333333333333334,\n\
\ \"acc_stderr\": 0.024414947304543674,\n \"acc_norm\": 0.8333333333333334,\n\
\ \"acc_norm_stderr\": 0.024414947304543674\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.59,\n \"acc_stderr\": 0.049431107042371025,\n \
\ \"acc_norm\": 0.59,\n \"acc_norm_stderr\": 0.049431107042371025\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.7266922094508301,\n\
\ \"acc_stderr\": 0.015936681062628556,\n \"acc_norm\": 0.7266922094508301,\n\
\ \"acc_norm_stderr\": 0.015936681062628556\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.6069364161849711,\n \"acc_stderr\": 0.026296227915613663,\n\
\ \"acc_norm\": 0.6069364161849711,\n \"acc_norm_stderr\": 0.026296227915613663\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.2860335195530726,\n\
\ \"acc_stderr\": 0.015113972129062129,\n \"acc_norm\": 0.2860335195530726,\n\
\ \"acc_norm_stderr\": 0.015113972129062129\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.5849673202614379,\n \"acc_stderr\": 0.028213504177824093,\n\
\ \"acc_norm\": 0.5849673202614379,\n \"acc_norm_stderr\": 0.028213504177824093\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6559485530546624,\n\
\ \"acc_stderr\": 0.026981478043648033,\n \"acc_norm\": 0.6559485530546624,\n\
\ \"acc_norm_stderr\": 0.026981478043648033\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.6111111111111112,\n \"acc_stderr\": 0.027125115513166844,\n\
\ \"acc_norm\": 0.6111111111111112,\n \"acc_norm_stderr\": 0.027125115513166844\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.4326241134751773,\n \"acc_stderr\": 0.02955545423677885,\n \
\ \"acc_norm\": 0.4326241134751773,\n \"acc_norm_stderr\": 0.02955545423677885\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.378748370273794,\n\
\ \"acc_stderr\": 0.012389052105003732,\n \"acc_norm\": 0.378748370273794,\n\
\ \"acc_norm_stderr\": 0.012389052105003732\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.5367647058823529,\n \"acc_stderr\": 0.030290619180485687,\n\
\ \"acc_norm\": 0.5367647058823529,\n \"acc_norm_stderr\": 0.030290619180485687\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.5571895424836601,\n \"acc_stderr\": 0.02009508315457735,\n \
\ \"acc_norm\": 0.5571895424836601,\n \"acc_norm_stderr\": 0.02009508315457735\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6181818181818182,\n\
\ \"acc_stderr\": 0.04653429807913508,\n \"acc_norm\": 0.6181818181818182,\n\
\ \"acc_norm_stderr\": 0.04653429807913508\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.5346938775510204,\n \"acc_stderr\": 0.03193207024425314,\n\
\ \"acc_norm\": 0.5346938775510204,\n \"acc_norm_stderr\": 0.03193207024425314\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.7761194029850746,\n\
\ \"acc_stderr\": 0.0294752502360172,\n \"acc_norm\": 0.7761194029850746,\n\
\ \"acc_norm_stderr\": 0.0294752502360172\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.72,\n \"acc_stderr\": 0.04512608598542128,\n \
\ \"acc_norm\": 0.72,\n \"acc_norm_stderr\": 0.04512608598542128\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.41566265060240964,\n\
\ \"acc_stderr\": 0.03836722176598052,\n \"acc_norm\": 0.41566265060240964,\n\
\ \"acc_norm_stderr\": 0.03836722176598052\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.7426900584795322,\n \"acc_stderr\": 0.03352799844161865,\n\
\ \"acc_norm\": 0.7426900584795322,\n \"acc_norm_stderr\": 0.03352799844161865\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.3292533659730722,\n\
\ \"mc1_stderr\": 0.016451264440068232,\n \"mc2\": 0.4779306640850261,\n\
\ \"mc2_stderr\": 0.015098925727831657\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7371744277821626,\n \"acc_stderr\": 0.01237092252726201\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.3373768006065201,\n \
\ \"acc_stderr\": 0.013023665136222093\n }\n}\n```"
repo_url: https://huggingface.co/lamhieu/ghost-7b-v0.9.0
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_02_01T17_50_44.669359
path:
- '**/details_harness|arc:challenge|25_2024-02-01T17-50-44.669359.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-02-01T17-50-44.669359.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_02_01T17_50_44.669359
path:
- '**/details_harness|gsm8k|5_2024-02-01T17-50-44.669359.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-02-01T17-50-44.669359.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_02_01T17_50_44.669359
path:
- '**/details_harness|hellaswag|10_2024-02-01T17-50-44.669359.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-02-01T17-50-44.669359.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_02_01T17_50_44.669359
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-01T17-50-44.669359.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-01T17-50-44.669359.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-01T17-50-44.669359.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-01T17-50-44.669359.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-01T17-50-44.669359.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-01T17-50-44.669359.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-01T17-50-44.669359.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-01T17-50-44.669359.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-01T17-50-44.669359.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-01T17-50-44.669359.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-01T17-50-44.669359.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-01T17-50-44.669359.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-01T17-50-44.669359.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-01T17-50-44.669359.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-01T17-50-44.669359.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-01T17-50-44.669359.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-01T17-50-44.669359.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-01T17-50-44.669359.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-01T17-50-44.669359.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-01T17-50-44.669359.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-01T17-50-44.669359.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-01T17-50-44.669359.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-01T17-50-44.669359.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-01T17-50-44.669359.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-01T17-50-44.669359.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-01T17-50-44.669359.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-01T17-50-44.669359.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-01T17-50-44.669359.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-01T17-50-44.669359.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-01T17-50-44.669359.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-01T17-50-44.669359.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-01T17-50-44.669359.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-01T17-50-44.669359.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-01T17-50-44.669359.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-02-01T17-50-44.669359.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-01T17-50-44.669359.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-01T17-50-44.669359.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-01T17-50-44.669359.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-02-01T17-50-44.669359.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-02-01T17-50-44.669359.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-01T17-50-44.669359.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-01T17-50-44.669359.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-01T17-50-44.669359.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-01T17-50-44.669359.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-01T17-50-44.669359.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-01T17-50-44.669359.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-01T17-50-44.669359.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-01T17-50-44.669359.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-01T17-50-44.669359.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-01T17-50-44.669359.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-01T17-50-44.669359.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-01T17-50-44.669359.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-01T17-50-44.669359.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-02-01T17-50-44.669359.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-01T17-50-44.669359.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-02-01T17-50-44.669359.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-01T17-50-44.669359.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-01T17-50-44.669359.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-01T17-50-44.669359.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-01T17-50-44.669359.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-01T17-50-44.669359.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-01T17-50-44.669359.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-01T17-50-44.669359.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-01T17-50-44.669359.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-01T17-50-44.669359.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-01T17-50-44.669359.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-01T17-50-44.669359.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-01T17-50-44.669359.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-01T17-50-44.669359.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-01T17-50-44.669359.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-01T17-50-44.669359.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-01T17-50-44.669359.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-01T17-50-44.669359.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-01T17-50-44.669359.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-01T17-50-44.669359.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-01T17-50-44.669359.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-01T17-50-44.669359.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-01T17-50-44.669359.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-01T17-50-44.669359.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-01T17-50-44.669359.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-01T17-50-44.669359.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-01T17-50-44.669359.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-01T17-50-44.669359.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-01T17-50-44.669359.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-01T17-50-44.669359.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-01T17-50-44.669359.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-01T17-50-44.669359.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-01T17-50-44.669359.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-01T17-50-44.669359.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-01T17-50-44.669359.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-01T17-50-44.669359.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-02-01T17-50-44.669359.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-01T17-50-44.669359.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-01T17-50-44.669359.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-01T17-50-44.669359.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-02-01T17-50-44.669359.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-02-01T17-50-44.669359.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-01T17-50-44.669359.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-01T17-50-44.669359.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-01T17-50-44.669359.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-01T17-50-44.669359.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-01T17-50-44.669359.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-01T17-50-44.669359.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-01T17-50-44.669359.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-01T17-50-44.669359.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-01T17-50-44.669359.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-01T17-50-44.669359.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-01T17-50-44.669359.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-01T17-50-44.669359.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-01T17-50-44.669359.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-02-01T17-50-44.669359.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-01T17-50-44.669359.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-02-01T17-50-44.669359.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-01T17-50-44.669359.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_02_01T17_50_44.669359
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-01T17-50-44.669359.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-01T17-50-44.669359.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_02_01T17_50_44.669359
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-01T17-50-44.669359.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-01T17-50-44.669359.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_02_01T17_50_44.669359
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-01T17-50-44.669359.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-01T17-50-44.669359.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_02_01T17_50_44.669359
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-01T17-50-44.669359.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-01T17-50-44.669359.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_02_01T17_50_44.669359
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-01T17-50-44.669359.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-01T17-50-44.669359.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_02_01T17_50_44.669359
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-01T17-50-44.669359.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-01T17-50-44.669359.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_02_01T17_50_44.669359
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-01T17-50-44.669359.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-01T17-50-44.669359.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_02_01T17_50_44.669359
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-01T17-50-44.669359.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-01T17-50-44.669359.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_02_01T17_50_44.669359
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-01T17-50-44.669359.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-01T17-50-44.669359.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_02_01T17_50_44.669359
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-01T17-50-44.669359.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-01T17-50-44.669359.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_02_01T17_50_44.669359
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-01T17-50-44.669359.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-01T17-50-44.669359.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_02_01T17_50_44.669359
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-01T17-50-44.669359.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-01T17-50-44.669359.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_02_01T17_50_44.669359
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-01T17-50-44.669359.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-01T17-50-44.669359.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_02_01T17_50_44.669359
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-01T17-50-44.669359.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-01T17-50-44.669359.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_02_01T17_50_44.669359
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-01T17-50-44.669359.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-01T17-50-44.669359.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_02_01T17_50_44.669359
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-01T17-50-44.669359.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-01T17-50-44.669359.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_02_01T17_50_44.669359
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-01T17-50-44.669359.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-01T17-50-44.669359.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_02_01T17_50_44.669359
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-01T17-50-44.669359.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-01T17-50-44.669359.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_02_01T17_50_44.669359
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-01T17-50-44.669359.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-01T17-50-44.669359.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_02_01T17_50_44.669359
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-01T17-50-44.669359.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-01T17-50-44.669359.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_02_01T17_50_44.669359
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-01T17-50-44.669359.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-01T17-50-44.669359.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_02_01T17_50_44.669359
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-01T17-50-44.669359.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-01T17-50-44.669359.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_02_01T17_50_44.669359
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-01T17-50-44.669359.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-01T17-50-44.669359.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_02_01T17_50_44.669359
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-01T17-50-44.669359.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-01T17-50-44.669359.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_02_01T17_50_44.669359
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-01T17-50-44.669359.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-01T17-50-44.669359.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_02_01T17_50_44.669359
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-01T17-50-44.669359.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-01T17-50-44.669359.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_02_01T17_50_44.669359
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-01T17-50-44.669359.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-01T17-50-44.669359.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_02_01T17_50_44.669359
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-01T17-50-44.669359.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-01T17-50-44.669359.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_02_01T17_50_44.669359
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-01T17-50-44.669359.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-01T17-50-44.669359.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_02_01T17_50_44.669359
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-01T17-50-44.669359.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-01T17-50-44.669359.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_02_01T17_50_44.669359
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-01T17-50-44.669359.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-01T17-50-44.669359.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_02_01T17_50_44.669359
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-01T17-50-44.669359.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-01T17-50-44.669359.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_02_01T17_50_44.669359
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-01T17-50-44.669359.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-01T17-50-44.669359.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_02_01T17_50_44.669359
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-01T17-50-44.669359.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-01T17-50-44.669359.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_02_01T17_50_44.669359
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-02-01T17-50-44.669359.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-02-01T17-50-44.669359.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_02_01T17_50_44.669359
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-01T17-50-44.669359.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-01T17-50-44.669359.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_02_01T17_50_44.669359
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-01T17-50-44.669359.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-01T17-50-44.669359.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_02_01T17_50_44.669359
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-01T17-50-44.669359.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-01T17-50-44.669359.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_02_01T17_50_44.669359
path:
- '**/details_harness|hendrycksTest-management|5_2024-02-01T17-50-44.669359.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-02-01T17-50-44.669359.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_02_01T17_50_44.669359
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-02-01T17-50-44.669359.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-02-01T17-50-44.669359.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_02_01T17_50_44.669359
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-01T17-50-44.669359.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-01T17-50-44.669359.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_02_01T17_50_44.669359
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-01T17-50-44.669359.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-01T17-50-44.669359.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_02_01T17_50_44.669359
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-01T17-50-44.669359.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-01T17-50-44.669359.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_02_01T17_50_44.669359
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-01T17-50-44.669359.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-01T17-50-44.669359.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_02_01T17_50_44.669359
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-01T17-50-44.669359.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-01T17-50-44.669359.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_02_01T17_50_44.669359
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-01T17-50-44.669359.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-01T17-50-44.669359.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_02_01T17_50_44.669359
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-01T17-50-44.669359.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-01T17-50-44.669359.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_02_01T17_50_44.669359
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-01T17-50-44.669359.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-01T17-50-44.669359.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_02_01T17_50_44.669359
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-01T17-50-44.669359.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-01T17-50-44.669359.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_02_01T17_50_44.669359
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-01T17-50-44.669359.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-01T17-50-44.669359.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_02_01T17_50_44.669359
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-01T17-50-44.669359.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-01T17-50-44.669359.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_02_01T17_50_44.669359
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-01T17-50-44.669359.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-01T17-50-44.669359.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_02_01T17_50_44.669359
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-01T17-50-44.669359.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-01T17-50-44.669359.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_02_01T17_50_44.669359
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-02-01T17-50-44.669359.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-02-01T17-50-44.669359.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_02_01T17_50_44.669359
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-01T17-50-44.669359.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-01T17-50-44.669359.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_02_01T17_50_44.669359
path:
- '**/details_harness|hendrycksTest-virology|5_2024-02-01T17-50-44.669359.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-02-01T17-50-44.669359.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_02_01T17_50_44.669359
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-01T17-50-44.669359.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-01T17-50-44.669359.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_02_01T17_50_44.669359
path:
- '**/details_harness|truthfulqa:mc|0_2024-02-01T17-50-44.669359.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-02-01T17-50-44.669359.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_02_01T17_50_44.669359
path:
- '**/details_harness|winogrande|5_2024-02-01T17-50-44.669359.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-02-01T17-50-44.669359.parquet'
- config_name: results
data_files:
- split: 2024_02_01T17_50_44.669359
path:
- results_2024-02-01T17-50-44.669359.parquet
- split: latest
path:
- results_2024-02-01T17-50-44.669359.parquet
---
# Dataset Card for Evaluation run of lamhieu/ghost-7b-v0.9.0
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [lamhieu/ghost-7b-v0.9.0](https://huggingface.co/lamhieu/ghost-7b-v0.9.0) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_lamhieu__ghost-7b-v0.9.0",
"harness_winogrande_5",
split="train")
```
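The timestamped split names visible in the config list above appear to follow a simple convention; a minimal sketch of the mapping (inferred from this card's own YAML, not a documented Hub API):

```python
# Sketch: derive the split name used in this card's configs from a run
# timestamp. This mapping is inferred from the YAML above (dashes and
# colons become underscores); it is not an official API.
run_timestamp = "2024-02-01T17:50:44.669359"
split_name = run_timestamp.replace("-", "_").replace(":", "_")
print(split_name)  # 2024_02_01T17_50_44.669359
```

Either this timestamped split or `"latest"` can be passed as the `split` argument when loading a configuration.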
## Latest results
These are the [latest results from run 2024-02-01T17:50:44.669359](https://huggingface.co/datasets/open-llm-leaderboard/details_lamhieu__ghost-7b-v0.9.0/blob/main/results_2024-02-01T17-50-44.669359.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.5499871299235607,
"acc_stderr": 0.03407586587227753,
"acc_norm": 0.5544447274332273,
"acc_norm_stderr": 0.03478665284686247,
"mc1": 0.3292533659730722,
"mc1_stderr": 0.016451264440068232,
"mc2": 0.4779306640850261,
"mc2_stderr": 0.015098925727831657
},
"harness|arc:challenge|25": {
"acc": 0.49658703071672355,
"acc_stderr": 0.014611050403244077,
"acc_norm": 0.5307167235494881,
"acc_norm_stderr": 0.014583792546304037
},
"harness|hellaswag|10": {
"acc": 0.5758812985461064,
"acc_stderr": 0.004931984642695341,
"acc_norm": 0.7793268273252341,
"acc_norm_stderr": 0.004138529919075824
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.26,
"acc_stderr": 0.04408440022768081,
"acc_norm": 0.26,
"acc_norm_stderr": 0.04408440022768081
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.4962962962962963,
"acc_stderr": 0.04319223625811331,
"acc_norm": 0.4962962962962963,
"acc_norm_stderr": 0.04319223625811331
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.5263157894736842,
"acc_stderr": 0.04063302731486671,
"acc_norm": 0.5263157894736842,
"acc_norm_stderr": 0.04063302731486671
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.55,
"acc_stderr": 0.05,
"acc_norm": 0.55,
"acc_norm_stderr": 0.05
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.630188679245283,
"acc_stderr": 0.029711421880107933,
"acc_norm": 0.630188679245283,
"acc_norm_stderr": 0.029711421880107933
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.5625,
"acc_stderr": 0.04148415739394154,
"acc_norm": 0.5625,
"acc_norm_stderr": 0.04148415739394154
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.44,
"acc_stderr": 0.04988876515698589,
"acc_norm": 0.44,
"acc_norm_stderr": 0.04988876515698589
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.47,
"acc_stderr": 0.05016135580465919,
"acc_norm": 0.47,
"acc_norm_stderr": 0.05016135580465919
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695235,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695235
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.5722543352601156,
"acc_stderr": 0.037724468575180276,
"acc_norm": 0.5722543352601156,
"acc_norm_stderr": 0.037724468575180276
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.29411764705882354,
"acc_stderr": 0.04533838195929776,
"acc_norm": 0.29411764705882354,
"acc_norm_stderr": 0.04533838195929776
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.67,
"acc_stderr": 0.04725815626252607,
"acc_norm": 0.67,
"acc_norm_stderr": 0.04725815626252607
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.4425531914893617,
"acc_stderr": 0.03246956919789958,
"acc_norm": 0.4425531914893617,
"acc_norm_stderr": 0.03246956919789958
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.40350877192982454,
"acc_stderr": 0.046151869625837026,
"acc_norm": 0.40350877192982454,
"acc_norm_stderr": 0.046151869625837026
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5172413793103449,
"acc_stderr": 0.04164188720169375,
"acc_norm": 0.5172413793103449,
"acc_norm_stderr": 0.04164188720169375
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.3386243386243386,
"acc_stderr": 0.024373197867983046,
"acc_norm": 0.3386243386243386,
"acc_norm_stderr": 0.024373197867983046
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.36507936507936506,
"acc_stderr": 0.04306241259127153,
"acc_norm": 0.36507936507936506,
"acc_norm_stderr": 0.04306241259127153
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.33,
"acc_stderr": 0.04725815626252605,
"acc_norm": 0.33,
"acc_norm_stderr": 0.04725815626252605
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.6612903225806451,
"acc_stderr": 0.02692344605930284,
"acc_norm": 0.6612903225806451,
"acc_norm_stderr": 0.02692344605930284
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.3842364532019704,
"acc_stderr": 0.0342239856565755,
"acc_norm": 0.3842364532019704,
"acc_norm_stderr": 0.0342239856565755
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.61,
"acc_stderr": 0.04902071300001974,
"acc_norm": 0.61,
"acc_norm_stderr": 0.04902071300001974
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.6848484848484848,
"acc_stderr": 0.0362773057502241,
"acc_norm": 0.6848484848484848,
"acc_norm_stderr": 0.0362773057502241
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7373737373737373,
"acc_stderr": 0.03135305009533086,
"acc_norm": 0.7373737373737373,
"acc_norm_stderr": 0.03135305009533086
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.7772020725388601,
"acc_stderr": 0.03003114797764154,
"acc_norm": 0.7772020725388601,
"acc_norm_stderr": 0.03003114797764154
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.5051282051282051,
"acc_stderr": 0.02534967290683865,
"acc_norm": 0.5051282051282051,
"acc_norm_stderr": 0.02534967290683865
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.3037037037037037,
"acc_stderr": 0.02803792996911499,
"acc_norm": 0.3037037037037037,
"acc_norm_stderr": 0.02803792996911499
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.5210084033613446,
"acc_stderr": 0.03244980849990029,
"acc_norm": 0.5210084033613446,
"acc_norm_stderr": 0.03244980849990029
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.33112582781456956,
"acc_stderr": 0.038425817186598696,
"acc_norm": 0.33112582781456956,
"acc_norm_stderr": 0.038425817186598696
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.7339449541284404,
"acc_stderr": 0.0189460223222256,
"acc_norm": 0.7339449541284404,
"acc_norm_stderr": 0.0189460223222256
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5370370370370371,
"acc_stderr": 0.03400603625538272,
"acc_norm": 0.5370370370370371,
"acc_norm_stderr": 0.03400603625538272
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.6911764705882353,
"acc_stderr": 0.03242661719827218,
"acc_norm": 0.6911764705882353,
"acc_norm_stderr": 0.03242661719827218
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.729957805907173,
"acc_stderr": 0.028900721906293426,
"acc_norm": 0.729957805907173,
"acc_norm_stderr": 0.028900721906293426
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6322869955156951,
"acc_stderr": 0.03236198350928276,
"acc_norm": 0.6322869955156951,
"acc_norm_stderr": 0.03236198350928276
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.6335877862595419,
"acc_stderr": 0.04225875451969638,
"acc_norm": 0.6335877862595419,
"acc_norm_stderr": 0.04225875451969638
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.6776859504132231,
"acc_stderr": 0.04266416363352168,
"acc_norm": 0.6776859504132231,
"acc_norm_stderr": 0.04266416363352168
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.6203703703703703,
"acc_stderr": 0.04691521224077742,
"acc_norm": 0.6203703703703703,
"acc_norm_stderr": 0.04691521224077742
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.6319018404907976,
"acc_stderr": 0.03789213935838396,
"acc_norm": 0.6319018404907976,
"acc_norm_stderr": 0.03789213935838396
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.4732142857142857,
"acc_stderr": 0.047389751192741546,
"acc_norm": 0.4732142857142857,
"acc_norm_stderr": 0.047389751192741546
},
"harness|hendrycksTest-management|5": {
"acc": 0.7087378640776699,
"acc_stderr": 0.044986763205729245,
"acc_norm": 0.7087378640776699,
"acc_norm_stderr": 0.044986763205729245
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8333333333333334,
"acc_stderr": 0.024414947304543674,
"acc_norm": 0.8333333333333334,
"acc_norm_stderr": 0.024414947304543674
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.59,
"acc_stderr": 0.049431107042371025,
"acc_norm": 0.59,
"acc_norm_stderr": 0.049431107042371025
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.7266922094508301,
"acc_stderr": 0.015936681062628556,
"acc_norm": 0.7266922094508301,
"acc_norm_stderr": 0.015936681062628556
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.6069364161849711,
"acc_stderr": 0.026296227915613663,
"acc_norm": 0.6069364161849711,
"acc_norm_stderr": 0.026296227915613663
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.2860335195530726,
"acc_stderr": 0.015113972129062129,
"acc_norm": 0.2860335195530726,
"acc_norm_stderr": 0.015113972129062129
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.5849673202614379,
"acc_stderr": 0.028213504177824093,
"acc_norm": 0.5849673202614379,
"acc_norm_stderr": 0.028213504177824093
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6559485530546624,
"acc_stderr": 0.026981478043648033,
"acc_norm": 0.6559485530546624,
"acc_norm_stderr": 0.026981478043648033
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.6111111111111112,
"acc_stderr": 0.027125115513166844,
"acc_norm": 0.6111111111111112,
"acc_norm_stderr": 0.027125115513166844
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.4326241134751773,
"acc_stderr": 0.02955545423677885,
"acc_norm": 0.4326241134751773,
"acc_norm_stderr": 0.02955545423677885
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.378748370273794,
"acc_stderr": 0.012389052105003732,
"acc_norm": 0.378748370273794,
"acc_norm_stderr": 0.012389052105003732
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.5367647058823529,
"acc_stderr": 0.030290619180485687,
"acc_norm": 0.5367647058823529,
"acc_norm_stderr": 0.030290619180485687
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.5571895424836601,
"acc_stderr": 0.02009508315457735,
"acc_norm": 0.5571895424836601,
"acc_norm_stderr": 0.02009508315457735
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6181818181818182,
"acc_stderr": 0.04653429807913508,
"acc_norm": 0.6181818181818182,
"acc_norm_stderr": 0.04653429807913508
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.5346938775510204,
"acc_stderr": 0.03193207024425314,
"acc_norm": 0.5346938775510204,
"acc_norm_stderr": 0.03193207024425314
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.7761194029850746,
"acc_stderr": 0.0294752502360172,
"acc_norm": 0.7761194029850746,
"acc_norm_stderr": 0.0294752502360172
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.72,
"acc_stderr": 0.04512608598542128,
"acc_norm": 0.72,
"acc_norm_stderr": 0.04512608598542128
},
"harness|hendrycksTest-virology|5": {
"acc": 0.41566265060240964,
"acc_stderr": 0.03836722176598052,
"acc_norm": 0.41566265060240964,
"acc_norm_stderr": 0.03836722176598052
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.7426900584795322,
"acc_stderr": 0.03352799844161865,
"acc_norm": 0.7426900584795322,
"acc_norm_stderr": 0.03352799844161865
},
"harness|truthfulqa:mc|0": {
"mc1": 0.3292533659730722,
"mc1_stderr": 0.016451264440068232,
"mc2": 0.4779306640850261,
"mc2_stderr": 0.015098925727831657
},
"harness|winogrande|5": {
"acc": 0.7371744277821626,
"acc_stderr": 0.01237092252726201
},
"harness|gsm8k|5": {
"acc": 0.3373768006065201,
"acc_stderr": 0.013023665136222093
}
}
```
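The `"all"` block aggregates the per-task scores. As an illustration of how such a macro-average is formed, here is a short sketch over three of the MMLU task accuracies listed above (a simple unweighted mean for illustration only, not necessarily the exact formula the leaderboard uses):

```python
# Unweighted macro-average over a few per-task accuracies from the JSON
# above. Illustrative only: the leaderboard's "all" entry averages over
# every evaluated task, not just these three.
task_acc = {
    "hendrycksTest-abstract_algebra": 0.26,
    "hendrycksTest-anatomy": 0.4962962962962963,
    "hendrycksTest-astronomy": 0.5263157894736842,
}
macro_avg = sum(task_acc.values()) / len(task_acc)
print(round(macro_avg, 4))  # 0.4275
```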
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
AquaV/mil-docs |
---
language:
- en
---
## What is this?
A curated selection of manuals and documents from the US military and other departments. All data was manually scraped from publicly available sources.
The PDFs and EPUB files were converted to Markdown using the amazing [Marker GitHub repository](https://github.com/VikParuchuri/marker) by Vik Paruchuri.
### Sources:
- [United States Army Central Army Repository](https://rdl.train.army.mil/)
- [Marines Publications](https://www.marines.mil/News/Publications)
- [Federation of American Scientists Intelligence Resource Program](https://irp.fas.org/doddir/index.html) |
open-llm-leaderboard/details_xzuyn__LLaMa-2-PeanutButter_v4-7B | ---
pretty_name: Evaluation run of xzuyn/LLaMa-2-PeanutButter_v4-7B
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [xzuyn/LLaMa-2-PeanutButter_v4-7B](https://huggingface.co/xzuyn/LLaMa-2-PeanutButter_v4-7B)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 61 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_xzuyn__LLaMa-2-PeanutButter_v4-7B\"\
,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\
\nThese are the [latest results from run 2023-08-29T15:15:59.631802](https://huggingface.co/datasets/open-llm-leaderboard/details_xzuyn__LLaMa-2-PeanutButter_v4-7B/blob/main/results_2023-08-29T15%3A15%3A59.631802.json):\n\
\n```python\n{\n \"all\": {\n \"acc\": 0.4754535953456773,\n \"\
acc_stderr\": 0.03543074449128995,\n \"acc_norm\": 0.4793512530654778,\n\
\ \"acc_norm_stderr\": 0.03541409593269912,\n \"mc1\": 0.26805385556915545,\n\
\ \"mc1_stderr\": 0.015506204722834557,\n \"mc2\": 0.42310904021377665,\n\
\ \"mc2_stderr\": 0.015624011969941223\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.507679180887372,\n \"acc_stderr\": 0.014609667440892567,\n\
\ \"acc_norm\": 0.5486348122866894,\n \"acc_norm_stderr\": 0.014542104569955265\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6188010356502689,\n\
\ \"acc_stderr\": 0.004846886929763466,\n \"acc_norm\": 0.8078072097191794,\n\
\ \"acc_norm_stderr\": 0.003932184843841659\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.31,\n \"acc_stderr\": 0.046482319871173156,\n \
\ \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.046482319871173156\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.45925925925925926,\n\
\ \"acc_stderr\": 0.04304979692464243,\n \"acc_norm\": 0.45925925925925926,\n\
\ \"acc_norm_stderr\": 0.04304979692464243\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.4276315789473684,\n \"acc_stderr\": 0.040260970832965585,\n\
\ \"acc_norm\": 0.4276315789473684,\n \"acc_norm_stderr\": 0.040260970832965585\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.47,\n\
\ \"acc_stderr\": 0.050161355804659205,\n \"acc_norm\": 0.47,\n \
\ \"acc_norm_stderr\": 0.050161355804659205\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.4867924528301887,\n \"acc_stderr\": 0.030762134874500482,\n\
\ \"acc_norm\": 0.4867924528301887,\n \"acc_norm_stderr\": 0.030762134874500482\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.5,\n\
\ \"acc_stderr\": 0.04181210050035455,\n \"acc_norm\": 0.5,\n \
\ \"acc_norm_stderr\": 0.04181210050035455\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.35,\n \"acc_stderr\": 0.047937248544110196,\n \
\ \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.047937248544110196\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"\
acc\": 0.39,\n \"acc_stderr\": 0.04902071300001975,\n \"acc_norm\"\
: 0.39,\n \"acc_norm_stderr\": 0.04902071300001975\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.35,\n \"acc_stderr\": 0.0479372485441102,\n \
\ \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.0479372485441102\n },\n\
\ \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.43352601156069365,\n\
\ \"acc_stderr\": 0.03778621079092056,\n \"acc_norm\": 0.43352601156069365,\n\
\ \"acc_norm_stderr\": 0.03778621079092056\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.20588235294117646,\n \"acc_stderr\": 0.04023382273617747,\n\
\ \"acc_norm\": 0.20588235294117646,\n \"acc_norm_stderr\": 0.04023382273617747\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.58,\n \"acc_stderr\": 0.04960449637488583,\n \"acc_norm\": 0.58,\n\
\ \"acc_norm_stderr\": 0.04960449637488583\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.42127659574468085,\n \"acc_stderr\": 0.03227834510146267,\n\
\ \"acc_norm\": 0.42127659574468085,\n \"acc_norm_stderr\": 0.03227834510146267\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.3333333333333333,\n\
\ \"acc_stderr\": 0.044346007015849245,\n \"acc_norm\": 0.3333333333333333,\n\
\ \"acc_norm_stderr\": 0.044346007015849245\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.4482758620689655,\n \"acc_stderr\": 0.041443118108781506,\n\
\ \"acc_norm\": 0.4482758620689655,\n \"acc_norm_stderr\": 0.041443118108781506\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.30158730158730157,\n \"acc_stderr\": 0.023636975996101796,\n \"\
acc_norm\": 0.30158730158730157,\n \"acc_norm_stderr\": 0.023636975996101796\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.29365079365079366,\n\
\ \"acc_stderr\": 0.04073524322147126,\n \"acc_norm\": 0.29365079365079366,\n\
\ \"acc_norm_stderr\": 0.04073524322147126\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \
\ \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.5032258064516129,\n\
\ \"acc_stderr\": 0.028443414226438316,\n \"acc_norm\": 0.5032258064516129,\n\
\ \"acc_norm_stderr\": 0.028443414226438316\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.3694581280788177,\n \"acc_stderr\": 0.03395970381998573,\n\
\ \"acc_norm\": 0.3694581280788177,\n \"acc_norm_stderr\": 0.03395970381998573\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.42,\n \"acc_stderr\": 0.049604496374885836,\n \"acc_norm\"\
: 0.42,\n \"acc_norm_stderr\": 0.049604496374885836\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.6242424242424243,\n \"acc_stderr\": 0.03781887353205982,\n\
\ \"acc_norm\": 0.6242424242424243,\n \"acc_norm_stderr\": 0.03781887353205982\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.5909090909090909,\n \"acc_stderr\": 0.03502975799413007,\n \"\
acc_norm\": 0.5909090909090909,\n \"acc_norm_stderr\": 0.03502975799413007\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.7202072538860104,\n \"acc_stderr\": 0.032396370467357036,\n\
\ \"acc_norm\": 0.7202072538860104,\n \"acc_norm_stderr\": 0.032396370467357036\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.4666666666666667,\n \"acc_stderr\": 0.025294608023986476,\n\
\ \"acc_norm\": 0.4666666666666667,\n \"acc_norm_stderr\": 0.025294608023986476\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.2814814814814815,\n \"acc_stderr\": 0.027420019350945287,\n \
\ \"acc_norm\": 0.2814814814814815,\n \"acc_norm_stderr\": 0.027420019350945287\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.4579831932773109,\n \"acc_stderr\": 0.03236361111951941,\n \
\ \"acc_norm\": 0.4579831932773109,\n \"acc_norm_stderr\": 0.03236361111951941\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.3576158940397351,\n \"acc_stderr\": 0.03913453431177258,\n \"\
acc_norm\": 0.3576158940397351,\n \"acc_norm_stderr\": 0.03913453431177258\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.6385321100917432,\n \"acc_stderr\": 0.020598082009937374,\n \"\
acc_norm\": 0.6385321100917432,\n \"acc_norm_stderr\": 0.020598082009937374\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.375,\n \"acc_stderr\": 0.033016908987210894,\n \"acc_norm\": 0.375,\n\
\ \"acc_norm_stderr\": 0.033016908987210894\n },\n \"harness|hendrycksTest-high_school_us_history|5\"\
: {\n \"acc\": 0.5686274509803921,\n \"acc_stderr\": 0.03476099060501636,\n\
\ \"acc_norm\": 0.5686274509803921,\n \"acc_norm_stderr\": 0.03476099060501636\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.5949367088607594,\n \"acc_stderr\": 0.03195514741370671,\n \
\ \"acc_norm\": 0.5949367088607594,\n \"acc_norm_stderr\": 0.03195514741370671\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.5381165919282511,\n\
\ \"acc_stderr\": 0.03346015011973228,\n \"acc_norm\": 0.5381165919282511,\n\
\ \"acc_norm_stderr\": 0.03346015011973228\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.5114503816793893,\n \"acc_stderr\": 0.043841400240780176,\n\
\ \"acc_norm\": 0.5114503816793893,\n \"acc_norm_stderr\": 0.043841400240780176\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.5619834710743802,\n \"acc_stderr\": 0.04529146804435792,\n \"\
acc_norm\": 0.5619834710743802,\n \"acc_norm_stderr\": 0.04529146804435792\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.4722222222222222,\n\
\ \"acc_stderr\": 0.04826217294139894,\n \"acc_norm\": 0.4722222222222222,\n\
\ \"acc_norm_stderr\": 0.04826217294139894\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.49693251533742333,\n \"acc_stderr\": 0.03928297078179663,\n\
\ \"acc_norm\": 0.49693251533742333,\n \"acc_norm_stderr\": 0.03928297078179663\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.375,\n\
\ \"acc_stderr\": 0.04595091388086298,\n \"acc_norm\": 0.375,\n \
\ \"acc_norm_stderr\": 0.04595091388086298\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.6019417475728155,\n \"acc_stderr\": 0.04846748253977239,\n\
\ \"acc_norm\": 0.6019417475728155,\n \"acc_norm_stderr\": 0.04846748253977239\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.688034188034188,\n\
\ \"acc_stderr\": 0.030351527323344948,\n \"acc_norm\": 0.688034188034188,\n\
\ \"acc_norm_stderr\": 0.030351527323344948\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.5,\n \"acc_stderr\": 0.050251890762960605,\n \
\ \"acc_norm\": 0.5,\n \"acc_norm_stderr\": 0.050251890762960605\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.6155810983397191,\n\
\ \"acc_stderr\": 0.01739568874281962,\n \"acc_norm\": 0.6155810983397191,\n\
\ \"acc_norm_stderr\": 0.01739568874281962\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.476878612716763,\n \"acc_stderr\": 0.026890297881303128,\n\
\ \"acc_norm\": 0.476878612716763,\n \"acc_norm_stderr\": 0.026890297881303128\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.2994413407821229,\n\
\ \"acc_stderr\": 0.015318257745976708,\n \"acc_norm\": 0.2994413407821229,\n\
\ \"acc_norm_stderr\": 0.015318257745976708\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.5261437908496732,\n \"acc_stderr\": 0.028590752958852387,\n\
\ \"acc_norm\": 0.5261437908496732,\n \"acc_norm_stderr\": 0.028590752958852387\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.5787781350482315,\n\
\ \"acc_stderr\": 0.02804339985821063,\n \"acc_norm\": 0.5787781350482315,\n\
\ \"acc_norm_stderr\": 0.02804339985821063\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.5061728395061729,\n \"acc_stderr\": 0.027818623962583295,\n\
\ \"acc_norm\": 0.5061728395061729,\n \"acc_norm_stderr\": 0.027818623962583295\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.3723404255319149,\n \"acc_stderr\": 0.02883892147125146,\n \
\ \"acc_norm\": 0.3723404255319149,\n \"acc_norm_stderr\": 0.02883892147125146\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.36766623207301175,\n\
\ \"acc_stderr\": 0.012314845910071691,\n \"acc_norm\": 0.36766623207301175,\n\
\ \"acc_norm_stderr\": 0.012314845910071691\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.5367647058823529,\n \"acc_stderr\": 0.030290619180485694,\n\
\ \"acc_norm\": 0.5367647058823529,\n \"acc_norm_stderr\": 0.030290619180485694\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.434640522875817,\n \"acc_stderr\": 0.02005426920072646,\n \
\ \"acc_norm\": 0.434640522875817,\n \"acc_norm_stderr\": 0.02005426920072646\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.509090909090909,\n\
\ \"acc_stderr\": 0.04788339768702861,\n \"acc_norm\": 0.509090909090909,\n\
\ \"acc_norm_stderr\": 0.04788339768702861\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.4897959183673469,\n \"acc_stderr\": 0.03200255347893783,\n\
\ \"acc_norm\": 0.4897959183673469,\n \"acc_norm_stderr\": 0.03200255347893783\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.6169154228855721,\n\
\ \"acc_stderr\": 0.0343751933733825,\n \"acc_norm\": 0.6169154228855721,\n\
\ \"acc_norm_stderr\": 0.0343751933733825\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.68,\n \"acc_stderr\": 0.04688261722621504,\n \
\ \"acc_norm\": 0.68,\n \"acc_norm_stderr\": 0.04688261722621504\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.42168674698795183,\n\
\ \"acc_stderr\": 0.03844453181770917,\n \"acc_norm\": 0.42168674698795183,\n\
\ \"acc_norm_stderr\": 0.03844453181770917\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.7076023391812866,\n \"acc_stderr\": 0.03488647713457923,\n\
\ \"acc_norm\": 0.7076023391812866,\n \"acc_norm_stderr\": 0.03488647713457923\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.26805385556915545,\n\
\ \"mc1_stderr\": 0.015506204722834557,\n \"mc2\": 0.42310904021377665,\n\
\ \"mc2_stderr\": 0.015624011969941223\n }\n}\n```"
repo_url: https://huggingface.co/xzuyn/LLaMa-2-PeanutButter_v4-7B
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_08_29T15_15_59.631802
path:
- '**/details_harness|arc:challenge|25_2023-08-29T15:15:59.631802.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-08-29T15:15:59.631802.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_08_29T15_15_59.631802
path:
- '**/details_harness|hellaswag|10_2023-08-29T15:15:59.631802.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-08-29T15:15:59.631802.parquet'
- config_name: harness_hendrycksTest
data_files:
- split: 2023_08_29T15_15_59.631802
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-29T15:15:59.631802.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-29T15:15:59.631802.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-29T15:15:59.631802.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-29T15:15:59.631802.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-29T15:15:59.631802.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-29T15:15:59.631802.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-29T15:15:59.631802.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-29T15:15:59.631802.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-29T15:15:59.631802.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-29T15:15:59.631802.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-29T15:15:59.631802.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-29T15:15:59.631802.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-29T15:15:59.631802.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-29T15:15:59.631802.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-29T15:15:59.631802.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-29T15:15:59.631802.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-29T15:15:59.631802.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-29T15:15:59.631802.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-29T15:15:59.631802.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-29T15:15:59.631802.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-29T15:15:59.631802.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-29T15:15:59.631802.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-29T15:15:59.631802.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-29T15:15:59.631802.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-29T15:15:59.631802.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-29T15:15:59.631802.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-29T15:15:59.631802.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-29T15:15:59.631802.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-29T15:15:59.631802.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-29T15:15:59.631802.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-29T15:15:59.631802.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-29T15:15:59.631802.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-29T15:15:59.631802.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-29T15:15:59.631802.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-29T15:15:59.631802.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-29T15:15:59.631802.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-29T15:15:59.631802.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-29T15:15:59.631802.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-29T15:15:59.631802.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-29T15:15:59.631802.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-29T15:15:59.631802.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-29T15:15:59.631802.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-29T15:15:59.631802.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-29T15:15:59.631802.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-29T15:15:59.631802.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-29T15:15:59.631802.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-29T15:15:59.631802.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-29T15:15:59.631802.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-29T15:15:59.631802.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-29T15:15:59.631802.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-29T15:15:59.631802.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-29T15:15:59.631802.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-29T15:15:59.631802.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-29T15:15:59.631802.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-29T15:15:59.631802.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-29T15:15:59.631802.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-29T15:15:59.631802.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-29T15:15:59.631802.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-29T15:15:59.631802.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-29T15:15:59.631802.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-29T15:15:59.631802.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-29T15:15:59.631802.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-29T15:15:59.631802.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-29T15:15:59.631802.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-29T15:15:59.631802.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-29T15:15:59.631802.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-29T15:15:59.631802.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-29T15:15:59.631802.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-29T15:15:59.631802.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-29T15:15:59.631802.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-29T15:15:59.631802.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-29T15:15:59.631802.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-29T15:15:59.631802.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-29T15:15:59.631802.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-29T15:15:59.631802.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-29T15:15:59.631802.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-29T15:15:59.631802.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-29T15:15:59.631802.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-29T15:15:59.631802.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-29T15:15:59.631802.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-29T15:15:59.631802.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-29T15:15:59.631802.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-29T15:15:59.631802.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-29T15:15:59.631802.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-29T15:15:59.631802.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-29T15:15:59.631802.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-29T15:15:59.631802.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-29T15:15:59.631802.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-29T15:15:59.631802.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-29T15:15:59.631802.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-29T15:15:59.631802.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-29T15:15:59.631802.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-29T15:15:59.631802.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-29T15:15:59.631802.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-29T15:15:59.631802.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-29T15:15:59.631802.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-29T15:15:59.631802.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-29T15:15:59.631802.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-29T15:15:59.631802.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-29T15:15:59.631802.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-29T15:15:59.631802.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-29T15:15:59.631802.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-29T15:15:59.631802.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-29T15:15:59.631802.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-29T15:15:59.631802.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-29T15:15:59.631802.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-29T15:15:59.631802.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-29T15:15:59.631802.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-29T15:15:59.631802.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-29T15:15:59.631802.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-29T15:15:59.631802.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-29T15:15:59.631802.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-29T15:15:59.631802.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-29T15:15:59.631802.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_08_29T15_15_59.631802
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-29T15:15:59.631802.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-29T15:15:59.631802.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_08_29T15_15_59.631802
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-29T15:15:59.631802.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-29T15:15:59.631802.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_08_29T15_15_59.631802
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-29T15:15:59.631802.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-29T15:15:59.631802.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_08_29T15_15_59.631802
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-29T15:15:59.631802.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-29T15:15:59.631802.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_08_29T15_15_59.631802
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-29T15:15:59.631802.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-29T15:15:59.631802.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_08_29T15_15_59.631802
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-29T15:15:59.631802.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-29T15:15:59.631802.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_08_29T15_15_59.631802
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-29T15:15:59.631802.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-29T15:15:59.631802.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_08_29T15_15_59.631802
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-29T15:15:59.631802.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-29T15:15:59.631802.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_08_29T15_15_59.631802
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-29T15:15:59.631802.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-29T15:15:59.631802.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_08_29T15_15_59.631802
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-29T15:15:59.631802.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-29T15:15:59.631802.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_08_29T15_15_59.631802
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-29T15:15:59.631802.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-29T15:15:59.631802.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_08_29T15_15_59.631802
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-29T15:15:59.631802.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-29T15:15:59.631802.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_08_29T15_15_59.631802
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-29T15:15:59.631802.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-29T15:15:59.631802.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_08_29T15_15_59.631802
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-29T15:15:59.631802.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-29T15:15:59.631802.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_08_29T15_15_59.631802
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-29T15:15:59.631802.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-29T15:15:59.631802.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_08_29T15_15_59.631802
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-29T15:15:59.631802.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-29T15:15:59.631802.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_08_29T15_15_59.631802
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-29T15:15:59.631802.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-29T15:15:59.631802.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_08_29T15_15_59.631802
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-29T15:15:59.631802.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-29T15:15:59.631802.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_08_29T15_15_59.631802
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-29T15:15:59.631802.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-29T15:15:59.631802.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_08_29T15_15_59.631802
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-29T15:15:59.631802.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-29T15:15:59.631802.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_08_29T15_15_59.631802
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-29T15:15:59.631802.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-29T15:15:59.631802.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_08_29T15_15_59.631802
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-29T15:15:59.631802.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-29T15:15:59.631802.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_08_29T15_15_59.631802
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-29T15:15:59.631802.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-29T15:15:59.631802.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_08_29T15_15_59.631802
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-29T15:15:59.631802.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-29T15:15:59.631802.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_08_29T15_15_59.631802
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-29T15:15:59.631802.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-29T15:15:59.631802.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_08_29T15_15_59.631802
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-29T15:15:59.631802.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-29T15:15:59.631802.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_08_29T15_15_59.631802
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-29T15:15:59.631802.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-29T15:15:59.631802.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_08_29T15_15_59.631802
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-29T15:15:59.631802.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-29T15:15:59.631802.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_08_29T15_15_59.631802
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-29T15:15:59.631802.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-29T15:15:59.631802.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_08_29T15_15_59.631802
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-29T15:15:59.631802.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-29T15:15:59.631802.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_08_29T15_15_59.631802
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-29T15:15:59.631802.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-29T15:15:59.631802.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_08_29T15_15_59.631802
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-29T15:15:59.631802.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-29T15:15:59.631802.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_08_29T15_15_59.631802
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-29T15:15:59.631802.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-29T15:15:59.631802.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_08_29T15_15_59.631802
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-29T15:15:59.631802.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-29T15:15:59.631802.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_08_29T15_15_59.631802
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-29T15:15:59.631802.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-29T15:15:59.631802.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_08_29T15_15_59.631802
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-29T15:15:59.631802.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-29T15:15:59.631802.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_08_29T15_15_59.631802
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-29T15:15:59.631802.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-29T15:15:59.631802.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_08_29T15_15_59.631802
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-29T15:15:59.631802.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-29T15:15:59.631802.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_08_29T15_15_59.631802
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-29T15:15:59.631802.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-29T15:15:59.631802.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_08_29T15_15_59.631802
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-29T15:15:59.631802.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-29T15:15:59.631802.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_08_29T15_15_59.631802
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-29T15:15:59.631802.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-29T15:15:59.631802.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_08_29T15_15_59.631802
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-29T15:15:59.631802.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-29T15:15:59.631802.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_08_29T15_15_59.631802
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-29T15:15:59.631802.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-29T15:15:59.631802.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_08_29T15_15_59.631802
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-29T15:15:59.631802.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-29T15:15:59.631802.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_08_29T15_15_59.631802
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-29T15:15:59.631802.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-29T15:15:59.631802.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_08_29T15_15_59.631802
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-29T15:15:59.631802.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-29T15:15:59.631802.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_08_29T15_15_59.631802
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-29T15:15:59.631802.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-29T15:15:59.631802.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_08_29T15_15_59.631802
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-29T15:15:59.631802.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-29T15:15:59.631802.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_08_29T15_15_59.631802
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-29T15:15:59.631802.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-29T15:15:59.631802.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_08_29T15_15_59.631802
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-29T15:15:59.631802.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-29T15:15:59.631802.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_08_29T15_15_59.631802
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-29T15:15:59.631802.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-29T15:15:59.631802.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_08_29T15_15_59.631802
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-29T15:15:59.631802.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-29T15:15:59.631802.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_08_29T15_15_59.631802
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-29T15:15:59.631802.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-29T15:15:59.631802.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_08_29T15_15_59.631802
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-29T15:15:59.631802.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-29T15:15:59.631802.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_08_29T15_15_59.631802
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-29T15:15:59.631802.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-29T15:15:59.631802.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_08_29T15_15_59.631802
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-29T15:15:59.631802.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-29T15:15:59.631802.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_08_29T15_15_59.631802
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-29T15:15:59.631802.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-29T15:15:59.631802.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_08_29T15_15_59.631802
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-29T15:15:59.631802.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-29T15:15:59.631802.parquet'
- config_name: results
data_files:
- split: 2023_08_29T15_15_59.631802
path:
- results_2023-08-29T15:15:59.631802.parquet
- split: latest
path:
- results_2023-08-29T15:15:59.631802.parquet
---
# Dataset Card for Evaluation run of xzuyn/LLaMa-2-PeanutButter_v4-7B
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/xzuyn/LLaMa-2-PeanutButter_v4-7B
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [xzuyn/LLaMa-2-PeanutButter_v4-7B](https://huggingface.co/xzuyn/LLaMa-2-PeanutButter_v4-7B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_xzuyn__LLaMa-2-PeanutButter_v4-7B",
"harness_truthfulqa_mc_0",
split="train")
```
## Latest results
These are the [latest results from run 2023-08-29T15:15:59.631802](https://huggingface.co/datasets/open-llm-leaderboard/details_xzuyn__LLaMa-2-PeanutButter_v4-7B/blob/main/results_2023-08-29T15%3A15%3A59.631802.json):
```python
{
"all": {
"acc": 0.4754535953456773,
"acc_stderr": 0.03543074449128995,
"acc_norm": 0.4793512530654778,
"acc_norm_stderr": 0.03541409593269912,
"mc1": 0.26805385556915545,
"mc1_stderr": 0.015506204722834557,
"mc2": 0.42310904021377665,
"mc2_stderr": 0.015624011969941223
},
"harness|arc:challenge|25": {
"acc": 0.507679180887372,
"acc_stderr": 0.014609667440892567,
"acc_norm": 0.5486348122866894,
"acc_norm_stderr": 0.014542104569955265
},
"harness|hellaswag|10": {
"acc": 0.6188010356502689,
"acc_stderr": 0.004846886929763466,
"acc_norm": 0.8078072097191794,
"acc_norm_stderr": 0.003932184843841659
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.31,
"acc_stderr": 0.046482319871173156,
"acc_norm": 0.31,
"acc_norm_stderr": 0.046482319871173156
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.45925925925925926,
"acc_stderr": 0.04304979692464243,
"acc_norm": 0.45925925925925926,
"acc_norm_stderr": 0.04304979692464243
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.4276315789473684,
"acc_stderr": 0.040260970832965585,
"acc_norm": 0.4276315789473684,
"acc_norm_stderr": 0.040260970832965585
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.47,
"acc_stderr": 0.050161355804659205,
"acc_norm": 0.47,
"acc_norm_stderr": 0.050161355804659205
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.4867924528301887,
"acc_stderr": 0.030762134874500482,
"acc_norm": 0.4867924528301887,
"acc_norm_stderr": 0.030762134874500482
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.5,
"acc_stderr": 0.04181210050035455,
"acc_norm": 0.5,
"acc_norm_stderr": 0.04181210050035455
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.35,
"acc_stderr": 0.047937248544110196,
"acc_norm": 0.35,
"acc_norm_stderr": 0.047937248544110196
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.39,
"acc_stderr": 0.04902071300001975,
"acc_norm": 0.39,
"acc_norm_stderr": 0.04902071300001975
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.35,
"acc_stderr": 0.0479372485441102,
"acc_norm": 0.35,
"acc_norm_stderr": 0.0479372485441102
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.43352601156069365,
"acc_stderr": 0.03778621079092056,
"acc_norm": 0.43352601156069365,
"acc_norm_stderr": 0.03778621079092056
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.20588235294117646,
"acc_stderr": 0.04023382273617747,
"acc_norm": 0.20588235294117646,
"acc_norm_stderr": 0.04023382273617747
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.58,
"acc_stderr": 0.04960449637488583,
"acc_norm": 0.58,
"acc_norm_stderr": 0.04960449637488583
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.42127659574468085,
"acc_stderr": 0.03227834510146267,
"acc_norm": 0.42127659574468085,
"acc_norm_stderr": 0.03227834510146267
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.3333333333333333,
"acc_stderr": 0.044346007015849245,
"acc_norm": 0.3333333333333333,
"acc_norm_stderr": 0.044346007015849245
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.4482758620689655,
"acc_stderr": 0.041443118108781506,
"acc_norm": 0.4482758620689655,
"acc_norm_stderr": 0.041443118108781506
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.30158730158730157,
"acc_stderr": 0.023636975996101796,
"acc_norm": 0.30158730158730157,
"acc_norm_stderr": 0.023636975996101796
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.29365079365079366,
"acc_stderr": 0.04073524322147126,
"acc_norm": 0.29365079365079366,
"acc_norm_stderr": 0.04073524322147126
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.5032258064516129,
"acc_stderr": 0.028443414226438316,
"acc_norm": 0.5032258064516129,
"acc_norm_stderr": 0.028443414226438316
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.3694581280788177,
"acc_stderr": 0.03395970381998573,
"acc_norm": 0.3694581280788177,
"acc_norm_stderr": 0.03395970381998573
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.42,
"acc_stderr": 0.049604496374885836,
"acc_norm": 0.42,
"acc_norm_stderr": 0.049604496374885836
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.6242424242424243,
"acc_stderr": 0.03781887353205982,
"acc_norm": 0.6242424242424243,
"acc_norm_stderr": 0.03781887353205982
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.5909090909090909,
"acc_stderr": 0.03502975799413007,
"acc_norm": 0.5909090909090909,
"acc_norm_stderr": 0.03502975799413007
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.7202072538860104,
"acc_stderr": 0.032396370467357036,
"acc_norm": 0.7202072538860104,
"acc_norm_stderr": 0.032396370467357036
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.4666666666666667,
"acc_stderr": 0.025294608023986476,
"acc_norm": 0.4666666666666667,
"acc_norm_stderr": 0.025294608023986476
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.2814814814814815,
"acc_stderr": 0.027420019350945287,
"acc_norm": 0.2814814814814815,
"acc_norm_stderr": 0.027420019350945287
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.4579831932773109,
"acc_stderr": 0.03236361111951941,
"acc_norm": 0.4579831932773109,
"acc_norm_stderr": 0.03236361111951941
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.3576158940397351,
"acc_stderr": 0.03913453431177258,
"acc_norm": 0.3576158940397351,
"acc_norm_stderr": 0.03913453431177258
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.6385321100917432,
"acc_stderr": 0.020598082009937374,
"acc_norm": 0.6385321100917432,
"acc_norm_stderr": 0.020598082009937374
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.375,
"acc_stderr": 0.033016908987210894,
"acc_norm": 0.375,
"acc_norm_stderr": 0.033016908987210894
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.5686274509803921,
"acc_stderr": 0.03476099060501636,
"acc_norm": 0.5686274509803921,
"acc_norm_stderr": 0.03476099060501636
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.5949367088607594,
"acc_stderr": 0.03195514741370671,
"acc_norm": 0.5949367088607594,
"acc_norm_stderr": 0.03195514741370671
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.5381165919282511,
"acc_stderr": 0.03346015011973228,
"acc_norm": 0.5381165919282511,
"acc_norm_stderr": 0.03346015011973228
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.5114503816793893,
"acc_stderr": 0.043841400240780176,
"acc_norm": 0.5114503816793893,
"acc_norm_stderr": 0.043841400240780176
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.5619834710743802,
"acc_stderr": 0.04529146804435792,
"acc_norm": 0.5619834710743802,
"acc_norm_stderr": 0.04529146804435792
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.4722222222222222,
"acc_stderr": 0.04826217294139894,
"acc_norm": 0.4722222222222222,
"acc_norm_stderr": 0.04826217294139894
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.49693251533742333,
"acc_stderr": 0.03928297078179663,
"acc_norm": 0.49693251533742333,
"acc_norm_stderr": 0.03928297078179663
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.375,
"acc_stderr": 0.04595091388086298,
"acc_norm": 0.375,
"acc_norm_stderr": 0.04595091388086298
},
"harness|hendrycksTest-management|5": {
"acc": 0.6019417475728155,
"acc_stderr": 0.04846748253977239,
"acc_norm": 0.6019417475728155,
"acc_norm_stderr": 0.04846748253977239
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.688034188034188,
"acc_stderr": 0.030351527323344948,
"acc_norm": 0.688034188034188,
"acc_norm_stderr": 0.030351527323344948
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.5,
"acc_stderr": 0.050251890762960605,
"acc_norm": 0.5,
"acc_norm_stderr": 0.050251890762960605
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.6155810983397191,
"acc_stderr": 0.01739568874281962,
"acc_norm": 0.6155810983397191,
"acc_norm_stderr": 0.01739568874281962
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.476878612716763,
"acc_stderr": 0.026890297881303128,
"acc_norm": 0.476878612716763,
"acc_norm_stderr": 0.026890297881303128
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.2994413407821229,
"acc_stderr": 0.015318257745976708,
"acc_norm": 0.2994413407821229,
"acc_norm_stderr": 0.015318257745976708
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.5261437908496732,
"acc_stderr": 0.028590752958852387,
"acc_norm": 0.5261437908496732,
"acc_norm_stderr": 0.028590752958852387
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.5787781350482315,
"acc_stderr": 0.02804339985821063,
"acc_norm": 0.5787781350482315,
"acc_norm_stderr": 0.02804339985821063
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.5061728395061729,
"acc_stderr": 0.027818623962583295,
"acc_norm": 0.5061728395061729,
"acc_norm_stderr": 0.027818623962583295
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.3723404255319149,
"acc_stderr": 0.02883892147125146,
"acc_norm": 0.3723404255319149,
"acc_norm_stderr": 0.02883892147125146
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.36766623207301175,
"acc_stderr": 0.012314845910071691,
"acc_norm": 0.36766623207301175,
"acc_norm_stderr": 0.012314845910071691
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.5367647058823529,
"acc_stderr": 0.030290619180485694,
"acc_norm": 0.5367647058823529,
"acc_norm_stderr": 0.030290619180485694
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.434640522875817,
"acc_stderr": 0.02005426920072646,
"acc_norm": 0.434640522875817,
"acc_norm_stderr": 0.02005426920072646
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.509090909090909,
"acc_stderr": 0.04788339768702861,
"acc_norm": 0.509090909090909,
"acc_norm_stderr": 0.04788339768702861
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.4897959183673469,
"acc_stderr": 0.03200255347893783,
"acc_norm": 0.4897959183673469,
"acc_norm_stderr": 0.03200255347893783
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.6169154228855721,
"acc_stderr": 0.0343751933733825,
"acc_norm": 0.6169154228855721,
"acc_norm_stderr": 0.0343751933733825
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.68,
"acc_stderr": 0.04688261722621504,
"acc_norm": 0.68,
"acc_norm_stderr": 0.04688261722621504
},
"harness|hendrycksTest-virology|5": {
"acc": 0.42168674698795183,
"acc_stderr": 0.03844453181770917,
"acc_norm": 0.42168674698795183,
"acc_norm_stderr": 0.03844453181770917
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.7076023391812866,
"acc_stderr": 0.03488647713457923,
"acc_norm": 0.7076023391812866,
"acc_norm_stderr": 0.03488647713457923
},
"harness|truthfulqa:mc|0": {
"mc1": 0.26805385556915545,
"mc1_stderr": 0.015506204722834557,
"mc2": 0.42310904021377665,
"mc2_stderr": 0.015624011969941223
}
}
```
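As a small illustration of working with a results dictionary of this shape, one might rank the per-task accuracies to find the model's weakest subjects. This is a sketch, not part of the evaluation harness; the sample entries below are copied from the results above.

```python
# Rank tasks by accuracy in a results dict shaped like the one above.
# Sample entries copied from the displayed results.
results = {
    "all": {"acc": 0.4754535953456773},
    "harness|hendrycksTest-college_physics|5": {"acc": 0.20588235294117646},
    "harness|hendrycksTest-world_religions|5": {"acc": 0.7076023391812866},
    "harness|hendrycksTest-virology|5": {"acc": 0.42168674698795183},
}

def weakest_tasks(results: dict, n: int = 3) -> list[str]:
    """Return the n task names with the lowest 'acc', ignoring the 'all' aggregate."""
    tasks = {name: metrics["acc"] for name, metrics in results.items()
             if name != "all" and "acc" in metrics}
    return sorted(tasks, key=tasks.get)[:n]

print(weakest_tasks(results, 2))
# -> ['harness|hendrycksTest-college_physics|5', 'harness|hendrycksTest-virology|5']
```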
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
skeskinen/books3_lowgrade_paragraphs | ---
dataset_info:
features:
- name: text
dtype: string
- name: book
dtype: string
- name: pos
dtype: float64
- name: smog_index
dtype: float64
splits:
- name: train
num_bytes: 6426499179
num_examples: 29542059
download_size: 3274999825
dataset_size: 6426499179
---
# Dataset Card for "books3_lowgrade_paragraphs"
the_pile books3, books with a SMOG grade difficulty estimate between 6.6 and 7.1. Split into paragraphs and filtered to remove most 'non-paragraphs' like titles, tables of contents, etc.
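The SMOG grade used for filtering is a standard readability formula. As a rough illustration (not the exact implementation used to build this dataset — the syllable counter below is a crude vowel-group heuristic), it can be sketched as:

```python
import math
import re

def count_syllables(word: str) -> int:
    # Crude heuristic: count runs of vowels (including y) as syllables.
    return max(1, len(re.findall(r"[aeiouy]+", word.lower())))

def smog_index(text: str) -> float:
    """SMOG grade: 1.0430 * sqrt(polysyllables * 30 / sentences) + 3.1291."""
    sentences = [s for s in re.split(r"[.!?]+", text) if s.strip()]
    words = re.findall(r"[A-Za-z']+", text)
    polysyllables = sum(1 for w in words if count_syllables(w) >= 3)
    return 1.0430 * math.sqrt(polysyllables * 30 / max(1, len(sentences))) + 3.1291
```

Texts with no polysyllabic words bottom out at the formula's constant (about 3.13), while dense prose scores much higher, which is how a 6.6–7.1 band selects relatively easy books.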
For easier books, see books3_basic_paragraphs |
tyzhu/squad_qa_wrong_rare_v5_full_recite_ans_sent_last_permute_rerun | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
dataset_info:
features:
- name: id
dtype: string
- name: title
dtype: string
- name: context
dtype: string
- name: question
dtype: string
- name: answers
sequence:
- name: text
dtype: string
- name: answer_start
dtype: int32
- name: answer
dtype: string
- name: context_id
dtype: string
- name: correct_id
dtype: string
- name: inputs
dtype: string
- name: targets
dtype: string
splits:
- name: train
num_bytes: 7960034.039930323
num_examples: 4778
- name: validation
num_bytes: 409972
num_examples: 300
download_size: 1609569
dataset_size: 8370006.039930323
---
# Dataset Card for "squad_qa_wrong_rare_v5_full_recite_ans_sent_last_permute_rerun"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
baixue6269/character-profiles-romance-output | ---
dataset_info:
features:
- name: name
dtype: string
- name: categories
sequence: string
- name: personalities
sequence: string
- name: description
dtype: string
- name: conversation
list:
- name: content
dtype: string
- name: role
dtype: string
splits:
- name: train
num_bytes: 103050
num_examples: 10
download_size: 67798
dataset_size: 103050
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "character-profiles-romance-output"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
distilled-from-one-sec-cv12/chunk_24 | ---
dataset_info:
features:
- name: logits
sequence: float32
- name: mfcc
sequence:
sequence: float64
splits:
- name: train
num_bytes: 902826572
num_examples: 175921
download_size: 922576593
dataset_size: 902826572
---
# Dataset Card for "chunk_24"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
liuyanchen1015/MULTI_VALUE_qqp_present_perfect_ever | ---
dataset_info:
features:
- name: question1
dtype: string
- name: question2
dtype: string
- name: label
dtype: int64
- name: idx
dtype: int64
- name: value_score
dtype: int64
splits:
- name: dev
num_bytes: 282286
num_examples: 1494
- name: test
num_bytes: 2674842
num_examples: 13910
- name: train
num_bytes: 2532159
num_examples: 13233
download_size: 3302161
dataset_size: 5489287
---
# Dataset Card for "MULTI_VALUE_qqp_present_perfect_ever"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
acozma/fill50k | ---
dataset_info:
features:
- name: image
dtype: image
- name: conditioning_image
dtype: image
- name: text
dtype: string
splits:
- name: train
num_bytes: 451820630.0
num_examples: 50000
download_size: 323967497
dataset_size: 451820630.0
---
# Dataset Card for "fill50k"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
open-llm-leaderboard/details_chatty123__mistral_rank8_packing | ---
pretty_name: Evaluation run of chatty123/mistral_rank8_packing
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [chatty123/mistral_rank8_packing](https://huggingface.co/chatty123/mistral_rank8_packing)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_chatty123__mistral_rank8_packing\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-04-15T17:27:04.020928](https://huggingface.co/datasets/open-llm-leaderboard/details_chatty123__mistral_rank8_packing/blob/main/results_2024-04-15T17-27-04.020928.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6033731800666036,\n\
\ \"acc_stderr\": 0.033302568568672614,\n \"acc_norm\": 0.6082618374775105,\n\
\ \"acc_norm_stderr\": 0.03397805840343756,\n \"mc1\": 0.5177478580171359,\n\
\ \"mc1_stderr\": 0.017492470843075356,\n \"mc2\": 0.6730810498532811,\n\
\ \"mc2_stderr\": 0.01524685826294553\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.575938566552901,\n \"acc_stderr\": 0.014441889627464392,\n\
\ \"acc_norm\": 0.6254266211604096,\n \"acc_norm_stderr\": 0.014144193471893454\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6571400119498108,\n\
\ \"acc_stderr\": 0.004736950810617793,\n \"acc_norm\": 0.8477394941246763,\n\
\ \"acc_norm_stderr\": 0.003585389636472376\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.35,\n \"acc_stderr\": 0.04793724854411021,\n \
\ \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.04793724854411021\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.5777777777777777,\n\
\ \"acc_stderr\": 0.04266763404099582,\n \"acc_norm\": 0.5777777777777777,\n\
\ \"acc_norm_stderr\": 0.04266763404099582\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.6118421052631579,\n \"acc_stderr\": 0.03965842097512744,\n\
\ \"acc_norm\": 0.6118421052631579,\n \"acc_norm_stderr\": 0.03965842097512744\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.58,\n\
\ \"acc_stderr\": 0.049604496374885836,\n \"acc_norm\": 0.58,\n \
\ \"acc_norm_stderr\": 0.049604496374885836\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.6754716981132075,\n \"acc_stderr\": 0.02881561571343211,\n\
\ \"acc_norm\": 0.6754716981132075,\n \"acc_norm_stderr\": 0.02881561571343211\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.6597222222222222,\n\
\ \"acc_stderr\": 0.039621355734862175,\n \"acc_norm\": 0.6597222222222222,\n\
\ \"acc_norm_stderr\": 0.039621355734862175\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.37,\n \"acc_stderr\": 0.04852365870939099,\n \
\ \"acc_norm\": 0.37,\n \"acc_norm_stderr\": 0.04852365870939099\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.49,\n \"acc_stderr\": 0.05024183937956913,\n \"acc_norm\": 0.49,\n\
\ \"acc_norm_stderr\": 0.05024183937956913\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.37,\n \"acc_stderr\": 0.04852365870939099,\n \
\ \"acc_norm\": 0.37,\n \"acc_norm_stderr\": 0.04852365870939099\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.5838150289017341,\n\
\ \"acc_stderr\": 0.03758517775404947,\n \"acc_norm\": 0.5838150289017341,\n\
\ \"acc_norm_stderr\": 0.03758517775404947\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.43137254901960786,\n \"acc_stderr\": 0.04928099597287534,\n\
\ \"acc_norm\": 0.43137254901960786,\n \"acc_norm_stderr\": 0.04928099597287534\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.72,\n \"acc_stderr\": 0.04512608598542128,\n \"acc_norm\": 0.72,\n\
\ \"acc_norm_stderr\": 0.04512608598542128\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.5148936170212766,\n \"acc_stderr\": 0.03267151848924777,\n\
\ \"acc_norm\": 0.5148936170212766,\n \"acc_norm_stderr\": 0.03267151848924777\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.4298245614035088,\n\
\ \"acc_stderr\": 0.04657047260594963,\n \"acc_norm\": 0.4298245614035088,\n\
\ \"acc_norm_stderr\": 0.04657047260594963\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5655172413793104,\n \"acc_stderr\": 0.04130740879555497,\n\
\ \"acc_norm\": 0.5655172413793104,\n \"acc_norm_stderr\": 0.04130740879555497\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.38095238095238093,\n \"acc_stderr\": 0.025010749116137605,\n \"\
acc_norm\": 0.38095238095238093,\n \"acc_norm_stderr\": 0.025010749116137605\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.38095238095238093,\n\
\ \"acc_stderr\": 0.04343525428949098,\n \"acc_norm\": 0.38095238095238093,\n\
\ \"acc_norm_stderr\": 0.04343525428949098\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.4,\n \"acc_stderr\": 0.049236596391733084,\n \
\ \"acc_norm\": 0.4,\n \"acc_norm_stderr\": 0.049236596391733084\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.6774193548387096,\n\
\ \"acc_stderr\": 0.026593084516572274,\n \"acc_norm\": 0.6774193548387096,\n\
\ \"acc_norm_stderr\": 0.026593084516572274\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.4876847290640394,\n \"acc_stderr\": 0.035169204442208966,\n\
\ \"acc_norm\": 0.4876847290640394,\n \"acc_norm_stderr\": 0.035169204442208966\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.63,\n \"acc_stderr\": 0.04852365870939099,\n \"acc_norm\"\
: 0.63,\n \"acc_norm_stderr\": 0.04852365870939099\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7212121212121212,\n \"acc_stderr\": 0.03501438706296781,\n\
\ \"acc_norm\": 0.7212121212121212,\n \"acc_norm_stderr\": 0.03501438706296781\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.7474747474747475,\n \"acc_stderr\": 0.030954055470365897,\n \"\
acc_norm\": 0.7474747474747475,\n \"acc_norm_stderr\": 0.030954055470365897\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.844559585492228,\n \"acc_stderr\": 0.026148483469153314,\n\
\ \"acc_norm\": 0.844559585492228,\n \"acc_norm_stderr\": 0.026148483469153314\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.5538461538461539,\n \"acc_stderr\": 0.02520357177302833,\n \
\ \"acc_norm\": 0.5538461538461539,\n \"acc_norm_stderr\": 0.02520357177302833\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.3333333333333333,\n \"acc_stderr\": 0.02874204090394848,\n \
\ \"acc_norm\": 0.3333333333333333,\n \"acc_norm_stderr\": 0.02874204090394848\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.6386554621848739,\n \"acc_stderr\": 0.03120469122515002,\n \
\ \"acc_norm\": 0.6386554621848739,\n \"acc_norm_stderr\": 0.03120469122515002\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.3509933774834437,\n \"acc_stderr\": 0.03896981964257375,\n \"\
acc_norm\": 0.3509933774834437,\n \"acc_norm_stderr\": 0.03896981964257375\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8,\n \"acc_stderr\": 0.017149858514250955,\n \"acc_norm\": 0.8,\n\
\ \"acc_norm_stderr\": 0.017149858514250955\n },\n \"harness|hendrycksTest-high_school_statistics|5\"\
: {\n \"acc\": 0.4444444444444444,\n \"acc_stderr\": 0.03388857118502326,\n\
\ \"acc_norm\": 0.4444444444444444,\n \"acc_norm_stderr\": 0.03388857118502326\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.7598039215686274,\n \"acc_stderr\": 0.02998373305591361,\n \"\
acc_norm\": 0.7598039215686274,\n \"acc_norm_stderr\": 0.02998373305591361\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.7510548523206751,\n \"acc_stderr\": 0.028146970599422644,\n \
\ \"acc_norm\": 0.7510548523206751,\n \"acc_norm_stderr\": 0.028146970599422644\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6278026905829597,\n\
\ \"acc_stderr\": 0.032443052830087304,\n \"acc_norm\": 0.6278026905829597,\n\
\ \"acc_norm_stderr\": 0.032443052830087304\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.6946564885496184,\n \"acc_stderr\": 0.040393149787245605,\n\
\ \"acc_norm\": 0.6946564885496184,\n \"acc_norm_stderr\": 0.040393149787245605\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.7933884297520661,\n \"acc_stderr\": 0.03695980128098824,\n \"\
acc_norm\": 0.7933884297520661,\n \"acc_norm_stderr\": 0.03695980128098824\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7037037037037037,\n\
\ \"acc_stderr\": 0.04414343666854933,\n \"acc_norm\": 0.7037037037037037,\n\
\ \"acc_norm_stderr\": 0.04414343666854933\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7361963190184049,\n \"acc_stderr\": 0.03462419931615623,\n\
\ \"acc_norm\": 0.7361963190184049,\n \"acc_norm_stderr\": 0.03462419931615623\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.4642857142857143,\n\
\ \"acc_stderr\": 0.04733667890053756,\n \"acc_norm\": 0.4642857142857143,\n\
\ \"acc_norm_stderr\": 0.04733667890053756\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7475728155339806,\n \"acc_stderr\": 0.04301250399690878,\n\
\ \"acc_norm\": 0.7475728155339806,\n \"acc_norm_stderr\": 0.04301250399690878\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8632478632478633,\n\
\ \"acc_stderr\": 0.022509033937077785,\n \"acc_norm\": 0.8632478632478633,\n\
\ \"acc_norm_stderr\": 0.022509033937077785\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.68,\n \"acc_stderr\": 0.046882617226215034,\n \
\ \"acc_norm\": 0.68,\n \"acc_norm_stderr\": 0.046882617226215034\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.7803320561941252,\n\
\ \"acc_stderr\": 0.014805384478371153,\n \"acc_norm\": 0.7803320561941252,\n\
\ \"acc_norm_stderr\": 0.014805384478371153\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.6647398843930635,\n \"acc_stderr\": 0.02541600377316554,\n\
\ \"acc_norm\": 0.6647398843930635,\n \"acc_norm_stderr\": 0.02541600377316554\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.35083798882681566,\n\
\ \"acc_stderr\": 0.01596103667523097,\n \"acc_norm\": 0.35083798882681566,\n\
\ \"acc_norm_stderr\": 0.01596103667523097\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.6862745098039216,\n \"acc_stderr\": 0.02656892101545714,\n\
\ \"acc_norm\": 0.6862745098039216,\n \"acc_norm_stderr\": 0.02656892101545714\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6784565916398714,\n\
\ \"acc_stderr\": 0.026527724079528872,\n \"acc_norm\": 0.6784565916398714,\n\
\ \"acc_norm_stderr\": 0.026527724079528872\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.6759259259259259,\n \"acc_stderr\": 0.02604176620271716,\n\
\ \"acc_norm\": 0.6759259259259259,\n \"acc_norm_stderr\": 0.02604176620271716\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.45390070921985815,\n \"acc_stderr\": 0.029700453247291463,\n \
\ \"acc_norm\": 0.45390070921985815,\n \"acc_norm_stderr\": 0.029700453247291463\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4276401564537158,\n\
\ \"acc_stderr\": 0.012635799922765843,\n \"acc_norm\": 0.4276401564537158,\n\
\ \"acc_norm_stderr\": 0.012635799922765843\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.6029411764705882,\n \"acc_stderr\": 0.02972215209928006,\n\
\ \"acc_norm\": 0.6029411764705882,\n \"acc_norm_stderr\": 0.02972215209928006\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6062091503267973,\n \"acc_stderr\": 0.01976621199107306,\n \
\ \"acc_norm\": 0.6062091503267973,\n \"acc_norm_stderr\": 0.01976621199107306\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.7,\n\
\ \"acc_stderr\": 0.04389311454644287,\n \"acc_norm\": 0.7,\n \
\ \"acc_norm_stderr\": 0.04389311454644287\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.7346938775510204,\n \"acc_stderr\": 0.0282638899437846,\n\
\ \"acc_norm\": 0.7346938775510204,\n \"acc_norm_stderr\": 0.0282638899437846\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.7562189054726368,\n\
\ \"acc_stderr\": 0.03036049015401464,\n \"acc_norm\": 0.7562189054726368,\n\
\ \"acc_norm_stderr\": 0.03036049015401464\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.81,\n \"acc_stderr\": 0.039427724440366255,\n \
\ \"acc_norm\": 0.81,\n \"acc_norm_stderr\": 0.039427724440366255\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5180722891566265,\n\
\ \"acc_stderr\": 0.03889951252827216,\n \"acc_norm\": 0.5180722891566265,\n\
\ \"acc_norm_stderr\": 0.03889951252827216\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8245614035087719,\n \"acc_stderr\": 0.029170885500727665,\n\
\ \"acc_norm\": 0.8245614035087719,\n \"acc_norm_stderr\": 0.029170885500727665\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.5177478580171359,\n\
\ \"mc1_stderr\": 0.017492470843075356,\n \"mc2\": 0.6730810498532811,\n\
\ \"mc2_stderr\": 0.01524685826294553\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7679558011049724,\n \"acc_stderr\": 0.011864149691827938\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.3904473085670963,\n \
\ \"acc_stderr\": 0.013437829864668576\n }\n}\n```"
repo_url: https://huggingface.co/chatty123/mistral_rank8_packing
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_04_15T17_27_04.020928
path:
- '**/details_harness|arc:challenge|25_2024-04-15T17-27-04.020928.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-04-15T17-27-04.020928.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_04_15T17_27_04.020928
path:
- '**/details_harness|gsm8k|5_2024-04-15T17-27-04.020928.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-04-15T17-27-04.020928.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_04_15T17_27_04.020928
path:
- '**/details_harness|hellaswag|10_2024-04-15T17-27-04.020928.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-04-15T17-27-04.020928.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_04_15T17_27_04.020928
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-15T17-27-04.020928.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-04-15T17-27-04.020928.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-04-15T17-27-04.020928.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-04-15T17-27-04.020928.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-15T17-27-04.020928.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-04-15T17-27-04.020928.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-15T17-27-04.020928.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-15T17-27-04.020928.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-15T17-27-04.020928.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-04-15T17-27-04.020928.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-04-15T17-27-04.020928.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-04-15T17-27-04.020928.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-15T17-27-04.020928.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-04-15T17-27-04.020928.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-15T17-27-04.020928.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-15T17-27-04.020928.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-04-15T17-27-04.020928.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-04-15T17-27-04.020928.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-15T17-27-04.020928.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-15T17-27-04.020928.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-15T17-27-04.020928.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-15T17-27-04.020928.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-15T17-27-04.020928.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-15T17-27-04.020928.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-15T17-27-04.020928.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-15T17-27-04.020928.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-15T17-27-04.020928.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-15T17-27-04.020928.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-15T17-27-04.020928.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-15T17-27-04.020928.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-15T17-27-04.020928.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-15T17-27-04.020928.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-04-15T17-27-04.020928.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-15T17-27-04.020928.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-04-15T17-27-04.020928.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-15T17-27-04.020928.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-15T17-27-04.020928.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-04-15T17-27-04.020928.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-04-15T17-27-04.020928.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-04-15T17-27-04.020928.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-15T17-27-04.020928.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-15T17-27-04.020928.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-15T17-27-04.020928.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-15T17-27-04.020928.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-04-15T17-27-04.020928.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-04-15T17-27-04.020928.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-04-15T17-27-04.020928.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-15T17-27-04.020928.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-04-15T17-27-04.020928.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-15T17-27-04.020928.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-15T17-27-04.020928.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-04-15T17-27-04.020928.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-04-15T17-27-04.020928.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-04-15T17-27-04.020928.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-15T17-27-04.020928.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-04-15T17-27-04.020928.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-04-15T17-27-04.020928.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-15T17-27-04.020928.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-04-15T17-27-04.020928.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-04-15T17-27-04.020928.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-04-15T17-27-04.020928.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-15T17-27-04.020928.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-04-15T17-27-04.020928.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-15T17-27-04.020928.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-15T17-27-04.020928.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-15T17-27-04.020928.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-04-15T17-27-04.020928.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-04-15T17-27-04.020928.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-04-15T17-27-04.020928.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-15T17-27-04.020928.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-04-15T17-27-04.020928.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-15T17-27-04.020928.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-15T17-27-04.020928.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-04-15T17-27-04.020928.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-04-15T17-27-04.020928.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-15T17-27-04.020928.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-15T17-27-04.020928.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-15T17-27-04.020928.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-15T17-27-04.020928.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-15T17-27-04.020928.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-15T17-27-04.020928.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-15T17-27-04.020928.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-15T17-27-04.020928.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-15T17-27-04.020928.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-15T17-27-04.020928.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-15T17-27-04.020928.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-15T17-27-04.020928.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-15T17-27-04.020928.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-15T17-27-04.020928.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-04-15T17-27-04.020928.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-15T17-27-04.020928.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-04-15T17-27-04.020928.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-15T17-27-04.020928.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-15T17-27-04.020928.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-04-15T17-27-04.020928.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-04-15T17-27-04.020928.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-04-15T17-27-04.020928.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-15T17-27-04.020928.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-15T17-27-04.020928.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-15T17-27-04.020928.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-15T17-27-04.020928.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-04-15T17-27-04.020928.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-04-15T17-27-04.020928.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-04-15T17-27-04.020928.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-15T17-27-04.020928.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-04-15T17-27-04.020928.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-15T17-27-04.020928.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-15T17-27-04.020928.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-04-15T17-27-04.020928.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-04-15T17-27-04.020928.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-04-15T17-27-04.020928.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-15T17-27-04.020928.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-04-15T17-27-04.020928.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-04-15T17-27-04.020928.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_04_15T17_27_04.020928
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-15T17-27-04.020928.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-15T17-27-04.020928.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_04_15T17_27_04.020928
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-04-15T17-27-04.020928.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-04-15T17-27-04.020928.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_04_15T17_27_04.020928
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-04-15T17-27-04.020928.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-04-15T17-27-04.020928.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_04_15T17_27_04.020928
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-04-15T17-27-04.020928.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-04-15T17-27-04.020928.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_04_15T17_27_04.020928
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-15T17-27-04.020928.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-15T17-27-04.020928.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_04_15T17_27_04.020928
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-04-15T17-27-04.020928.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-04-15T17-27-04.020928.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_04_15T17_27_04.020928
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-15T17-27-04.020928.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-15T17-27-04.020928.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_04_15T17_27_04.020928
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-15T17-27-04.020928.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-15T17-27-04.020928.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_04_15T17_27_04.020928
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-15T17-27-04.020928.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-15T17-27-04.020928.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_04_15T17_27_04.020928
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-04-15T17-27-04.020928.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-04-15T17-27-04.020928.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_04_15T17_27_04.020928
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-04-15T17-27-04.020928.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-04-15T17-27-04.020928.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_04_15T17_27_04.020928
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-04-15T17-27-04.020928.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-04-15T17-27-04.020928.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_04_15T17_27_04.020928
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-15T17-27-04.020928.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-15T17-27-04.020928.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_04_15T17_27_04.020928
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-04-15T17-27-04.020928.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-04-15T17-27-04.020928.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_04_15T17_27_04.020928
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-15T17-27-04.020928.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-15T17-27-04.020928.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_04_15T17_27_04.020928
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-15T17-27-04.020928.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-15T17-27-04.020928.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_04_15T17_27_04.020928
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-04-15T17-27-04.020928.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-04-15T17-27-04.020928.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_04_15T17_27_04.020928
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-04-15T17-27-04.020928.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-04-15T17-27-04.020928.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_04_15T17_27_04.020928
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-15T17-27-04.020928.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-15T17-27-04.020928.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_04_15T17_27_04.020928
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-15T17-27-04.020928.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-15T17-27-04.020928.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_04_15T17_27_04.020928
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-15T17-27-04.020928.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-15T17-27-04.020928.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_04_15T17_27_04.020928
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-15T17-27-04.020928.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-15T17-27-04.020928.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_04_15T17_27_04.020928
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-15T17-27-04.020928.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-15T17-27-04.020928.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_04_15T17_27_04.020928
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-15T17-27-04.020928.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-15T17-27-04.020928.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_04_15T17_27_04.020928
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-15T17-27-04.020928.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-15T17-27-04.020928.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_04_15T17_27_04.020928
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-15T17-27-04.020928.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-15T17-27-04.020928.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_04_15T17_27_04.020928
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-15T17-27-04.020928.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-15T17-27-04.020928.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_04_15T17_27_04.020928
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-15T17-27-04.020928.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-15T17-27-04.020928.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_04_15T17_27_04.020928
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-15T17-27-04.020928.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-15T17-27-04.020928.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_04_15T17_27_04.020928
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-15T17-27-04.020928.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-15T17-27-04.020928.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_04_15T17_27_04.020928
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-15T17-27-04.020928.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-15T17-27-04.020928.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_04_15T17_27_04.020928
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-15T17-27-04.020928.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-15T17-27-04.020928.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_04_15T17_27_04.020928
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-04-15T17-27-04.020928.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-04-15T17-27-04.020928.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_04_15T17_27_04.020928
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-15T17-27-04.020928.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-15T17-27-04.020928.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_04_15T17_27_04.020928
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-04-15T17-27-04.020928.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-04-15T17-27-04.020928.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_04_15T17_27_04.020928
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-15T17-27-04.020928.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-15T17-27-04.020928.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_04_15T17_27_04.020928
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-15T17-27-04.020928.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-15T17-27-04.020928.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_04_15T17_27_04.020928
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-04-15T17-27-04.020928.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-04-15T17-27-04.020928.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_04_15T17_27_04.020928
path:
- '**/details_harness|hendrycksTest-management|5_2024-04-15T17-27-04.020928.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-04-15T17-27-04.020928.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_04_15T17_27_04.020928
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-04-15T17-27-04.020928.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-04-15T17-27-04.020928.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_04_15T17_27_04.020928
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-15T17-27-04.020928.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-15T17-27-04.020928.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_04_15T17_27_04.020928
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-15T17-27-04.020928.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-15T17-27-04.020928.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_04_15T17_27_04.020928
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-15T17-27-04.020928.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-15T17-27-04.020928.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_04_15T17_27_04.020928
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-15T17-27-04.020928.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-15T17-27-04.020928.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_04_15T17_27_04.020928
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-04-15T17-27-04.020928.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-04-15T17-27-04.020928.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_04_15T17_27_04.020928
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-04-15T17-27-04.020928.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-04-15T17-27-04.020928.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_04_15T17_27_04.020928
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-04-15T17-27-04.020928.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-04-15T17-27-04.020928.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_04_15T17_27_04.020928
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-15T17-27-04.020928.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-15T17-27-04.020928.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_04_15T17_27_04.020928
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-04-15T17-27-04.020928.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-04-15T17-27-04.020928.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_04_15T17_27_04.020928
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-15T17-27-04.020928.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-15T17-27-04.020928.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_04_15T17_27_04.020928
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-15T17-27-04.020928.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-15T17-27-04.020928.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_04_15T17_27_04.020928
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-04-15T17-27-04.020928.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-04-15T17-27-04.020928.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_04_15T17_27_04.020928
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-04-15T17-27-04.020928.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-04-15T17-27-04.020928.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_04_15T17_27_04.020928
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-04-15T17-27-04.020928.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-04-15T17-27-04.020928.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_04_15T17_27_04.020928
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-15T17-27-04.020928.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-15T17-27-04.020928.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_04_15T17_27_04.020928
path:
- '**/details_harness|hendrycksTest-virology|5_2024-04-15T17-27-04.020928.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-04-15T17-27-04.020928.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_04_15T17_27_04.020928
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-04-15T17-27-04.020928.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-04-15T17-27-04.020928.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_04_15T17_27_04.020928
path:
- '**/details_harness|truthfulqa:mc|0_2024-04-15T17-27-04.020928.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-04-15T17-27-04.020928.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_04_15T17_27_04.020928
path:
- '**/details_harness|winogrande|5_2024-04-15T17-27-04.020928.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-04-15T17-27-04.020928.parquet'
- config_name: results
data_files:
- split: 2024_04_15T17_27_04.020928
path:
- results_2024-04-15T17-27-04.020928.parquet
- split: latest
path:
- results_2024-04-15T17-27-04.020928.parquet
---
# Dataset Card for Evaluation run of chatty123/mistral_rank8_packing
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [chatty123/mistral_rank8_packing](https://huggingface.co/chatty123/mistral_rank8_packing) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_chatty123__mistral_rank8_packing",
"harness_winogrande_5",
    split="latest")
```
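The config names listed in this card's metadata follow a simple convention derived from the harness task ids: pipes, colons, and hyphens become underscores. A minimal sketch of that mapping (an observation from this card's metadata, not an official `datasets` helper):

```python
def task_to_config(task: str) -> str:
    """Map a harness task id (e.g. 'harness|hendrycksTest-anatomy|5') to the
    config name used in this repo (e.g. 'harness_hendrycksTest_anatomy_5').

    This mirrors the naming convention observed in this card; it is a
    hypothetical helper, not part of any library.
    """
    return task.replace("|", "_").replace(":", "_").replace("-", "_")

print(task_to_config("harness|truthfulqa:mc|0"))  # harness_truthfulqa_mc_0
```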
## Latest results
These are the [latest results from run 2024-04-15T17:27:04.020928](https://huggingface.co/datasets/open-llm-leaderboard/details_chatty123__mistral_rank8_packing/blob/main/results_2024-04-15T17-27-04.020928.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each of them in the "results" configuration and in the "latest" split of each eval):
```python
{
"all": {
"acc": 0.6033731800666036,
"acc_stderr": 0.033302568568672614,
"acc_norm": 0.6082618374775105,
"acc_norm_stderr": 0.03397805840343756,
"mc1": 0.5177478580171359,
"mc1_stderr": 0.017492470843075356,
"mc2": 0.6730810498532811,
"mc2_stderr": 0.01524685826294553
},
"harness|arc:challenge|25": {
"acc": 0.575938566552901,
"acc_stderr": 0.014441889627464392,
"acc_norm": 0.6254266211604096,
"acc_norm_stderr": 0.014144193471893454
},
"harness|hellaswag|10": {
"acc": 0.6571400119498108,
"acc_stderr": 0.004736950810617793,
"acc_norm": 0.8477394941246763,
"acc_norm_stderr": 0.003585389636472376
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.35,
"acc_stderr": 0.04793724854411021,
"acc_norm": 0.35,
"acc_norm_stderr": 0.04793724854411021
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.5777777777777777,
"acc_stderr": 0.04266763404099582,
"acc_norm": 0.5777777777777777,
"acc_norm_stderr": 0.04266763404099582
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6118421052631579,
"acc_stderr": 0.03965842097512744,
"acc_norm": 0.6118421052631579,
"acc_norm_stderr": 0.03965842097512744
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.58,
"acc_stderr": 0.049604496374885836,
"acc_norm": 0.58,
"acc_norm_stderr": 0.049604496374885836
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6754716981132075,
"acc_stderr": 0.02881561571343211,
"acc_norm": 0.6754716981132075,
"acc_norm_stderr": 0.02881561571343211
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.6597222222222222,
"acc_stderr": 0.039621355734862175,
"acc_norm": 0.6597222222222222,
"acc_norm_stderr": 0.039621355734862175
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.37,
"acc_stderr": 0.04852365870939099,
"acc_norm": 0.37,
"acc_norm_stderr": 0.04852365870939099
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.49,
"acc_stderr": 0.05024183937956913,
"acc_norm": 0.49,
"acc_norm_stderr": 0.05024183937956913
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.37,
"acc_stderr": 0.04852365870939099,
"acc_norm": 0.37,
"acc_norm_stderr": 0.04852365870939099
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.5838150289017341,
"acc_stderr": 0.03758517775404947,
"acc_norm": 0.5838150289017341,
"acc_norm_stderr": 0.03758517775404947
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.43137254901960786,
"acc_stderr": 0.04928099597287534,
"acc_norm": 0.43137254901960786,
"acc_norm_stderr": 0.04928099597287534
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.72,
"acc_stderr": 0.04512608598542128,
"acc_norm": 0.72,
"acc_norm_stderr": 0.04512608598542128
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5148936170212766,
"acc_stderr": 0.03267151848924777,
"acc_norm": 0.5148936170212766,
"acc_norm_stderr": 0.03267151848924777
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.4298245614035088,
"acc_stderr": 0.04657047260594963,
"acc_norm": 0.4298245614035088,
"acc_norm_stderr": 0.04657047260594963
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5655172413793104,
"acc_stderr": 0.04130740879555497,
"acc_norm": 0.5655172413793104,
"acc_norm_stderr": 0.04130740879555497
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.38095238095238093,
"acc_stderr": 0.025010749116137605,
"acc_norm": 0.38095238095238093,
"acc_norm_stderr": 0.025010749116137605
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.38095238095238093,
"acc_stderr": 0.04343525428949098,
"acc_norm": 0.38095238095238093,
"acc_norm_stderr": 0.04343525428949098
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.4,
"acc_stderr": 0.049236596391733084,
"acc_norm": 0.4,
"acc_norm_stderr": 0.049236596391733084
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.6774193548387096,
"acc_stderr": 0.026593084516572274,
"acc_norm": 0.6774193548387096,
"acc_norm_stderr": 0.026593084516572274
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.4876847290640394,
"acc_stderr": 0.035169204442208966,
"acc_norm": 0.4876847290640394,
"acc_norm_stderr": 0.035169204442208966
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.63,
"acc_stderr": 0.04852365870939099,
"acc_norm": 0.63,
"acc_norm_stderr": 0.04852365870939099
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7212121212121212,
"acc_stderr": 0.03501438706296781,
"acc_norm": 0.7212121212121212,
"acc_norm_stderr": 0.03501438706296781
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7474747474747475,
"acc_stderr": 0.030954055470365897,
"acc_norm": 0.7474747474747475,
"acc_norm_stderr": 0.030954055470365897
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.844559585492228,
"acc_stderr": 0.026148483469153314,
"acc_norm": 0.844559585492228,
"acc_norm_stderr": 0.026148483469153314
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.5538461538461539,
"acc_stderr": 0.02520357177302833,
"acc_norm": 0.5538461538461539,
"acc_norm_stderr": 0.02520357177302833
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.3333333333333333,
"acc_stderr": 0.02874204090394848,
"acc_norm": 0.3333333333333333,
"acc_norm_stderr": 0.02874204090394848
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6386554621848739,
"acc_stderr": 0.03120469122515002,
"acc_norm": 0.6386554621848739,
"acc_norm_stderr": 0.03120469122515002
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.3509933774834437,
"acc_stderr": 0.03896981964257375,
"acc_norm": 0.3509933774834437,
"acc_norm_stderr": 0.03896981964257375
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8,
"acc_stderr": 0.017149858514250955,
"acc_norm": 0.8,
"acc_norm_stderr": 0.017149858514250955
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.4444444444444444,
"acc_stderr": 0.03388857118502326,
"acc_norm": 0.4444444444444444,
"acc_norm_stderr": 0.03388857118502326
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7598039215686274,
"acc_stderr": 0.02998373305591361,
"acc_norm": 0.7598039215686274,
"acc_norm_stderr": 0.02998373305591361
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7510548523206751,
"acc_stderr": 0.028146970599422644,
"acc_norm": 0.7510548523206751,
"acc_norm_stderr": 0.028146970599422644
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6278026905829597,
"acc_stderr": 0.032443052830087304,
"acc_norm": 0.6278026905829597,
"acc_norm_stderr": 0.032443052830087304
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.6946564885496184,
"acc_stderr": 0.040393149787245605,
"acc_norm": 0.6946564885496184,
"acc_norm_stderr": 0.040393149787245605
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7933884297520661,
"acc_stderr": 0.03695980128098824,
"acc_norm": 0.7933884297520661,
"acc_norm_stderr": 0.03695980128098824
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7037037037037037,
"acc_stderr": 0.04414343666854933,
"acc_norm": 0.7037037037037037,
"acc_norm_stderr": 0.04414343666854933
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7361963190184049,
"acc_stderr": 0.03462419931615623,
"acc_norm": 0.7361963190184049,
"acc_norm_stderr": 0.03462419931615623
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.4642857142857143,
"acc_stderr": 0.04733667890053756,
"acc_norm": 0.4642857142857143,
"acc_norm_stderr": 0.04733667890053756
},
"harness|hendrycksTest-management|5": {
"acc": 0.7475728155339806,
"acc_stderr": 0.04301250399690878,
"acc_norm": 0.7475728155339806,
"acc_norm_stderr": 0.04301250399690878
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8632478632478633,
"acc_stderr": 0.022509033937077785,
"acc_norm": 0.8632478632478633,
"acc_norm_stderr": 0.022509033937077785
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.68,
"acc_stderr": 0.046882617226215034,
"acc_norm": 0.68,
"acc_norm_stderr": 0.046882617226215034
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.7803320561941252,
"acc_stderr": 0.014805384478371153,
"acc_norm": 0.7803320561941252,
"acc_norm_stderr": 0.014805384478371153
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.6647398843930635,
"acc_stderr": 0.02541600377316554,
"acc_norm": 0.6647398843930635,
"acc_norm_stderr": 0.02541600377316554
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.35083798882681566,
"acc_stderr": 0.01596103667523097,
"acc_norm": 0.35083798882681566,
"acc_norm_stderr": 0.01596103667523097
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.6862745098039216,
"acc_stderr": 0.02656892101545714,
"acc_norm": 0.6862745098039216,
"acc_norm_stderr": 0.02656892101545714
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6784565916398714,
"acc_stderr": 0.026527724079528872,
"acc_norm": 0.6784565916398714,
"acc_norm_stderr": 0.026527724079528872
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.6759259259259259,
"acc_stderr": 0.02604176620271716,
"acc_norm": 0.6759259259259259,
"acc_norm_stderr": 0.02604176620271716
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.45390070921985815,
"acc_stderr": 0.029700453247291463,
"acc_norm": 0.45390070921985815,
"acc_norm_stderr": 0.029700453247291463
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4276401564537158,
"acc_stderr": 0.012635799922765843,
"acc_norm": 0.4276401564537158,
"acc_norm_stderr": 0.012635799922765843
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6029411764705882,
"acc_stderr": 0.02972215209928006,
"acc_norm": 0.6029411764705882,
"acc_norm_stderr": 0.02972215209928006
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6062091503267973,
"acc_stderr": 0.01976621199107306,
"acc_norm": 0.6062091503267973,
"acc_norm_stderr": 0.01976621199107306
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.7,
"acc_stderr": 0.04389311454644287,
"acc_norm": 0.7,
"acc_norm_stderr": 0.04389311454644287
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7346938775510204,
"acc_stderr": 0.0282638899437846,
"acc_norm": 0.7346938775510204,
"acc_norm_stderr": 0.0282638899437846
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.7562189054726368,
"acc_stderr": 0.03036049015401464,
"acc_norm": 0.7562189054726368,
"acc_norm_stderr": 0.03036049015401464
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.81,
"acc_stderr": 0.039427724440366255,
"acc_norm": 0.81,
"acc_norm_stderr": 0.039427724440366255
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5180722891566265,
"acc_stderr": 0.03889951252827216,
"acc_norm": 0.5180722891566265,
"acc_norm_stderr": 0.03889951252827216
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8245614035087719,
"acc_stderr": 0.029170885500727665,
"acc_norm": 0.8245614035087719,
"acc_norm_stderr": 0.029170885500727665
},
"harness|truthfulqa:mc|0": {
"mc1": 0.5177478580171359,
"mc1_stderr": 0.017492470843075356,
"mc2": 0.6730810498532811,
"mc2_stderr": 0.01524685826294553
},
"harness|winogrande|5": {
"acc": 0.7679558011049724,
"acc_stderr": 0.011864149691827938
},
"harness|gsm8k|5": {
"acc": 0.3904473085670963,
"acc_stderr": 0.013437829864668576
}
}
```
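The per-task dict above can be post-processed directly; for instance, the MMLU score shown on the leaderboard is (to a close approximation) the macro-average of `acc` over all `hendrycksTest` subtasks. A minimal sketch using a few values copied from this card (in practice the full dict would come from the results JSON):

```python
# A few per-task entries copied from the results above (subset for brevity).
results = {
    "harness|hendrycksTest-abstract_algebra|5": {"acc": 0.35},
    "harness|hendrycksTest-anatomy|5": {"acc": 0.5777777777777777},
    "harness|hendrycksTest-astronomy|5": {"acc": 0.6118421052631579},
    "harness|winogrande|5": {"acc": 0.7679558011049724},  # not an MMLU subtask
}

def mmlu_macro_average(results: dict) -> float:
    """Macro-average 'acc' over all hendrycksTest (MMLU) subtasks."""
    accs = [v["acc"] for k, v in results.items() if "hendrycksTest" in k]
    return sum(accs) / len(accs)

print(round(mmlu_macro_average(results), 4))  # 0.5132
```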
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
CyberHarem/ashe_leagueoflegends | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of ashe (League of Legends)
This is the dataset of ashe (League of Legends), containing 303 images and their tags.
The core tags of this character are `breasts, blue_eyes, large_breasts, long_hair, white_hair`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:----------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 303 | 411.44 MiB | [Download](https://huggingface.co/datasets/CyberHarem/ashe_leagueoflegends/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 303 | 243.76 MiB | [Download](https://huggingface.co/datasets/CyberHarem/ashe_leagueoflegends/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 689 | 481.61 MiB | [Download](https://huggingface.co/datasets/CyberHarem/ashe_leagueoflegends/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 303 | 364.11 MiB | [Download](https://huggingface.co/datasets/CyberHarem/ashe_leagueoflegends/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 689 | 650.35 MiB | [Download](https://huggingface.co/datasets/CyberHarem/ashe_leagueoflegends/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
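The IMG+TXT packages pair each image with a same-named `.txt` file of comma-separated tags (the usual layout for such training packages; the exact layout is an assumption here, not documented in this card). A minimal sketch of reading those pairs after extracting one of the zips:

```python
import os

def load_img_txt_pairs(dataset_dir: str) -> dict:
    """Map each image filename to the tag list read from its sibling .txt file.

    Assumes the extracted IMG+TXT layout: 'foo.jpg' sits next to 'foo.txt'
    holding comma-separated tags (an assumption, not documented in this card).
    """
    pairs = {}
    for name in sorted(os.listdir(dataset_dir)):
        stem, ext = os.path.splitext(name)
        if ext.lower() in {".png", ".jpg", ".jpeg", ".webp"}:
            txt_path = os.path.join(dataset_dir, stem + ".txt")
            if os.path.exists(txt_path):
                with open(txt_path, encoding="utf-8") as f:
                    pairs[name] = [t.strip() for t in f.read().split(",")]
    return pairs
```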
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need it, just run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/ashe_leagueoflegends',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering results; some outfits may be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 10 |  |  |  |  |  | 1boy, 1girl, hetero, sex_from_behind, solo_focus, blush, doggystyle, open_mouth, penis, all_fours, anus, hood, looking_at_viewer, looking_back, cum, hair_between_eyes, nude, uncensored, ass_grab, bangs, vaginal |
| 1 | 8 |  |  |  |  |  | 1boy, 1girl, hetero, penis, sex, vaginal, solo_focus, uncensored, girl_on_top, nipples, thighhighs, blush, spread_legs, hair_between_eyes, hood, armor, clothed_female_nude_male, cowgirl_position, lipstick, looking_at_viewer, open_mouth, parted_lips, pubic_hair, pussy_juice |
| 2 | 5 |  |  |  |  |  | 1girl, arrow_(projectile), bow_(weapon), solo, cape, hood, thighhighs, cleavage, gloves, armor, green_eyes |
| 3 | 8 |  |  |  |  |  | 1girl, aiming, drawing_bow, holding_arrow, hood, solo, cleavage, thighhighs, cape, gloves, snow, armor, armpits, boots |
| 4 | 5 |  |  |  |  |  | 1girl, cleavage, hood, navel, parted_lips, solo, looking_at_viewer, stomach, hair_between_eyes, midriff, outdoors, skirt, thighhighs, weapon, black_gloves, elbow_gloves, holding, huge_breasts, shoulder_armor, thick_thighs |
| 5 | 5 |  |  |  |  |  | 1boy, 1girl, hetero, hood, nipples, solo_focus, open_mouth, cum_on_breasts, facial, penis, blush, breasts_squeezed_together, cum_in_mouth, lips, looking_at_viewer, navel, nude, paizuri, pussy, saliva, sweat, tongue_out, uncensored |
| 6 | 6 |  |  |  |  |  | 1girl, hetero, multiple_penises, double_penetration, nipples, solo_focus, thighhighs, vaginal, 2boys, cum_in_pussy, fellatio, mmf_threesome, testicles, blush, censored, cum_on_body, hood, spitroast, spread_legs |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1boy | 1girl | hetero | sex_from_behind | solo_focus | blush | doggystyle | open_mouth | penis | all_fours | anus | hood | looking_at_viewer | looking_back | cum | hair_between_eyes | nude | uncensored | ass_grab | bangs | vaginal | sex | girl_on_top | nipples | thighhighs | spread_legs | armor | clothed_female_nude_male | cowgirl_position | lipstick | parted_lips | pubic_hair | pussy_juice | arrow_(projectile) | bow_(weapon) | solo | cape | cleavage | gloves | green_eyes | aiming | drawing_bow | holding_arrow | snow | armpits | boots | navel | stomach | midriff | outdoors | skirt | weapon | black_gloves | elbow_gloves | holding | huge_breasts | shoulder_armor | thick_thighs | cum_on_breasts | facial | breasts_squeezed_together | cum_in_mouth | lips | paizuri | pussy | saliva | sweat | tongue_out | multiple_penises | double_penetration | 2boys | cum_in_pussy | fellatio | mmf_threesome | testicles | censored | cum_on_body | spitroast |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:-------|:--------|:---------|:------------------|:-------------|:--------|:-------------|:-------------|:--------|:------------|:-------|:-------|:--------------------|:---------------|:------|:--------------------|:-------|:-------------|:-----------|:--------|:----------|:------|:--------------|:----------|:-------------|:--------------|:--------|:---------------------------|:-------------------|:-----------|:--------------|:-------------|:--------------|:---------------------|:---------------|:-------|:-------|:-----------|:---------|:-------------|:---------|:--------------|:----------------|:-------|:----------|:--------|:--------|:----------|:----------|:-----------|:--------|:---------|:---------------|:---------------|:----------|:---------------|:-----------------|:---------------|:-----------------|:---------|:----------------------------|:---------------|:-------|:----------|:--------|:---------|:--------|:-------------|:-------------------|:---------------------|:--------|:---------------|:-----------|:----------------|:------------|:-----------|:--------------|:------------|
| 0 | 10 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 1 | 8 |  |  |  |  |  | X | X | X | | X | X | | X | X | | | X | X | | | X | | X | | | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 2 | 5 |  |  |  |  |  | | X | | | | | | | | | | X | | | | | | | | | | | | | X | | X | | | | | | | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 3 | 8 |  |  |  |  |  | | X | | | | | | | | | | X | | | | | | | | | | | | | X | | X | | | | | | | | | X | X | X | X | | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 4 | 5 |  |  |  |  |  | | X | | | | | | | | | | X | X | | | X | | | | | | | | | X | | | | | | X | | | | | X | | X | | | | | | | | | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | |
| 5 | 5 |  |  |  |  |  | X | X | X | | X | X | | X | X | | | X | X | | | | X | X | | | | | | X | | | | | | | | | | | | | | | | | | | | | | | X | | | | | | | | | | | | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | |
| 6 | 6 |  |  |  |  |  | | X | X | | X | X | | | | | | X | | | | | | | | | X | | | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | X | X | X | X | X | X | X | X | X | X |
|
nvidia/OpenMathInstruct-1 | ---
license: other
license_name: nvidia-license
task_categories:
- question-answering
- text-generation
language:
- en
tags:
- math
- code
- nvidia
pretty_name: OpenMathInstruct-1
size_categories:
- 1M<n<10M
---
# OpenMathInstruct-1
OpenMathInstruct-1 is a math instruction tuning dataset with 1.8M problem-solution pairs
generated using the permissively licensed [Mixtral-8x7B](https://huggingface.co/mistralai/Mixtral-8x7B-v0.1) model.
The problems come from the [GSM8K](https://github.com/openai/grade-school-math)
and [MATH](https://github.com/hendrycks/math) training subsets, and the solutions
are synthetically generated by allowing the Mixtral model to use a mix of text reasoning and
code blocks executed by a Python interpreter.
The dataset is split into train and validation subsets that we used in our ablation experiments.
Together, these two subsets cover the full training sets of GSM8K and MATH.
The OpenMathInstruct-1 dataset contains the following fields:
- **question**: original question from either GSM8K or MATH training set.
- **generated_solution**: the synthetically generated solution that uses a mix of text reasoning and code blocks.
- **expected_answer**: the ground-truth answer provided in the original dataset.
- **predicted_answer**: the answer predicted by Mixtral model in the corresponding solution (extracted from `\boxed{}`).
- **error_message**: `<not_executed>` if code was not used. Otherwise it is either empty or contains a Python exception
from the corresponding code block. A `timeout` string indicates that a code block took longer than 10 seconds to
execute. In the current dataset version, generation is always stopped after any error or timeout.
- **is_correct**: whether the final answer was considered correct by our grading script.
- **dataset**: gsm8k or math.
- **generation_type**: `without_reference_solution` or `masked_reference_solution`.
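The fields above make it easy to select a high-quality subset for fine-tuning, e.g. keeping only solutions that were graded correct and whose code (if any) ran without errors. A minimal sketch of that filtering step (the records here are a small in-memory sample with the same schema; in practice they would come from `datasets.load_dataset("nvidia/OpenMathInstruct-1")`, and the `keep_for_finetuning` helper name is illustrative):

```python
# Small in-memory sample mirroring the dataset schema described above.
sample = [
    {"question": "What is 2 + 3?", "generated_solution": "2 + 3 = \\boxed{5}",
     "expected_answer": "5", "predicted_answer": "5",
     "error_message": "<not_executed>", "is_correct": True,
     "dataset": "gsm8k", "generation_type": "without_reference_solution"},
    {"question": "...", "generated_solution": "...",
     "expected_answer": "10", "predicted_answer": "7",
     "error_message": "", "is_correct": False,
     "dataset": "math", "generation_type": "masked_reference_solution"},
    {"question": "...", "generated_solution": "...",
     "expected_answer": "1", "predicted_answer": None,
     "error_message": "timeout", "is_correct": False,
     "dataset": "math", "generation_type": "without_reference_solution"},
]

def keep_for_finetuning(record):
    """Keep correct solutions whose code either ran cleanly or was not used."""
    return record["is_correct"] and record["error_message"] in ("", "<not_executed>")

filtered = [r for r in sample if keep_for_finetuning(r)]
print(len(filtered))  # 1 -- only the first record survives the filter
```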
We also release the masked solutions used to produce `generation_type="masked_reference_solution"`
portion of the dataset ([GSM8K-Masked](https://huggingface.co/datasets/nvidia/OpenMath-GSM8K-masked),
[MATH-Masked](https://huggingface.co/datasets/nvidia/OpenMath-MATH-masked)).
See our [paper](https://arxiv.org/abs/2402.10176) to learn more details!
## OpenMath models
To demonstrate the quality of this dataset, we release a series of OpenMath models
trained on this data (a combination of train and validation splits to allow comparison with prior work).
<table border="1">
<tr>
<td></td>
<td colspan="2" style="text-align: center;">greedy</td>
<td colspan="2" style="text-align: center;">majority@50</td>
</tr>
<tr>
<td style="text-align: center;">model</td>
<td style="text-align: center;">GSM8K</td>
<td style="text-align: center;">MATH</td>
<td style="text-align: center;">GSM8K</td>
<td style="text-align: center;">MATH</td>
</tr>
<tr>
<td style="text-align: right;">OpenMath-CodeLlama-7B (<a href="https://huggingface.co/nvidia/OpenMath-CodeLlama-7b-Python">nemo</a> | <a href="https://huggingface.co/nvidia/OpenMath-CodeLlama-7b-Python-hf">HF</a>)</td>
<td style="text-align: center;">75.9</td>
<td style="text-align: center;">43.6</td>
<td style="text-align: center;">84.8</td>
<td style="text-align: center;">55.6</td>
</tr>
<tr>
<td style="text-align: right;">OpenMath-Mistral-7B (<a href="https://huggingface.co/nvidia/OpenMath-Mistral-7B-v0.1">nemo</a> | <a href="https://huggingface.co/nvidia/OpenMath-Mistral-7B-v0.1-hf">HF</a>)</td>
<td style="text-align: center;">80.2</td>
<td style="text-align: center;">44.5</td>
<td style="text-align: center;">86.9</td>
<td style="text-align: center;">57.2</td>
</tr>
<tr>
<td style="text-align: right;">OpenMath-CodeLlama-13B (<a href="https://huggingface.co/nvidia/OpenMath-CodeLlama-13b-Python">nemo</a> | <a href="https://huggingface.co/nvidia/OpenMath-CodeLlama-13b-Python-hf">HF</a>)</td>
<td style="text-align: center;">78.8</td>
<td style="text-align: center;">45.5</td>
<td style="text-align: center;">86.8</td>
<td style="text-align: center;">57.6</td>
</tr>
<tr>
<td style="text-align: right;">OpenMath-CodeLlama-34B (<a href="https://huggingface.co/nvidia/OpenMath-CodeLlama-34b-Python">nemo</a> | <a href="https://huggingface.co/nvidia/OpenMath-CodeLlama-34b-Python-hf">HF</a>)</td>
<td style="text-align: center;">80.7</td>
<td style="text-align: center;">48.3</td>
<td style="text-align: center;">88.0</td>
<td style="text-align: center;">60.2</td>
</tr>
<tr>
<td style="text-align: right;">OpenMath-Llama2-70B (<a href="https://huggingface.co/nvidia/OpenMath-Llama-2-70b">nemo</a> | <a href="https://huggingface.co/nvidia/OpenMath-Llama-2-70b-hf">HF</a>)</td>
<td style="text-align: center;"><b>84.7</b></td>
<td style="text-align: center;">46.3</td>
<td style="text-align: center;">90.1</td>
<td style="text-align: center;">58.3</td>
</tr>
<tr>
<td style="text-align: right;">OpenMath-CodeLlama-70B (<a href="https://huggingface.co/nvidia/OpenMath-CodeLlama-70b-Python">nemo</a> | <a href="https://huggingface.co/nvidia/OpenMath-CodeLlama-70b-Python-hf">HF</a>)</td>
<td style="text-align: center;">84.6</td>
<td style="text-align: center;"><b>50.7</b></td>
<td style="text-align: center;"><b>90.8</b></td>
<td style="text-align: center;"><b>60.4</b></td>
</tr>
</table>
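The majority@50 columns refer to self-consistency decoding: sampling 50 solutions per problem and taking the most common final answer. A minimal sketch of the voting step (the `majority_at_k` name is illustrative, not from the evaluation code):

```python
from collections import Counter

def majority_at_k(predicted_answers):
    """Return the most common predicted answer across k sampled solutions.

    Answers that could not be extracted (None) are ignored; returns None
    if no sample produced an extractable answer.
    """
    extracted = [a for a in predicted_answers if a is not None]
    if not extracted:
        return None
    return Counter(extracted).most_common(1)[0][0]

# Six sampled answers for one problem; "5" wins the vote 3 to 2.
samples = ["5", "5", "7", "5", None, "7"]
print(majority_at_k(samples))  # 5
```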
The pipeline we used to produce the data and models is fully open-sourced!
- [Code](https://github.com/Kipok/NeMo-Skills)
- [Models](https://huggingface.co/collections/nvidia/openmath-65c5619de2ba059be0775014)
- [Dataset](https://huggingface.co/datasets/nvidia/OpenMathInstruct-1)
## Reproducing our results
We provide [all instructions](https://github.com/Kipok/NeMo-Skills/blob/main/docs/reproducing-results.md)
to fully reproduce our results, including data generation.
## Generating similar datasets
To generate similar datasets for other tasks or to learn more about our code, read through the docs below.
- [NeMo-Skills Pipeline](https://github.com/Kipok/NeMo-Skills)
- [Generating synthetic data](https://github.com/Kipok/NeMo-Skills/blob/main/docs/synthetic-data-generation.md)
- [Finetuning models](https://github.com/Kipok/NeMo-Skills/blob/main/docs/finetuning.md)
- [Evaluating models](https://github.com/Kipok/NeMo-Skills/blob/main/docs/evaluation.md)
## Citation
If you find our work useful, please consider citing us!
```bibtex
@article{toshniwal2024openmath,
title = {OpenMathInstruct-1: A 1.8 Million Math Instruction Tuning Dataset},
author = {Shubham Toshniwal and Ivan Moshkov and Sean Narenthiran and Daria Gitman and Fei Jia and Igor Gitman},
year = {2024},
  journal = {arXiv preprint arXiv:2402.10176}
}
```
## License
The use of this dataset is governed by the [NVIDIA License](LICENSE) which permits commercial usage. |
Imran1/dogtrainset | ---
dataset_info:
features:
- name: image
dtype: image
- name: label
dtype:
class_label:
names:
'0': Afghan_hound
'1': French_bulldog
splits:
- name: train
num_bytes: 26798257.0
num_examples: 398
download_size: 26755684
dataset_size: 26798257.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
ga1asta/breg | ---
license: mit
---
|
open-llm-leaderboard/details_MisterRid__saulgoodman-2x7b-alpha1 | ---
pretty_name: Evaluation run of MisterRid/saulgoodman-2x7b-alpha1
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [MisterRid/saulgoodman-2x7b-alpha1](https://huggingface.co/MisterRid/saulgoodman-2x7b-alpha1)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configuration, each one coresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_MisterRid__saulgoodman-2x7b-alpha1\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-12-20T22:30:24.854096](https://huggingface.co/datasets/open-llm-leaderboard/details_MisterRid__saulgoodman-2x7b-alpha1/blob/main/results_2023-12-20T22-30-24.854096.json)(note\
\ that their might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6511172644532761,\n\
\ \"acc_stderr\": 0.03210282974235949,\n \"acc_norm\": 0.6531681519171342,\n\
\ \"acc_norm_stderr\": 0.032744836386602465,\n \"mc1\": 0.4320685434516524,\n\
\ \"mc1_stderr\": 0.01734120239498826,\n \"mc2\": 0.6006356075996195,\n\
\ \"mc2_stderr\": 0.015505899675520648\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.6262798634812287,\n \"acc_stderr\": 0.014137708601759093,\n\
\ \"acc_norm\": 0.6621160409556314,\n \"acc_norm_stderr\": 0.01382204792228351\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6709818761202948,\n\
\ \"acc_stderr\": 0.004688963175758129,\n \"acc_norm\": 0.8536148177653854,\n\
\ \"acc_norm_stderr\": 0.003527695149823515\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.37,\n \"acc_stderr\": 0.04852365870939099,\n \
\ \"acc_norm\": 0.37,\n \"acc_norm_stderr\": 0.04852365870939099\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6074074074074074,\n\
\ \"acc_stderr\": 0.0421850621536888,\n \"acc_norm\": 0.6074074074074074,\n\
\ \"acc_norm_stderr\": 0.0421850621536888\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.6907894736842105,\n \"acc_stderr\": 0.037610708698674805,\n\
\ \"acc_norm\": 0.6907894736842105,\n \"acc_norm_stderr\": 0.037610708698674805\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.61,\n\
\ \"acc_stderr\": 0.04902071300001975,\n \"acc_norm\": 0.61,\n \
\ \"acc_norm_stderr\": 0.04902071300001975\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.7018867924528301,\n \"acc_stderr\": 0.02815283794249387,\n\
\ \"acc_norm\": 0.7018867924528301,\n \"acc_norm_stderr\": 0.02815283794249387\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7638888888888888,\n\
\ \"acc_stderr\": 0.03551446610810826,\n \"acc_norm\": 0.7638888888888888,\n\
\ \"acc_norm_stderr\": 0.03551446610810826\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.47,\n \"acc_stderr\": 0.05016135580465919,\n \
\ \"acc_norm\": 0.47,\n \"acc_norm_stderr\": 0.05016135580465919\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.52,\n \"acc_stderr\": 0.050211673156867795,\n \"acc_norm\": 0.52,\n\
\ \"acc_norm_stderr\": 0.050211673156867795\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.35,\n \"acc_stderr\": 0.047937248544110175,\n \
\ \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.047937248544110175\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6416184971098265,\n\
\ \"acc_stderr\": 0.036563436533531585,\n \"acc_norm\": 0.6416184971098265,\n\
\ \"acc_norm_stderr\": 0.036563436533531585\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.39215686274509803,\n \"acc_stderr\": 0.048580835742663454,\n\
\ \"acc_norm\": 0.39215686274509803,\n \"acc_norm_stderr\": 0.048580835742663454\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.8,\n \"acc_stderr\": 0.04020151261036845,\n \"acc_norm\": 0.8,\n\
\ \"acc_norm_stderr\": 0.04020151261036845\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.5914893617021276,\n \"acc_stderr\": 0.032134180267015755,\n\
\ \"acc_norm\": 0.5914893617021276,\n \"acc_norm_stderr\": 0.032134180267015755\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.5,\n\
\ \"acc_stderr\": 0.047036043419179864,\n \"acc_norm\": 0.5,\n \
\ \"acc_norm_stderr\": 0.047036043419179864\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5862068965517241,\n \"acc_stderr\": 0.04104269211806232,\n\
\ \"acc_norm\": 0.5862068965517241,\n \"acc_norm_stderr\": 0.04104269211806232\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.4126984126984127,\n \"acc_stderr\": 0.025355741263055273,\n \"\
acc_norm\": 0.4126984126984127,\n \"acc_norm_stderr\": 0.025355741263055273\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.4523809523809524,\n\
\ \"acc_stderr\": 0.04451807959055328,\n \"acc_norm\": 0.4523809523809524,\n\
\ \"acc_norm_stderr\": 0.04451807959055328\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.4,\n \"acc_stderr\": 0.04923659639173309,\n \
\ \"acc_norm\": 0.4,\n \"acc_norm_stderr\": 0.04923659639173309\n },\n\
\ \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.8096774193548387,\n\
\ \"acc_stderr\": 0.022331707611823078,\n \"acc_norm\": 0.8096774193548387,\n\
\ \"acc_norm_stderr\": 0.022331707611823078\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.5320197044334976,\n \"acc_stderr\": 0.03510766597959215,\n\
\ \"acc_norm\": 0.5320197044334976,\n \"acc_norm_stderr\": 0.03510766597959215\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.7,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\"\
: 0.7,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7636363636363637,\n \"acc_stderr\": 0.03317505930009181,\n\
\ \"acc_norm\": 0.7636363636363637,\n \"acc_norm_stderr\": 0.03317505930009181\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.7929292929292929,\n \"acc_stderr\": 0.028869778460267042,\n \"\
acc_norm\": 0.7929292929292929,\n \"acc_norm_stderr\": 0.028869778460267042\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.8963730569948186,\n \"acc_stderr\": 0.02199531196364424,\n\
\ \"acc_norm\": 0.8963730569948186,\n \"acc_norm_stderr\": 0.02199531196364424\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.6615384615384615,\n \"acc_stderr\": 0.02399150050031304,\n \
\ \"acc_norm\": 0.6615384615384615,\n \"acc_norm_stderr\": 0.02399150050031304\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.36666666666666664,\n \"acc_stderr\": 0.029381620726465066,\n \
\ \"acc_norm\": 0.36666666666666664,\n \"acc_norm_stderr\": 0.029381620726465066\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.6890756302521008,\n \"acc_stderr\": 0.030066761582977927,\n\
\ \"acc_norm\": 0.6890756302521008,\n \"acc_norm_stderr\": 0.030066761582977927\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.3509933774834437,\n \"acc_stderr\": 0.03896981964257375,\n \"\
acc_norm\": 0.3509933774834437,\n \"acc_norm_stderr\": 0.03896981964257375\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8458715596330275,\n \"acc_stderr\": 0.015480826865374303,\n \"\
acc_norm\": 0.8458715596330275,\n \"acc_norm_stderr\": 0.015480826865374303\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.5185185185185185,\n \"acc_stderr\": 0.03407632093854051,\n \"\
acc_norm\": 0.5185185185185185,\n \"acc_norm_stderr\": 0.03407632093854051\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.8235294117647058,\n \"acc_stderr\": 0.02675640153807896,\n \"\
acc_norm\": 0.8235294117647058,\n \"acc_norm_stderr\": 0.02675640153807896\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.810126582278481,\n \"acc_stderr\": 0.025530100460233483,\n \
\ \"acc_norm\": 0.810126582278481,\n \"acc_norm_stderr\": 0.025530100460233483\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6860986547085202,\n\
\ \"acc_stderr\": 0.031146796482972465,\n \"acc_norm\": 0.6860986547085202,\n\
\ \"acc_norm_stderr\": 0.031146796482972465\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.7709923664122137,\n \"acc_stderr\": 0.036853466317118506,\n\
\ \"acc_norm\": 0.7709923664122137,\n \"acc_norm_stderr\": 0.036853466317118506\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.7933884297520661,\n \"acc_stderr\": 0.03695980128098824,\n \"\
acc_norm\": 0.7933884297520661,\n \"acc_norm_stderr\": 0.03695980128098824\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7870370370370371,\n\
\ \"acc_stderr\": 0.0395783547198098,\n \"acc_norm\": 0.7870370370370371,\n\
\ \"acc_norm_stderr\": 0.0395783547198098\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7668711656441718,\n \"acc_stderr\": 0.0332201579577674,\n\
\ \"acc_norm\": 0.7668711656441718,\n \"acc_norm_stderr\": 0.0332201579577674\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.49107142857142855,\n\
\ \"acc_stderr\": 0.04745033255489123,\n \"acc_norm\": 0.49107142857142855,\n\
\ \"acc_norm_stderr\": 0.04745033255489123\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7961165048543689,\n \"acc_stderr\": 0.03989139859531771,\n\
\ \"acc_norm\": 0.7961165048543689,\n \"acc_norm_stderr\": 0.03989139859531771\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8717948717948718,\n\
\ \"acc_stderr\": 0.02190190511507333,\n \"acc_norm\": 0.8717948717948718,\n\
\ \"acc_norm_stderr\": 0.02190190511507333\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.73,\n \"acc_stderr\": 0.0446196043338474,\n \
\ \"acc_norm\": 0.73,\n \"acc_norm_stderr\": 0.0446196043338474\n },\n\
\ \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8263090676883781,\n\
\ \"acc_stderr\": 0.013547415658662257,\n \"acc_norm\": 0.8263090676883781,\n\
\ \"acc_norm_stderr\": 0.013547415658662257\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.7196531791907514,\n \"acc_stderr\": 0.02418242749657761,\n\
\ \"acc_norm\": 0.7196531791907514,\n \"acc_norm_stderr\": 0.02418242749657761\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.36089385474860336,\n\
\ \"acc_stderr\": 0.01606229067111047,\n \"acc_norm\": 0.36089385474860336,\n\
\ \"acc_norm_stderr\": 0.01606229067111047\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.7483660130718954,\n \"acc_stderr\": 0.024848018263875192,\n\
\ \"acc_norm\": 0.7483660130718954,\n \"acc_norm_stderr\": 0.024848018263875192\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7266881028938906,\n\
\ \"acc_stderr\": 0.025311765975426122,\n \"acc_norm\": 0.7266881028938906,\n\
\ \"acc_norm_stderr\": 0.025311765975426122\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.7469135802469136,\n \"acc_stderr\": 0.024191808600713,\n\
\ \"acc_norm\": 0.7469135802469136,\n \"acc_norm_stderr\": 0.024191808600713\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.48226950354609927,\n \"acc_stderr\": 0.02980873964223777,\n \
\ \"acc_norm\": 0.48226950354609927,\n \"acc_norm_stderr\": 0.02980873964223777\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4511082138200782,\n\
\ \"acc_stderr\": 0.012709037347346233,\n \"acc_norm\": 0.4511082138200782,\n\
\ \"acc_norm_stderr\": 0.012709037347346233\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.7022058823529411,\n \"acc_stderr\": 0.02777829870154544,\n\
\ \"acc_norm\": 0.7022058823529411,\n \"acc_norm_stderr\": 0.02777829870154544\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6683006535947712,\n \"acc_stderr\": 0.01904748523936038,\n \
\ \"acc_norm\": 0.6683006535947712,\n \"acc_norm_stderr\": 0.01904748523936038\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6636363636363637,\n\
\ \"acc_stderr\": 0.04525393596302506,\n \"acc_norm\": 0.6636363636363637,\n\
\ \"acc_norm_stderr\": 0.04525393596302506\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.7346938775510204,\n \"acc_stderr\": 0.028263889943784596,\n\
\ \"acc_norm\": 0.7346938775510204,\n \"acc_norm_stderr\": 0.028263889943784596\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8407960199004975,\n\
\ \"acc_stderr\": 0.02587064676616913,\n \"acc_norm\": 0.8407960199004975,\n\
\ \"acc_norm_stderr\": 0.02587064676616913\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.88,\n \"acc_stderr\": 0.03265986323710906,\n \
\ \"acc_norm\": 0.88,\n \"acc_norm_stderr\": 0.03265986323710906\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5240963855421686,\n\
\ \"acc_stderr\": 0.03887971849597264,\n \"acc_norm\": 0.5240963855421686,\n\
\ \"acc_norm_stderr\": 0.03887971849597264\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8304093567251462,\n \"acc_stderr\": 0.02878210810540171,\n\
\ \"acc_norm\": 0.8304093567251462,\n \"acc_norm_stderr\": 0.02878210810540171\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.4320685434516524,\n\
\ \"mc1_stderr\": 0.01734120239498826,\n \"mc2\": 0.6006356075996195,\n\
\ \"mc2_stderr\": 0.015505899675520648\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7924230465666929,\n \"acc_stderr\": 0.011398593419386784\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.6072782410917361,\n \
\ \"acc_stderr\": 0.013451745349586576\n }\n}\n```"
repo_url: https://huggingface.co/MisterRid/saulgoodman-2x7b-alpha1
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_12_20T22_30_24.854096
path:
- '**/details_harness|arc:challenge|25_2023-12-20T22-30-24.854096.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-12-20T22-30-24.854096.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_12_20T22_30_24.854096
path:
- '**/details_harness|gsm8k|5_2023-12-20T22-30-24.854096.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-12-20T22-30-24.854096.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_12_20T22_30_24.854096
path:
- '**/details_harness|hellaswag|10_2023-12-20T22-30-24.854096.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-12-20T22-30-24.854096.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_12_20T22_30_24.854096
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-20T22-30-24.854096.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-20T22-30-24.854096.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-20T22-30-24.854096.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-20T22-30-24.854096.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-20T22-30-24.854096.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-20T22-30-24.854096.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-20T22-30-24.854096.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-20T22-30-24.854096.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-20T22-30-24.854096.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-20T22-30-24.854096.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-20T22-30-24.854096.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-20T22-30-24.854096.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-20T22-30-24.854096.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-20T22-30-24.854096.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-20T22-30-24.854096.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-20T22-30-24.854096.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-20T22-30-24.854096.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-20T22-30-24.854096.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-20T22-30-24.854096.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-20T22-30-24.854096.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-20T22-30-24.854096.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-20T22-30-24.854096.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-20T22-30-24.854096.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-20T22-30-24.854096.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-20T22-30-24.854096.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-20T22-30-24.854096.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-20T22-30-24.854096.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-20T22-30-24.854096.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-20T22-30-24.854096.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-20T22-30-24.854096.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-20T22-30-24.854096.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-20T22-30-24.854096.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-20T22-30-24.854096.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-20T22-30-24.854096.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-12-20T22-30-24.854096.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-20T22-30-24.854096.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-20T22-30-24.854096.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-20T22-30-24.854096.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-12-20T22-30-24.854096.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-12-20T22-30-24.854096.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-20T22-30-24.854096.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-20T22-30-24.854096.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-20T22-30-24.854096.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-20T22-30-24.854096.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-20T22-30-24.854096.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-20T22-30-24.854096.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-20T22-30-24.854096.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-20T22-30-24.854096.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-20T22-30-24.854096.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-20T22-30-24.854096.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-20T22-30-24.854096.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-20T22-30-24.854096.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-20T22-30-24.854096.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-12-20T22-30-24.854096.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-20T22-30-24.854096.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-12-20T22-30-24.854096.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-20T22-30-24.854096.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-20T22-30-24.854096.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-20T22-30-24.854096.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-20T22-30-24.854096.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-20T22-30-24.854096.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-20T22-30-24.854096.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-20T22-30-24.854096.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-20T22-30-24.854096.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-20T22-30-24.854096.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-20T22-30-24.854096.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-20T22-30-24.854096.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-20T22-30-24.854096.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-20T22-30-24.854096.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-20T22-30-24.854096.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-20T22-30-24.854096.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-20T22-30-24.854096.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-20T22-30-24.854096.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-20T22-30-24.854096.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-20T22-30-24.854096.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-20T22-30-24.854096.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-20T22-30-24.854096.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-20T22-30-24.854096.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-20T22-30-24.854096.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-20T22-30-24.854096.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-20T22-30-24.854096.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-20T22-30-24.854096.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-20T22-30-24.854096.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-20T22-30-24.854096.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-20T22-30-24.854096.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-20T22-30-24.854096.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-20T22-30-24.854096.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-20T22-30-24.854096.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-20T22-30-24.854096.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-20T22-30-24.854096.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-20T22-30-24.854096.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-12-20T22-30-24.854096.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-20T22-30-24.854096.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-20T22-30-24.854096.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-20T22-30-24.854096.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-12-20T22-30-24.854096.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-12-20T22-30-24.854096.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-20T22-30-24.854096.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-20T22-30-24.854096.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-20T22-30-24.854096.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-20T22-30-24.854096.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-20T22-30-24.854096.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-20T22-30-24.854096.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-20T22-30-24.854096.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-20T22-30-24.854096.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-20T22-30-24.854096.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-20T22-30-24.854096.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-20T22-30-24.854096.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-20T22-30-24.854096.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-20T22-30-24.854096.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-12-20T22-30-24.854096.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-20T22-30-24.854096.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-12-20T22-30-24.854096.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-20T22-30-24.854096.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_12_20T22_30_24.854096
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-20T22-30-24.854096.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-20T22-30-24.854096.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_12_20T22_30_24.854096
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-20T22-30-24.854096.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-20T22-30-24.854096.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_12_20T22_30_24.854096
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-20T22-30-24.854096.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-20T22-30-24.854096.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_12_20T22_30_24.854096
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-20T22-30-24.854096.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-20T22-30-24.854096.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_12_20T22_30_24.854096
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-20T22-30-24.854096.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-20T22-30-24.854096.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_12_20T22_30_24.854096
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-20T22-30-24.854096.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-20T22-30-24.854096.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_12_20T22_30_24.854096
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-20T22-30-24.854096.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-20T22-30-24.854096.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_12_20T22_30_24.854096
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-20T22-30-24.854096.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-20T22-30-24.854096.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_12_20T22_30_24.854096
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-20T22-30-24.854096.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-20T22-30-24.854096.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_12_20T22_30_24.854096
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-20T22-30-24.854096.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-20T22-30-24.854096.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_12_20T22_30_24.854096
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-20T22-30-24.854096.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-20T22-30-24.854096.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_12_20T22_30_24.854096
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-20T22-30-24.854096.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-20T22-30-24.854096.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_12_20T22_30_24.854096
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-20T22-30-24.854096.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-20T22-30-24.854096.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_12_20T22_30_24.854096
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-20T22-30-24.854096.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-20T22-30-24.854096.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_12_20T22_30_24.854096
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-20T22-30-24.854096.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-20T22-30-24.854096.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_12_20T22_30_24.854096
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-20T22-30-24.854096.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-20T22-30-24.854096.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_12_20T22_30_24.854096
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-20T22-30-24.854096.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-20T22-30-24.854096.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_12_20T22_30_24.854096
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-20T22-30-24.854096.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-20T22-30-24.854096.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_12_20T22_30_24.854096
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-20T22-30-24.854096.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-20T22-30-24.854096.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_12_20T22_30_24.854096
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-20T22-30-24.854096.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-20T22-30-24.854096.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_12_20T22_30_24.854096
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-20T22-30-24.854096.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-20T22-30-24.854096.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_12_20T22_30_24.854096
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-20T22-30-24.854096.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-20T22-30-24.854096.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_12_20T22_30_24.854096
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-20T22-30-24.854096.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-20T22-30-24.854096.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_12_20T22_30_24.854096
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-20T22-30-24.854096.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-20T22-30-24.854096.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_12_20T22_30_24.854096
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-20T22-30-24.854096.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-20T22-30-24.854096.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_12_20T22_30_24.854096
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-20T22-30-24.854096.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-20T22-30-24.854096.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_12_20T22_30_24.854096
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-20T22-30-24.854096.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-20T22-30-24.854096.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_12_20T22_30_24.854096
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-20T22-30-24.854096.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-20T22-30-24.854096.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_12_20T22_30_24.854096
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-20T22-30-24.854096.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-20T22-30-24.854096.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_12_20T22_30_24.854096
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-20T22-30-24.854096.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-20T22-30-24.854096.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_12_20T22_30_24.854096
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-20T22-30-24.854096.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-20T22-30-24.854096.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_12_20T22_30_24.854096
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-20T22-30-24.854096.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-20T22-30-24.854096.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_12_20T22_30_24.854096
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-20T22-30-24.854096.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-20T22-30-24.854096.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_12_20T22_30_24.854096
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-20T22-30-24.854096.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-20T22-30-24.854096.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_12_20T22_30_24.854096
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-12-20T22-30-24.854096.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-12-20T22-30-24.854096.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_12_20T22_30_24.854096
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-20T22-30-24.854096.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-20T22-30-24.854096.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_12_20T22_30_24.854096
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-20T22-30-24.854096.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-20T22-30-24.854096.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_12_20T22_30_24.854096
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-20T22-30-24.854096.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-20T22-30-24.854096.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_12_20T22_30_24.854096
path:
- '**/details_harness|hendrycksTest-management|5_2023-12-20T22-30-24.854096.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-12-20T22-30-24.854096.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_12_20T22_30_24.854096
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-12-20T22-30-24.854096.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-12-20T22-30-24.854096.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_12_20T22_30_24.854096
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-20T22-30-24.854096.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-20T22-30-24.854096.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_12_20T22_30_24.854096
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-20T22-30-24.854096.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-20T22-30-24.854096.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_12_20T22_30_24.854096
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-20T22-30-24.854096.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-20T22-30-24.854096.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_12_20T22_30_24.854096
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-20T22-30-24.854096.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-20T22-30-24.854096.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_12_20T22_30_24.854096
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-20T22-30-24.854096.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-20T22-30-24.854096.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_12_20T22_30_24.854096
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-20T22-30-24.854096.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-20T22-30-24.854096.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_12_20T22_30_24.854096
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-20T22-30-24.854096.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-20T22-30-24.854096.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_12_20T22_30_24.854096
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-20T22-30-24.854096.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-20T22-30-24.854096.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_12_20T22_30_24.854096
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-20T22-30-24.854096.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-20T22-30-24.854096.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_12_20T22_30_24.854096
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-20T22-30-24.854096.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-20T22-30-24.854096.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_12_20T22_30_24.854096
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-20T22-30-24.854096.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-20T22-30-24.854096.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_12_20T22_30_24.854096
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-20T22-30-24.854096.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-20T22-30-24.854096.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_12_20T22_30_24.854096
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-20T22-30-24.854096.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-20T22-30-24.854096.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_12_20T22_30_24.854096
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-12-20T22-30-24.854096.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-12-20T22-30-24.854096.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_12_20T22_30_24.854096
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-20T22-30-24.854096.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-20T22-30-24.854096.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_12_20T22_30_24.854096
path:
- '**/details_harness|hendrycksTest-virology|5_2023-12-20T22-30-24.854096.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-12-20T22-30-24.854096.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_12_20T22_30_24.854096
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-20T22-30-24.854096.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-20T22-30-24.854096.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_12_20T22_30_24.854096
path:
- '**/details_harness|truthfulqa:mc|0_2023-12-20T22-30-24.854096.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-12-20T22-30-24.854096.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_12_20T22_30_24.854096
path:
- '**/details_harness|winogrande|5_2023-12-20T22-30-24.854096.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-12-20T22-30-24.854096.parquet'
- config_name: results
data_files:
- split: 2023_12_20T22_30_24.854096
path:
- results_2023-12-20T22-30-24.854096.parquet
- split: latest
path:
- results_2023-12-20T22-30-24.854096.parquet
---
# Dataset Card for Evaluation run of MisterRid/saulgoodman-2x7b-alpha1
Dataset automatically created during the evaluation run of model [MisterRid/saulgoodman-2x7b-alpha1](https://huggingface.co/MisterRid/saulgoodman-2x7b-alpha1) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.

The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the latest results.

An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_MisterRid__saulgoodman-2x7b-alpha1",
	"harness_winogrande_5",
	split="latest")
```
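Each MMLU subtask has its own configuration, named after a predictable pattern (see the `config_name` entries in the metadata above, e.g. `harness_hendrycksTest_abstract_algebra_5`). As a minimal sketch, a small helper (hypothetical, not part of the `datasets` API) can build these config names:

```python
def mmlu_config_name(subtask: str, n_shot: int = 5) -> str:
    """Build the per-subtask config name used by this dataset.

    Mirrors the `config_name` entries in the card metadata, e.g.
    `harness_hendrycksTest_abstract_algebra_5` for 5-shot abstract algebra.
    """
    return f"harness_hendrycksTest_{subtask}_{n_shot}"


# Usage (requires network access and the `datasets` library):
# from datasets import load_dataset
# data = load_dataset(
#     "open-llm-leaderboard/details_MisterRid__saulgoodman-2x7b-alpha1",
#     mmlu_config_name("abstract_algebra"),
#     split="latest",
# )
```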
## Latest results
These are the [latest results from run 2023-12-20T22:30:24.854096](https://huggingface.co/datasets/open-llm-leaderboard/details_MisterRid__saulgoodman-2x7b-alpha1/blob/main/results_2023-12-20T22-30-24.854096.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each task in the "results" config and in the "latest" split of each eval):
```python
{
"all": {
"acc": 0.6511172644532761,
"acc_stderr": 0.03210282974235949,
"acc_norm": 0.6531681519171342,
"acc_norm_stderr": 0.032744836386602465,
"mc1": 0.4320685434516524,
"mc1_stderr": 0.01734120239498826,
"mc2": 0.6006356075996195,
"mc2_stderr": 0.015505899675520648
},
"harness|arc:challenge|25": {
"acc": 0.6262798634812287,
"acc_stderr": 0.014137708601759093,
"acc_norm": 0.6621160409556314,
"acc_norm_stderr": 0.01382204792228351
},
"harness|hellaswag|10": {
"acc": 0.6709818761202948,
"acc_stderr": 0.004688963175758129,
"acc_norm": 0.8536148177653854,
"acc_norm_stderr": 0.003527695149823515
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.37,
"acc_stderr": 0.04852365870939099,
"acc_norm": 0.37,
"acc_norm_stderr": 0.04852365870939099
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6074074074074074,
"acc_stderr": 0.0421850621536888,
"acc_norm": 0.6074074074074074,
"acc_norm_stderr": 0.0421850621536888
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6907894736842105,
"acc_stderr": 0.037610708698674805,
"acc_norm": 0.6907894736842105,
"acc_norm_stderr": 0.037610708698674805
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.61,
"acc_stderr": 0.04902071300001975,
"acc_norm": 0.61,
"acc_norm_stderr": 0.04902071300001975
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.7018867924528301,
"acc_stderr": 0.02815283794249387,
"acc_norm": 0.7018867924528301,
"acc_norm_stderr": 0.02815283794249387
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7638888888888888,
"acc_stderr": 0.03551446610810826,
"acc_norm": 0.7638888888888888,
"acc_norm_stderr": 0.03551446610810826
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.47,
"acc_stderr": 0.05016135580465919,
"acc_norm": 0.47,
"acc_norm_stderr": 0.05016135580465919
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.52,
"acc_stderr": 0.050211673156867795,
"acc_norm": 0.52,
"acc_norm_stderr": 0.050211673156867795
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.35,
"acc_stderr": 0.047937248544110175,
"acc_norm": 0.35,
"acc_norm_stderr": 0.047937248544110175
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6416184971098265,
"acc_stderr": 0.036563436533531585,
"acc_norm": 0.6416184971098265,
"acc_norm_stderr": 0.036563436533531585
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.39215686274509803,
"acc_stderr": 0.048580835742663454,
"acc_norm": 0.39215686274509803,
"acc_norm_stderr": 0.048580835742663454
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.8,
"acc_stderr": 0.04020151261036845,
"acc_norm": 0.8,
"acc_norm_stderr": 0.04020151261036845
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5914893617021276,
"acc_stderr": 0.032134180267015755,
"acc_norm": 0.5914893617021276,
"acc_norm_stderr": 0.032134180267015755
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.5,
"acc_stderr": 0.047036043419179864,
"acc_norm": 0.5,
"acc_norm_stderr": 0.047036043419179864
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5862068965517241,
"acc_stderr": 0.04104269211806232,
"acc_norm": 0.5862068965517241,
"acc_norm_stderr": 0.04104269211806232
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.4126984126984127,
"acc_stderr": 0.025355741263055273,
"acc_norm": 0.4126984126984127,
"acc_norm_stderr": 0.025355741263055273
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.4523809523809524,
"acc_stderr": 0.04451807959055328,
"acc_norm": 0.4523809523809524,
"acc_norm_stderr": 0.04451807959055328
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.4,
"acc_stderr": 0.04923659639173309,
"acc_norm": 0.4,
"acc_norm_stderr": 0.04923659639173309
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.8096774193548387,
"acc_stderr": 0.022331707611823078,
"acc_norm": 0.8096774193548387,
"acc_norm_stderr": 0.022331707611823078
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5320197044334976,
"acc_stderr": 0.03510766597959215,
"acc_norm": 0.5320197044334976,
"acc_norm_stderr": 0.03510766597959215
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.7,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.7,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7636363636363637,
"acc_stderr": 0.03317505930009181,
"acc_norm": 0.7636363636363637,
"acc_norm_stderr": 0.03317505930009181
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7929292929292929,
"acc_stderr": 0.028869778460267042,
"acc_norm": 0.7929292929292929,
"acc_norm_stderr": 0.028869778460267042
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8963730569948186,
"acc_stderr": 0.02199531196364424,
"acc_norm": 0.8963730569948186,
"acc_norm_stderr": 0.02199531196364424
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6615384615384615,
"acc_stderr": 0.02399150050031304,
"acc_norm": 0.6615384615384615,
"acc_norm_stderr": 0.02399150050031304
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.36666666666666664,
"acc_stderr": 0.029381620726465066,
"acc_norm": 0.36666666666666664,
"acc_norm_stderr": 0.029381620726465066
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6890756302521008,
"acc_stderr": 0.030066761582977927,
"acc_norm": 0.6890756302521008,
"acc_norm_stderr": 0.030066761582977927
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.3509933774834437,
"acc_stderr": 0.03896981964257375,
"acc_norm": 0.3509933774834437,
"acc_norm_stderr": 0.03896981964257375
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8458715596330275,
"acc_stderr": 0.015480826865374303,
"acc_norm": 0.8458715596330275,
"acc_norm_stderr": 0.015480826865374303
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5185185185185185,
"acc_stderr": 0.03407632093854051,
"acc_norm": 0.5185185185185185,
"acc_norm_stderr": 0.03407632093854051
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8235294117647058,
"acc_stderr": 0.02675640153807896,
"acc_norm": 0.8235294117647058,
"acc_norm_stderr": 0.02675640153807896
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.810126582278481,
"acc_stderr": 0.025530100460233483,
"acc_norm": 0.810126582278481,
"acc_norm_stderr": 0.025530100460233483
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6860986547085202,
"acc_stderr": 0.031146796482972465,
"acc_norm": 0.6860986547085202,
"acc_norm_stderr": 0.031146796482972465
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7709923664122137,
"acc_stderr": 0.036853466317118506,
"acc_norm": 0.7709923664122137,
"acc_norm_stderr": 0.036853466317118506
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7933884297520661,
"acc_stderr": 0.03695980128098824,
"acc_norm": 0.7933884297520661,
"acc_norm_stderr": 0.03695980128098824
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7870370370370371,
"acc_stderr": 0.0395783547198098,
"acc_norm": 0.7870370370370371,
"acc_norm_stderr": 0.0395783547198098
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7668711656441718,
"acc_stderr": 0.0332201579577674,
"acc_norm": 0.7668711656441718,
"acc_norm_stderr": 0.0332201579577674
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.49107142857142855,
"acc_stderr": 0.04745033255489123,
"acc_norm": 0.49107142857142855,
"acc_norm_stderr": 0.04745033255489123
},
"harness|hendrycksTest-management|5": {
"acc": 0.7961165048543689,
"acc_stderr": 0.03989139859531771,
"acc_norm": 0.7961165048543689,
"acc_norm_stderr": 0.03989139859531771
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8717948717948718,
"acc_stderr": 0.02190190511507333,
"acc_norm": 0.8717948717948718,
"acc_norm_stderr": 0.02190190511507333
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.73,
"acc_stderr": 0.0446196043338474,
"acc_norm": 0.73,
"acc_norm_stderr": 0.0446196043338474
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8263090676883781,
"acc_stderr": 0.013547415658662257,
"acc_norm": 0.8263090676883781,
"acc_norm_stderr": 0.013547415658662257
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7196531791907514,
"acc_stderr": 0.02418242749657761,
"acc_norm": 0.7196531791907514,
"acc_norm_stderr": 0.02418242749657761
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.36089385474860336,
"acc_stderr": 0.01606229067111047,
"acc_norm": 0.36089385474860336,
"acc_norm_stderr": 0.01606229067111047
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7483660130718954,
"acc_stderr": 0.024848018263875192,
"acc_norm": 0.7483660130718954,
"acc_norm_stderr": 0.024848018263875192
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7266881028938906,
"acc_stderr": 0.025311765975426122,
"acc_norm": 0.7266881028938906,
"acc_norm_stderr": 0.025311765975426122
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7469135802469136,
"acc_stderr": 0.024191808600713,
"acc_norm": 0.7469135802469136,
"acc_norm_stderr": 0.024191808600713
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.48226950354609927,
"acc_stderr": 0.02980873964223777,
"acc_norm": 0.48226950354609927,
"acc_norm_stderr": 0.02980873964223777
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4511082138200782,
"acc_stderr": 0.012709037347346233,
"acc_norm": 0.4511082138200782,
"acc_norm_stderr": 0.012709037347346233
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.7022058823529411,
"acc_stderr": 0.02777829870154544,
"acc_norm": 0.7022058823529411,
"acc_norm_stderr": 0.02777829870154544
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6683006535947712,
"acc_stderr": 0.01904748523936038,
"acc_norm": 0.6683006535947712,
"acc_norm_stderr": 0.01904748523936038
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6636363636363637,
"acc_stderr": 0.04525393596302506,
"acc_norm": 0.6636363636363637,
"acc_norm_stderr": 0.04525393596302506
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7346938775510204,
"acc_stderr": 0.028263889943784596,
"acc_norm": 0.7346938775510204,
"acc_norm_stderr": 0.028263889943784596
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8407960199004975,
"acc_stderr": 0.02587064676616913,
"acc_norm": 0.8407960199004975,
"acc_norm_stderr": 0.02587064676616913
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.88,
"acc_stderr": 0.03265986323710906,
"acc_norm": 0.88,
"acc_norm_stderr": 0.03265986323710906
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5240963855421686,
"acc_stderr": 0.03887971849597264,
"acc_norm": 0.5240963855421686,
"acc_norm_stderr": 0.03887971849597264
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8304093567251462,
"acc_stderr": 0.02878210810540171,
"acc_norm": 0.8304093567251462,
"acc_norm_stderr": 0.02878210810540171
},
"harness|truthfulqa:mc|0": {
"mc1": 0.4320685434516524,
"mc1_stderr": 0.01734120239498826,
"mc2": 0.6006356075996195,
"mc2_stderr": 0.015505899675520648
},
"harness|winogrande|5": {
"acc": 0.7924230465666929,
"acc_stderr": 0.011398593419386784
},
"harness|gsm8k|5": {
"acc": 0.6072782410917361,
"acc_stderr": 0.013451745349586576
}
}
```
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
mmakipaa/shs_descriptions | ---
language:
- en
license:
- cc-by-4.0
pretty_name: Social and Health Care Service Descriptions
---
# Social and Health Care Service Descriptions
This repository hosts a dataset of service descriptions for social and health care services provided by the city of Helsinki.
The data is sourced from the *TPR Service Description Register REST API* (see [here](https://www.hel.fi/palvelukarttaws/restpages/palvelurekisteri_en.html) for more details).
The service descriptions are shared under a Creative Commons 4.0 BY license, explicitly permitting data sharing and remixing.
## Dataset Construction
The dataset was constructed by fetching service descriptions from the REST API. The query targeted services under the `SOCIAL_AND_HEALTH_SERVICES` main theme and returned 361 services with English language descriptions.
```python
import requests

# Fetch English-language descriptions of all social and health care services
url = 'https://www.hel.fi/palvelukarttaws/rest/vpalvelurekisteri/description/'
params = {'maintheme': 'SOCIAL_AND_HEALTH_SERVICES',
          'alldata': 'yes',
          'language': 'en'}
shs_json = requests.get(url, params=params).json()
```
Each service description may link to additional errand services; the 64 errand service descriptions linked in this way were also retrieved:
```python
# Collect the ids of the errand services linked from each service description
errand_services_list = []
for shs in shs_json:
    errand_services_list += shs['exact_errand_services']

# De-duplicate the ids and fetch each linked errand service description
errand_service_set = set(errand_services_list)
errand_service_descriptions = []
for errand_service_id in errand_service_set:
    url = 'https://www.hel.fi/palvelukarttaws/rest/vpalvelurekisteri/errandservice/' + str(errand_service_id)
    params = {'language': 'en', 'alldata': 'yes'}
    errand_service_descriptions.append(requests.get(url, params=params).json())
```
## Dataset contents
The dataset is comprised of two files:
1. `shs_descriptions.json.gz`: This file combines the JSON service descriptions returned by the API, the text descriptions, and the text embeddings into a single JSON file.
2. `chroma.sqlite3`: This is a Chroma DB file that uses the text embeddings to index the service text descriptions.
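The combined file can be inspected with the Python standard library alone. A minimal sketch, assuming the file is a single gzip-compressed JSON document as described above (the function name is illustrative):

```python
import gzip
import json

def load_descriptions(path='shs_descriptions.json.gz'):
    # The archive is a gzip-compressed JSON document combining the API JSON,
    # the refined text descriptions, and the text embeddings for each service
    with gzip.open(path, 'rt', encoding='utf-8') as f:
        return json.load(f)
```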
### Service Descriptions
The service descriptions are provided in JSON format as returned by the API.
### Text descriptions
Text descriptions combine service information from the JSON description and linked errand services into a single description for each service. The descriptions have been refined: relevant fields were selected, and additional processing was applied to, for example, the target groups and contact channels associated with the services.
### Text Embeddings
Embeddings of the text descriptions have been created using OpenAI's `text-embedding-ada-002` model.
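These embeddings support semantic similarity search over the service descriptions. As an illustration only (toy three-dimensional vectors stand in for the 1536-dimensional ada-002 vectors, and the service names are made up), cosine similarity can rank services against a query embedding:

```python
import math

def cosine_similarity(a, b):
    # Cosine similarity between two equal-length embedding vectors
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(y * y for y in b))
    return dot / (norm_a * norm_b)

# Toy vectors; real text-embedding-ada-002 vectors have 1536 dimensions
query_embedding = [0.1, 0.3, 0.5]
service_embeddings = {
    'Oral health care': [0.1, 0.3, 0.5],
    'Home care support': [0.9, 0.1, 0.0],
}
ranked = sorted(service_embeddings,
                key=lambda name: cosine_similarity(query_embedding, service_embeddings[name]),
                reverse=True)
print(ranked[0])  # the service whose embedding best matches the query
```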
|
AlekseyKorshuk/PIPPA-lmgym | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
dataset_info:
features:
- name: input_text
dtype: string
- name: output_text
dtype: string
splits:
- name: train
num_bytes: 32569932093
num_examples: 398603
download_size: 443538444
dataset_size: 32569932093
---
# Dataset Card for "PIPPA-lmgym"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
CyberHarem/hinatsuki_mikan_thedemongirlnextdoor | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of Hinatsuki Mikan
This is the dataset of Hinatsuki Mikan, containing 291 images and their tags.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs)([huggingface organization](https://huggingface.co/deepghs)).
| Name | Images | Download | Description |
|:------------|---------:|:------------------------------------|:-------------------------------------------------------------------------|
| raw | 291 | [Download](dataset-raw.zip) | Raw data with meta information. |
| raw-stage3 | 700 | [Download](dataset-raw-stage3.zip) | 3-stage cropped raw data with meta information. |
| 384x512 | 291 | [Download](dataset-384x512.zip) | 384x512 aligned dataset. |
| 512x512 | 291 | [Download](dataset-512x512.zip) | 512x512 aligned dataset. |
| 512x704 | 291 | [Download](dataset-512x704.zip) | 512x704 aligned dataset. |
| 640x640 | 291 | [Download](dataset-640x640.zip) | 640x640 aligned dataset. |
| 640x880 | 291 | [Download](dataset-640x880.zip) | 640x880 aligned dataset. |
| stage3-640 | 700 | [Download](dataset-stage3-640.zip) | 3-stage cropped dataset with the shorter side not exceeding 640 pixels. |
| stage3-800 | 700 | [Download](dataset-stage3-800.zip) | 3-stage cropped dataset with the shorter side not exceeding 800 pixels. |
| stage3-1200 | 700 | [Download](dataset-stage3-1200.zip) | 3-stage cropped dataset with the shorter side not exceeding 1200 pixels. |
|
jilp00/YouToks-Instruct-Quantum-Physics-I | ---
dataset_info:
features:
- name: text
dtype: string
- name: token_count
dtype: int64
- name: response
dtype: string
- name: system_prompt
dtype: string
- name: question
dtype: string
- name: __index_level_0__
dtype: int64
splits:
- name: train
num_bytes: 2291867
num_examples: 942
download_size: 1118705
dataset_size: 2291867
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
arbitropy/phi-ft-coqa-format | ---
dataset_info:
features:
- name: prompt
dtype: string
splits:
- name: train
num_bytes: 225741
num_examples: 89
download_size: 47383
dataset_size: 225741
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
autoevaluate/autoeval-staging-eval-project-7a996eab-fd9f-4453-b298-d76d6134fbe7-111108 | ---
type: predictions
tags:
- autotrain
- evaluation
datasets:
- glue
eval_info:
task: binary_classification
model: autoevaluate/binary-classification
metrics: ['matthews_correlation']
dataset_name: glue
dataset_config: sst2
dataset_split: validation
col_mapping:
text: sentence
target: label
---
# Dataset Card for AutoTrain Evaluator
This repository contains model predictions generated by [AutoTrain](https://huggingface.co/autotrain) for the following task and dataset:
* Task: Binary Text Classification
* Model: autoevaluate/binary-classification
* Dataset: glue
* Config: sst2
* Split: validation
To run new evaluation jobs, visit Hugging Face's [automatic model evaluator](https://huggingface.co/spaces/autoevaluate/model-evaluator).
## Contributions
Thanks to [@lewtun](https://huggingface.co/lewtun) for evaluating this model. |
harishvs/ecommerce-faq-llama2-QA | ---
language:
- en
license: apache-2.0
size_categories:
- n<1K
task_categories:
- question-answering
dataset_info:
features:
- name: instruction
dtype: string
- name: context
dtype: string
- name: response
dtype: string
- name: category
dtype: string
splits:
- name: train
num_bytes: 118516
num_examples: 78
download_size: 33845
dataset_size: 118516
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
SEACrowd/code_mixed_jv_id | ---
tags:
- sentiment-analysis
- machine-translation
language:
- jav
- ind
---
# code_mixed_jv_id
Sentiment analysis and machine translation data for Javanese and Indonesian.
## Dataset Usage
Run `pip install nusacrowd` before loading the dataset through HuggingFace's `load_dataset`.
## Citation
```
@article{Tho_2021,
doi = {10.1088/1742-6596/1869/1/012084},
url = {https://doi.org/10.1088/1742-6596/1869/1/012084},
year = 2021,
month = {apr},
publisher = {{IOP} Publishing},
volume = {1869},
number = {1},
pages = {012084},
author = {C Tho and Y Heryadi and L Lukas and A Wibowo},
title = {Code-mixed sentiment analysis of Indonesian language and Javanese language using Lexicon based approach},
journal = {Journal of Physics: Conference Series},
abstract = {Nowadays mixing one language with another language either in
spoken or written communication has become a common practice for bilingual
speakers in daily conversation as well as in social media. Lexicon based
approach is one of the approaches in extracting the sentiment analysis. This
study is aimed to compare two lexicon models which are SentiNetWord and VADER
in extracting the polarity of the code-mixed sentences in Indonesian language
and Javanese language. 3,963 tweets were gathered from two accounts that
provide code-mixed tweets. Pre-processing such as removing duplicates,
translating to English, filter special characters, transform lower case and
filter stop words were conducted on the tweets. Positive and negative word
score from lexicon model was then calculated using simple mathematic formula
in order to classify the polarity. By comparing with the manual labelling,
the result showed that SentiNetWord perform better than VADER in negative
sentiments. However, both of the lexicon model did not perform well in
neutral and positive sentiments. On overall performance, VADER showed better
performance than SentiNetWord. This study showed that the reason for the
misclassified was that most of Indonesian language and Javanese language
consist of words that were considered as positive in both Lexicon model.}
}
```
## License
cc_by_3.0
## Homepage
[https://iopscience.iop.org/article/10.1088/1742-6596/1869/1/012084](https://iopscience.iop.org/article/10.1088/1742-6596/1869/1/012084)
### NusaCatalogue
For easy indexing and metadata: [https://indonlp.github.io/nusa-catalogue](https://indonlp.github.io/nusa-catalogue) |
CyberHarem/nakano_nino_gotoubunnohanayome | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of Nakano Nino/中野二乃/ιΏεδΊδΉ (Gotoubun no Hanayome)
This is the dataset of Nakano Nino/中野二乃/ιΏεδΊδΉ (Gotoubun no Hanayome), containing 537 images and their tags.
The core tags of this character are `pink_hair, blunt_bangs, hair_ornament, ribbon, butterfly_hair_ornament, black_ribbon, hair_ribbon, blue_eyes, two_side_up, long_hair, breasts, short_hair, red_hair`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs)([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:--------------------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 537 | 380.21 MiB | [Download](https://huggingface.co/datasets/CyberHarem/nakano_nino_gotoubunnohanayome/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 1200 | 537 | 364.76 MiB | [Download](https://huggingface.co/datasets/CyberHarem/nakano_nino_gotoubunnohanayome/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 1117 | 687.09 MiB | [Download](https://huggingface.co/datasets/CyberHarem/nakano_nino_gotoubunnohanayome/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need this, just run the following code
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/nakano_nino_gotoubunnohanayome',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering result, maybe some outfits can be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:----------------------------------|:----------------------------------|:----------------------------------|:----------------------------------|:----------------------------------|:---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 6 |  |  |  |  |  | 2girls, sisters, blush, open_mouth, solo_focus |
| 1 | 6 |  |  |  |  |  | 2girls, blush, sisters, closed_mouth, shirt, looking_at_another, medium_hair, brown_hair, close-up, from_side, sweater, upper_body, yuri |
| 2 | 8 |  |  |  |  |  | 1girl, black_cardigan, closed_mouth, open_cardigan, white_shirt, blush, indoors, large_breasts, medium_hair, school_uniform, collared_shirt, frown, upper_body, v-shaped_eyebrows, eyebrows_hidden_by_hair, sweatdrop, 1boy, blurry, collarbone, solo_focus |
| 3 | 5 |  |  |  |  |  | 1girl, black_cardigan, blush, closed_mouth, from_side, school_uniform, solo, white_shirt, green_skirt, large_breasts, open_cardigan, profile, sleeves_past_wrists, sweatdrop |
| 4 | 13 |  |  |  |  |  | 1girl, black_cardigan, white_shirt, blush, solo, collared_shirt, looking_at_viewer, medium_hair, eyebrows_hidden_by_hair, open_cardigan, v-shaped_eyebrows, parody, closed_mouth, frown, open_mouth, school_uniform, sweatdrop |
| 5 | 5 |  |  |  |  |  | 1girl, blush, closed_eyes, closed_mouth, collared_shirt, medium_hair, solo, white_shirt, indoors, parody, portrait, black_cardigan, blurry_background, facing_viewer, school_uniform, close-up, frown, sweatdrop |
| 6 | 5 |  |  |  |  |  | 1girl, collared_shirt, green_skirt, open_cardigan, pleated_skirt, school_uniform, solo, very_long_hair, white_shirt, white_thighhighs, zettai_ryouiki, black_cardigan, blush, closed_eyes, closed_mouth, dress_shirt, indoors, long_sleeves, bag, large_breasts, standing, frown, sitting, thighs, v-shaped_eyebrows |
| 7 | 5 |  |  |  |  |  | 1girl, blush, closed_mouth, solo, white_shirt, indoors, collared_shirt, frown, looking_at_viewer, large_breasts, medium_hair, upper_body |
| 8 | 10 |  |  |  |  |  | 1girl, blush, closed_mouth, solo, from_side, frown, profile, portrait, close-up |
| 9 | 10 |  |  |  |  |  | 1girl, solo, blush, open_mouth, portrait, looking_at_viewer, close-up, parody, teeth, v-shaped_eyebrows |
| 10 | 6 |  |  |  |  |  | 1girl, close-up, solo, blurry_background, blush, indoors, looking_at_viewer, closed_mouth, eyebrows_hidden_by_hair, portrait, white_shirt |
| 11 | 18 |  |  |  |  |  | 1girl, blush, white_shirt, forest, outdoors, night, frills, very_long_hair, large_breasts, skirt, solo, smile, long_sleeves, tree, from_side |
| 12 | 5 |  |  |  |  |  | 1girl, blush, frown, ponytail, sidelocks, alternate_hairstyle, black_shirt, closed_mouth, collarbone, eyebrows_hidden_by_hair, looking_at_viewer, purple_eyes, straight_hair, upper_body, v-shaped_eyebrows, solo |
| 13 | 5 |  |  |  |  |  | 1girl, blush, closed_mouth, large_breasts, smile, sweater, upper_body, arms_under_breasts, crossed_arms, nail_polish, solo, apron, dress, shirt |
| 14 | 16 |  |  |  |  |  | 1girl, blush, 1boy, large_breasts, naked_towel, cleavage, collarbone, white_shirt, open_mouth, black_hair, looking_at_another, very_long_hair, hetero, nail_polish |
| 15 | 11 |  |  |  |  |  | 1girl, purple_kimono, yukata, blush, obi, solo_focus, holding, open_mouth, purple_eyes, outdoors, 1boy, closed_mouth, multiple_girls, night, smile, summer_festival, very_long_hair, wide_sleeves |
| 16 | 7 |  |  |  |  |  | 1girl, indoors, track_jacket, track_pants, sitting, open_mouth, red_pants, red_track_suit, from_side, sandals, red_jacket |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 2girls | sisters | blush | open_mouth | solo_focus | closed_mouth | shirt | looking_at_another | medium_hair | brown_hair | close-up | from_side | sweater | upper_body | yuri | 1girl | black_cardigan | open_cardigan | white_shirt | indoors | large_breasts | school_uniform | collared_shirt | frown | v-shaped_eyebrows | eyebrows_hidden_by_hair | sweatdrop | 1boy | blurry | collarbone | solo | green_skirt | profile | sleeves_past_wrists | looking_at_viewer | parody | closed_eyes | portrait | blurry_background | facing_viewer | pleated_skirt | very_long_hair | white_thighhighs | zettai_ryouiki | dress_shirt | long_sleeves | bag | standing | sitting | thighs | teeth | forest | outdoors | night | frills | skirt | smile | tree | ponytail | sidelocks | alternate_hairstyle | black_shirt | purple_eyes | straight_hair | arms_under_breasts | crossed_arms | nail_polish | apron | dress | naked_towel | cleavage | black_hair | hetero | purple_kimono | yukata | obi | holding | multiple_girls | summer_festival | wide_sleeves | track_jacket | track_pants | red_pants | red_track_suit | sandals | red_jacket |
|----:|----------:|:----------------------------------|:----------------------------------|:----------------------------------|:----------------------------------|:----------------------------------|:---------|:----------|:--------|:-------------|:-------------|:---------------|:--------|:---------------------|:--------------|:-------------|:-----------|:------------|:----------|:-------------|:-------|:--------|:-----------------|:----------------|:--------------|:----------|:----------------|:-----------------|:-----------------|:--------|:--------------------|:--------------------------|:------------|:-------|:---------|:-------------|:-------|:--------------|:----------|:----------------------|:--------------------|:---------|:--------------|:-----------|:--------------------|:----------------|:----------------|:-----------------|:-------------------|:-----------------|:--------------|:---------------|:------|:-----------|:----------|:---------|:--------|:---------|:-----------|:--------|:---------|:--------|:--------|:-------|:-----------|:------------|:----------------------|:--------------|:--------------|:----------------|:---------------------|:---------------|:--------------|:--------|:--------|:--------------|:-----------|:-------------|:---------|:----------------|:---------|:------|:----------|:-----------------|:------------------|:---------------|:---------------|:--------------|:------------|:-----------------|:----------|:-------------|
| 0 | 6 |  |  |  |  |  | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 1 | 6 |  |  |  |  |  | X | X | X | | | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 2 | 8 |  |  |  |  |  | | | X | | X | X | | | X | | | | | X | | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 3 | 5 |  |  |  |  |  | | | X | | | X | | | | | | X | | | | X | X | X | X | | X | X | | | | | X | | | | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 4 | 13 |  |  |  |  |  | | | X | X | | X | | | X | | | | | | | X | X | X | X | | | X | X | X | X | X | X | | | | X | | | | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 5 | 5 |  |  |  |  |  | | | X | | | X | | | X | | X | | | | | X | X | | X | X | | X | X | X | | | X | | | | X | | | | | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 6 | 5 |  |  |  |  |  | | | X | | | X | | | | | | | | | | X | X | X | X | X | X | X | X | X | X | | | | | | X | X | | | | | X | | | | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 7 | 5 |  |  |  |  |  | | | X | | | X | | | X | | | | | X | | X | | | X | X | X | | X | X | | | | | | | X | | | | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 8 | 10 |  |  |  |  |  | | | X | | | X | | | | | X | X | | | | X | | | | | | | | X | | | | | | | X | | X | | | | | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 9 | 10 |  |  |  |  |  | | | X | X | | | | | | | X | | | | | X | | | | | | | | | X | | | | | | X | | | | X | X | | X | | | | | | | | | | | | | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 10 | 6 |  |  |  |  |  | | | X | | | X | | | | | X | | | | | X | | | X | X | | | | | | X | | | | | X | | | | X | | | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 11 | 18 |  |  |  |  |  | | | X | | | | | | | | | X | | | | X | | | X | | X | | | | | | | | | | X | | | | | | | | | | | X | | | | X | | | | | | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 12 | 5 |  |  |  |  |  | | | X | | | X | | | | | | | | X | | X | | | | | | | | X | X | X | | | | X | X | | | | X | | | | | | | | | | | | | | | | | | | | | | | | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | |
| 13 | 5 |  |  |  |  |  | | | X | | | X | X | | | | | | X | X | | X | | | | | X | | | | | | | | | | X | | | | | | | | | | | | | | | | | | | | | | | | | | X | | | | | | | | X | X | X | X | X | | | | | | | | | | | | | | | | | |
| 14 | 16 |  |  |  |  |  | | | X | X | | | | X | | | | | | | | X | | | X | | X | | | | | | | X | | X | | | | | | | | | | | | X | | | | | | | | | | | | | | | | | | | | | | | | | X | | | X | X | X | X | | | | | | | | | | | | | |
| 15 | 11 |  |  |  |  |  | | | X | X | X | X | | | | | | | | | | X | | | | | | | | | | | | X | | | | | | | | | | | | | | X | | | | | | | | | | | X | X | | | X | | | | | | X | | | | | | | | | | | X | X | X | X | X | X | X | | | | | | |
| 16 | 7 |  |  |  |  |  | | | | X | | | | | | | | X | | | | X | | | | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | X | X | X | X | X | X |
|
NiGuLa/Russian_Sensitive_Topics | ---
language:
- ru
tags:
- toxic comments classification
license: cc
task_categories:
- text-classification
size_categories:
- 10K<n<100K
---
## General concept of the model
Sensitive topics are topics with a high chance of initiating a toxic conversation: homophobia, politics, racism, etc. This dataset covers 18 such topics.
More details can be found [in this article ](https://www.aclweb.org/anthology/2021.bsnlp-1.4/) presented at the workshop for Balto-Slavic NLP at the EACL-2021 conference.
That paper presents the first version of this dataset. This repository hosts the latest version, which is significantly larger and properly filtered.
## Licensing Information
[Creative Commons Attribution-NonCommercial-ShareAlike 4.0 International License][cc-by-nc-sa].
[![CC BY-NC-SA 4.0][cc-by-nc-sa-image]][cc-by-nc-sa]
[cc-by-nc-sa]: http://creativecommons.org/licenses/by-nc-sa/4.0/
[cc-by-nc-sa-image]: https://i.creativecommons.org/l/by-nc-sa/4.0/88x31.png
## Citation
If you find this repository helpful, feel free to cite our publication:
```
@inproceedings{babakov-etal-2021-detecting,
title = "Detecting Inappropriate Messages on Sensitive Topics that Could Harm a Company{'}s Reputation",
author = "Babakov, Nikolay and
Logacheva, Varvara and
Kozlova, Olga and
Semenov, Nikita and
Panchenko, Alexander",
booktitle = "Proceedings of the 8th Workshop on Balto-Slavic Natural Language Processing",
month = apr,
year = "2021",
address = "Kiyv, Ukraine",
publisher = "Association for Computational Linguistics",
url = "https://www.aclweb.org/anthology/2021.bsnlp-1.4",
pages = "26--36",
abstract = "Not all topics are equally {``}flammable{''} in terms of toxicity: a calm discussion of turtles or fishing less often fuels inappropriate toxic dialogues than a discussion of politics or sexual minorities. We define a set of sensitive topics that can yield inappropriate and toxic messages and describe the methodology of collecting and labelling a dataset for appropriateness. While toxicity in user-generated data is well-studied, we aim at defining a more fine-grained notion of inappropriateness. The core of inappropriateness is that it can harm the reputation of a speaker. This is different from toxicity in two respects: (i) inappropriateness is topic-related, and (ii) inappropriate message is not toxic but still unacceptable. We collect and release two datasets for Russian: a topic-labelled dataset and an appropriateness-labelled dataset. We also release pre-trained classification models trained on this data.",
}
``` |
PrimeSage/111 | ---
license: other
---
|
345rf4gt56t4r3e3/lstm_crypto_dataset | ---
license: mit
tags:
- cryptocurrency
- finance
- parquet
- data
pretty_name: Dataset for training complex LSTM models.
size_categories:
- 1M<n<10M
---

This is a dataset where we try to put a lot of data into an LSTM and see what we get. |
DavidVivancos/MindBigData2022_MNIST_MW | ---
license: odbl
---
|
freshpearYoon/vr_train_free_15 | ---
dataset_info:
features:
- name: audio
struct:
- name: array
sequence: float64
- name: path
dtype: string
- name: sampling_rate
dtype: int64
- name: filename
dtype: string
- name: NumOfUtterance
dtype: int64
- name: text
dtype: string
- name: samplingrate
dtype: int64
- name: begin_time
dtype: float64
- name: end_time
dtype: float64
- name: speaker_id
dtype: string
- name: directory
dtype: string
splits:
- name: train
num_bytes: 5811472140
num_examples: 10000
download_size: 915796936
dataset_size: 5811472140
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
mikeg2/ashg5 | ---
license: openrail
---
|
gimmaru/super_glue-cb | ---
dataset_info:
features:
- name: premise
dtype: string
- name: hypothesis
dtype: string
- name: idx
dtype: int32
- name: label
dtype:
class_label:
names:
'0': entailment
'1': contradiction
'2': neutral
splits:
- name: validation
num_bytes: 21851
num_examples: 56
download_size: 0
dataset_size: 21851
---
# Dataset Card for "super_glue-cb"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
Note: This dataset was utilized for the evaluation of probability-based prompt selection techniques in the paper '[Improving Probability-based Prompt Selection Through Unified Evaluation and Analysis](https://arxiv.org/abs/2305.14877)'. It differs from the actual benchmark dataset. |
hassansh/Llama-2-7b-hf | ---
dataset_info:
features:
- name: subject
dtype: string
- name: accuracy
dtype: float64
- name: accuracy_abcd
dtype: float64
- name: cross_entropy
dtype: float64
- name: abcd_avg_probs
sequence: float64
- name: abcd_std_probs
sequence: float64
- name: num_qs
dtype: int64
- name: time
dtype: float64
splits:
- name: test
num_bytes: 11242
num_examples: 57
download_size: 13780
dataset_size: 11242
configs:
- config_name: default
data_files:
- split: test
path: data/test-*
---
|
xwjzds/ag_newskeywords | ---
dataset_info:
features:
- name: keyword
dtype: string
- name: score
dtype: float64
splits:
- name: train
num_bytes: 31466
num_examples: 1760
download_size: 31546
dataset_size: 31466
---
# Dataset Card for "ag_newskeywords"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
ACINTIAJULIANA/Cintia | ---
license: openrail
---
|
nbalepur/UnifiedMCQA_irrelevant | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: eval
path: data/eval-*
- split: test
path: data/test-*
dataset_info:
features:
- name: question
dtype: string
- name: choices
sequence: string
- name: answer_letter
dtype: string
- name: dataset
dtype: string
- name: question_type
dtype: string
splits:
- name: train
num_bytes: 28545264.798323218
num_examples: 132248
- name: eval
num_bytes: 3172943.2016767836
num_examples: 14700
- name: test
num_bytes: 3469035.0
num_examples: 15222
download_size: 16521371
dataset_size: 35187243.0
---
# Dataset Card for "UnifiedMCQA_irrelevant"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
deepghs/anime_ch_hair_length | ---
license: mit
task_categories:
- image-classification
tags:
- art
size_categories:
- 10K<n<100K
--- |
threadberry/mini-platypus-two | ---
dataset_info:
features:
- name: instruction
dtype: string
- name: output
dtype: string
splits:
- name: train
num_bytes: 4186564
num_examples: 1000
download_size: 2245921
dataset_size: 4186564
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
open-llm-leaderboard/details_Severian__Nexus-IKM-Mistral-7B-v5-instruction | ---
pretty_name: Evaluation run of Severian/Nexus-IKM-Mistral-7B-v5-instruction
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [Severian/Nexus-IKM-Mistral-7B-v5-instruction](https://huggingface.co/Severian/Nexus-IKM-Mistral-7B-v5-instruction)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Severian__Nexus-IKM-Mistral-7B-v5-instruction\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-03-10T00:59:27.972031](https://huggingface.co/datasets/open-llm-leaderboard/details_Severian__Nexus-IKM-Mistral-7B-v5-instruction/blob/main/results_2024-03-10T00-59-27.972031.json) (note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.2477168147645942,\n\
\ \"acc_stderr\": 0.030566707099033714,\n \"acc_norm\": 0.24811298552173527,\n\
\ \"acc_norm_stderr\": 0.031378435870979805,\n \"mc1\": 0.23623011015911874,\n\
\ \"mc1_stderr\": 0.014869755015871096,\n \"mc2\": NaN,\n \"\
mc2_stderr\": NaN\n },\n \"harness|arc:challenge|25\": {\n \"acc\"\
: 0.2363481228668942,\n \"acc_stderr\": 0.012414960524301836,\n \"\
acc_norm\": 0.2773037542662116,\n \"acc_norm_stderr\": 0.013082095839059374\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.2671778530173272,\n\
\ \"acc_stderr\": 0.004415816696303075,\n \"acc_norm\": 0.2892850029874527,\n\
\ \"acc_norm_stderr\": 0.004525037849178834\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.22,\n \"acc_stderr\": 0.0416333199893227,\n \
\ \"acc_norm\": 0.22,\n \"acc_norm_stderr\": 0.0416333199893227\n },\n\
\ \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.22962962962962963,\n\
\ \"acc_stderr\": 0.036333844140734636,\n \"acc_norm\": 0.22962962962962963,\n\
\ \"acc_norm_stderr\": 0.036333844140734636\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.24342105263157895,\n \"acc_stderr\": 0.034923496688842384,\n\
\ \"acc_norm\": 0.24342105263157895,\n \"acc_norm_stderr\": 0.034923496688842384\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.21,\n\
\ \"acc_stderr\": 0.040936018074033256,\n \"acc_norm\": 0.21,\n \
\ \"acc_norm_stderr\": 0.040936018074033256\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.27169811320754716,\n \"acc_stderr\": 0.027377706624670716,\n\
\ \"acc_norm\": 0.27169811320754716,\n \"acc_norm_stderr\": 0.027377706624670716\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.2152777777777778,\n\
\ \"acc_stderr\": 0.03437079344106133,\n \"acc_norm\": 0.2152777777777778,\n\
\ \"acc_norm_stderr\": 0.03437079344106133\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \
\ \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.21,\n \"acc_stderr\": 0.04093601807403326,\n \"acc_norm\": 0.21,\n\
\ \"acc_norm_stderr\": 0.04093601807403326\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.24,\n \"acc_stderr\": 0.042923469599092816,\n \
\ \"acc_norm\": 0.24,\n \"acc_norm_stderr\": 0.042923469599092816\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.2658959537572254,\n\
\ \"acc_stderr\": 0.03368762932259431,\n \"acc_norm\": 0.2658959537572254,\n\
\ \"acc_norm_stderr\": 0.03368762932259431\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.28431372549019607,\n \"acc_stderr\": 0.04488482852329017,\n\
\ \"acc_norm\": 0.28431372549019607,\n \"acc_norm_stderr\": 0.04488482852329017\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.26,\n \"acc_stderr\": 0.0440844002276808,\n \"acc_norm\": 0.26,\n\
\ \"acc_norm_stderr\": 0.0440844002276808\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.24680851063829787,\n \"acc_stderr\": 0.0281854413012341,\n\
\ \"acc_norm\": 0.24680851063829787,\n \"acc_norm_stderr\": 0.0281854413012341\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.2631578947368421,\n\
\ \"acc_stderr\": 0.04142439719489362,\n \"acc_norm\": 0.2631578947368421,\n\
\ \"acc_norm_stderr\": 0.04142439719489362\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.2827586206896552,\n \"acc_stderr\": 0.03752833958003337,\n\
\ \"acc_norm\": 0.2827586206896552,\n \"acc_norm_stderr\": 0.03752833958003337\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.23809523809523808,\n \"acc_stderr\": 0.021935878081184756,\n \"\
acc_norm\": 0.23809523809523808,\n \"acc_norm_stderr\": 0.021935878081184756\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.30952380952380953,\n\
\ \"acc_stderr\": 0.04134913018303316,\n \"acc_norm\": 0.30952380952380953,\n\
\ \"acc_norm_stderr\": 0.04134913018303316\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.26,\n \"acc_stderr\": 0.0440844002276808,\n \
\ \"acc_norm\": 0.26,\n \"acc_norm_stderr\": 0.0440844002276808\n },\n\
\ \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.2838709677419355,\n\
\ \"acc_stderr\": 0.025649381063029254,\n \"acc_norm\": 0.2838709677419355,\n\
\ \"acc_norm_stderr\": 0.025649381063029254\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.2512315270935961,\n \"acc_stderr\": 0.030516530732694436,\n\
\ \"acc_norm\": 0.2512315270935961,\n \"acc_norm_stderr\": 0.030516530732694436\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.22,\n \"acc_stderr\": 0.04163331998932269,\n \"acc_norm\"\
: 0.22,\n \"acc_norm_stderr\": 0.04163331998932269\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.24848484848484848,\n \"acc_stderr\": 0.03374402644139404,\n\
\ \"acc_norm\": 0.24848484848484848,\n \"acc_norm_stderr\": 0.03374402644139404\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.25757575757575757,\n \"acc_stderr\": 0.031156269519646836,\n \"\
acc_norm\": 0.25757575757575757,\n \"acc_norm_stderr\": 0.031156269519646836\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.21243523316062177,\n \"acc_stderr\": 0.029519282616817258,\n\
\ \"acc_norm\": 0.21243523316062177,\n \"acc_norm_stderr\": 0.029519282616817258\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.31794871794871793,\n \"acc_stderr\": 0.02361088430892786,\n\
\ \"acc_norm\": 0.31794871794871793,\n \"acc_norm_stderr\": 0.02361088430892786\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.2740740740740741,\n \"acc_stderr\": 0.027195934804085622,\n \
\ \"acc_norm\": 0.2740740740740741,\n \"acc_norm_stderr\": 0.027195934804085622\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.33613445378151263,\n \"acc_stderr\": 0.030684737115135356,\n\
\ \"acc_norm\": 0.33613445378151263,\n \"acc_norm_stderr\": 0.030684737115135356\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.2781456953642384,\n \"acc_stderr\": 0.03658603262763744,\n \"\
acc_norm\": 0.2781456953642384,\n \"acc_norm_stderr\": 0.03658603262763744\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.26788990825688075,\n \"acc_stderr\": 0.018987462257978652,\n \"\
acc_norm\": 0.26788990825688075,\n \"acc_norm_stderr\": 0.018987462257978652\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.2175925925925926,\n \"acc_stderr\": 0.028139689444859672,\n \"\
acc_norm\": 0.2175925925925926,\n \"acc_norm_stderr\": 0.028139689444859672\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.24019607843137256,\n \"acc_stderr\": 0.02998373305591362,\n \"\
acc_norm\": 0.24019607843137256,\n \"acc_norm_stderr\": 0.02998373305591362\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.20675105485232068,\n \"acc_stderr\": 0.0263616516683891,\n \
\ \"acc_norm\": 0.20675105485232068,\n \"acc_norm_stderr\": 0.0263616516683891\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.13901345291479822,\n\
\ \"acc_stderr\": 0.02321935283447447,\n \"acc_norm\": 0.13901345291479822,\n\
\ \"acc_norm_stderr\": 0.02321935283447447\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.2595419847328244,\n \"acc_stderr\": 0.03844876139785271,\n\
\ \"acc_norm\": 0.2595419847328244,\n \"acc_norm_stderr\": 0.03844876139785271\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.18181818181818182,\n \"acc_stderr\": 0.03520893951097653,\n \"\
acc_norm\": 0.18181818181818182,\n \"acc_norm_stderr\": 0.03520893951097653\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.25,\n\
\ \"acc_stderr\": 0.04186091791394607,\n \"acc_norm\": 0.25,\n \
\ \"acc_norm_stderr\": 0.04186091791394607\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.27607361963190186,\n \"acc_stderr\": 0.0351238528370505,\n\
\ \"acc_norm\": 0.27607361963190186,\n \"acc_norm_stderr\": 0.0351238528370505\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.24107142857142858,\n\
\ \"acc_stderr\": 0.04059867246952685,\n \"acc_norm\": 0.24107142857142858,\n\
\ \"acc_norm_stderr\": 0.04059867246952685\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.30097087378640774,\n \"acc_stderr\": 0.04541609446503948,\n\
\ \"acc_norm\": 0.30097087378640774,\n \"acc_norm_stderr\": 0.04541609446503948\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.18376068376068377,\n\
\ \"acc_stderr\": 0.025372139671722933,\n \"acc_norm\": 0.18376068376068377,\n\
\ \"acc_norm_stderr\": 0.025372139671722933\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.24,\n \"acc_stderr\": 0.042923469599092816,\n \
\ \"acc_norm\": 0.24,\n \"acc_norm_stderr\": 0.042923469599092816\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.22349936143039592,\n\
\ \"acc_stderr\": 0.01489723522945071,\n \"acc_norm\": 0.22349936143039592,\n\
\ \"acc_norm_stderr\": 0.01489723522945071\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.2514450867052023,\n \"acc_stderr\": 0.023357365785874037,\n\
\ \"acc_norm\": 0.2514450867052023,\n \"acc_norm_stderr\": 0.023357365785874037\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.2536312849162011,\n\
\ \"acc_stderr\": 0.014551553659369923,\n \"acc_norm\": 0.2536312849162011,\n\
\ \"acc_norm_stderr\": 0.014551553659369923\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.2549019607843137,\n \"acc_stderr\": 0.02495418432487991,\n\
\ \"acc_norm\": 0.2549019607843137,\n \"acc_norm_stderr\": 0.02495418432487991\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.2282958199356913,\n\
\ \"acc_stderr\": 0.023839303311398215,\n \"acc_norm\": 0.2282958199356913,\n\
\ \"acc_norm_stderr\": 0.023839303311398215\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.25308641975308643,\n \"acc_stderr\": 0.024191808600712995,\n\
\ \"acc_norm\": 0.25308641975308643,\n \"acc_norm_stderr\": 0.024191808600712995\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.2375886524822695,\n \"acc_stderr\": 0.025389512552729906,\n \
\ \"acc_norm\": 0.2375886524822695,\n \"acc_norm_stderr\": 0.025389512552729906\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.24119947848761408,\n\
\ \"acc_stderr\": 0.01092649610203496,\n \"acc_norm\": 0.24119947848761408,\n\
\ \"acc_norm_stderr\": 0.01092649610203496\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.29411764705882354,\n \"acc_stderr\": 0.027678468642144703,\n\
\ \"acc_norm\": 0.29411764705882354,\n \"acc_norm_stderr\": 0.027678468642144703\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.22712418300653595,\n \"acc_stderr\": 0.016949853279212376,\n \
\ \"acc_norm\": 0.22712418300653595,\n \"acc_norm_stderr\": 0.016949853279212376\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.24545454545454545,\n\
\ \"acc_stderr\": 0.041220665028782855,\n \"acc_norm\": 0.24545454545454545,\n\
\ \"acc_norm_stderr\": 0.041220665028782855\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.2857142857142857,\n \"acc_stderr\": 0.02892058322067558,\n\
\ \"acc_norm\": 0.2857142857142857,\n \"acc_norm_stderr\": 0.02892058322067558\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.14427860696517414,\n\
\ \"acc_stderr\": 0.024845753212306042,\n \"acc_norm\": 0.14427860696517414,\n\
\ \"acc_norm_stderr\": 0.024845753212306042\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.27,\n \"acc_stderr\": 0.04461960433384739,\n \
\ \"acc_norm\": 0.27,\n \"acc_norm_stderr\": 0.04461960433384739\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.2469879518072289,\n\
\ \"acc_stderr\": 0.03357351982064537,\n \"acc_norm\": 0.2469879518072289,\n\
\ \"acc_norm_stderr\": 0.03357351982064537\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.1695906432748538,\n \"acc_stderr\": 0.028782108105401712,\n\
\ \"acc_norm\": 0.1695906432748538,\n \"acc_norm_stderr\": 0.028782108105401712\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.23623011015911874,\n\
\ \"mc1_stderr\": 0.014869755015871096,\n \"mc2\": NaN,\n \"\
mc2_stderr\": NaN\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.5351223362273086,\n\
\ \"acc_stderr\": 0.014017773120881583\n },\n \"harness|gsm8k|5\":\
\ {\n \"acc\": 0.0,\n \"acc_stderr\": 0.0\n }\n}\n```"
repo_url: https://huggingface.co/Severian/Nexus-IKM-Mistral-7B-v5-instruction
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_03_10T00_59_27.972031
path:
- '**/details_harness|arc:challenge|25_2024-03-10T00-59-27.972031.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-03-10T00-59-27.972031.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_03_10T00_59_27.972031
path:
- '**/details_harness|gsm8k|5_2024-03-10T00-59-27.972031.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-03-10T00-59-27.972031.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_03_10T00_59_27.972031
path:
- '**/details_harness|hellaswag|10_2024-03-10T00-59-27.972031.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-03-10T00-59-27.972031.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_03_10T00_59_27.972031
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-10T00-59-27.972031.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-10T00-59-27.972031.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-10T00-59-27.972031.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-10T00-59-27.972031.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-10T00-59-27.972031.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-10T00-59-27.972031.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-10T00-59-27.972031.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-10T00-59-27.972031.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-10T00-59-27.972031.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-10T00-59-27.972031.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-10T00-59-27.972031.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-10T00-59-27.972031.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-10T00-59-27.972031.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-10T00-59-27.972031.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-10T00-59-27.972031.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-10T00-59-27.972031.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-10T00-59-27.972031.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-10T00-59-27.972031.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-10T00-59-27.972031.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-10T00-59-27.972031.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-10T00-59-27.972031.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-10T00-59-27.972031.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-10T00-59-27.972031.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-10T00-59-27.972031.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-10T00-59-27.972031.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-10T00-59-27.972031.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-10T00-59-27.972031.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-10T00-59-27.972031.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-10T00-59-27.972031.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-10T00-59-27.972031.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-10T00-59-27.972031.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-10T00-59-27.972031.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-10T00-59-27.972031.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-10T00-59-27.972031.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-10T00-59-27.972031.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-10T00-59-27.972031.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-10T00-59-27.972031.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-10T00-59-27.972031.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-10T00-59-27.972031.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-10T00-59-27.972031.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-10T00-59-27.972031.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-10T00-59-27.972031.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-10T00-59-27.972031.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-10T00-59-27.972031.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-10T00-59-27.972031.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-10T00-59-27.972031.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-10T00-59-27.972031.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-10T00-59-27.972031.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-10T00-59-27.972031.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-10T00-59-27.972031.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-10T00-59-27.972031.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-10T00-59-27.972031.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-10T00-59-27.972031.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-10T00-59-27.972031.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-10T00-59-27.972031.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-10T00-59-27.972031.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-10T00-59-27.972031.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-10T00-59-27.972031.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-10T00-59-27.972031.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-10T00-59-27.972031.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-10T00-59-27.972031.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-10T00-59-27.972031.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-10T00-59-27.972031.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-10T00-59-27.972031.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-10T00-59-27.972031.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-10T00-59-27.972031.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-10T00-59-27.972031.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-10T00-59-27.972031.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-10T00-59-27.972031.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-10T00-59-27.972031.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-10T00-59-27.972031.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-10T00-59-27.972031.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-10T00-59-27.972031.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-10T00-59-27.972031.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-10T00-59-27.972031.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-10T00-59-27.972031.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-10T00-59-27.972031.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-10T00-59-27.972031.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-10T00-59-27.972031.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-10T00-59-27.972031.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-10T00-59-27.972031.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-10T00-59-27.972031.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-10T00-59-27.972031.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-10T00-59-27.972031.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-10T00-59-27.972031.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-10T00-59-27.972031.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-10T00-59-27.972031.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-10T00-59-27.972031.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-10T00-59-27.972031.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-10T00-59-27.972031.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-10T00-59-27.972031.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-10T00-59-27.972031.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-10T00-59-27.972031.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-10T00-59-27.972031.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-10T00-59-27.972031.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-10T00-59-27.972031.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-10T00-59-27.972031.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-10T00-59-27.972031.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-10T00-59-27.972031.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-10T00-59-27.972031.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-10T00-59-27.972031.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-10T00-59-27.972031.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-10T00-59-27.972031.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-10T00-59-27.972031.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-10T00-59-27.972031.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-10T00-59-27.972031.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-10T00-59-27.972031.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-10T00-59-27.972031.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-10T00-59-27.972031.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-10T00-59-27.972031.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-10T00-59-27.972031.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-10T00-59-27.972031.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-10T00-59-27.972031.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-10T00-59-27.972031.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_03_10T00_59_27.972031
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-10T00-59-27.972031.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-10T00-59-27.972031.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_03_10T00_59_27.972031
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-10T00-59-27.972031.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-10T00-59-27.972031.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_03_10T00_59_27.972031
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-10T00-59-27.972031.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-10T00-59-27.972031.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_03_10T00_59_27.972031
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-10T00-59-27.972031.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-10T00-59-27.972031.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_03_10T00_59_27.972031
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-10T00-59-27.972031.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-10T00-59-27.972031.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_03_10T00_59_27.972031
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-10T00-59-27.972031.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-10T00-59-27.972031.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_03_10T00_59_27.972031
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-10T00-59-27.972031.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-10T00-59-27.972031.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_03_10T00_59_27.972031
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-10T00-59-27.972031.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-10T00-59-27.972031.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_03_10T00_59_27.972031
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-10T00-59-27.972031.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-10T00-59-27.972031.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_03_10T00_59_27.972031
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-10T00-59-27.972031.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-10T00-59-27.972031.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_03_10T00_59_27.972031
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-10T00-59-27.972031.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-10T00-59-27.972031.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_03_10T00_59_27.972031
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-10T00-59-27.972031.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-10T00-59-27.972031.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_03_10T00_59_27.972031
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-10T00-59-27.972031.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-10T00-59-27.972031.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_03_10T00_59_27.972031
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-10T00-59-27.972031.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-10T00-59-27.972031.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_03_10T00_59_27.972031
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-10T00-59-27.972031.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-10T00-59-27.972031.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_03_10T00_59_27.972031
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-10T00-59-27.972031.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-10T00-59-27.972031.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_03_10T00_59_27.972031
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-10T00-59-27.972031.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-10T00-59-27.972031.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_03_10T00_59_27.972031
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-10T00-59-27.972031.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-10T00-59-27.972031.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_03_10T00_59_27.972031
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-10T00-59-27.972031.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-10T00-59-27.972031.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_03_10T00_59_27.972031
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-10T00-59-27.972031.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-10T00-59-27.972031.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_03_10T00_59_27.972031
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-10T00-59-27.972031.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-10T00-59-27.972031.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_03_10T00_59_27.972031
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-10T00-59-27.972031.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-10T00-59-27.972031.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_03_10T00_59_27.972031
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-10T00-59-27.972031.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-10T00-59-27.972031.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_03_10T00_59_27.972031
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-10T00-59-27.972031.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-10T00-59-27.972031.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_03_10T00_59_27.972031
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-10T00-59-27.972031.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-10T00-59-27.972031.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_03_10T00_59_27.972031
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-10T00-59-27.972031.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-10T00-59-27.972031.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_03_10T00_59_27.972031
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-10T00-59-27.972031.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-10T00-59-27.972031.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_03_10T00_59_27.972031
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-10T00-59-27.972031.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-10T00-59-27.972031.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_03_10T00_59_27.972031
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-10T00-59-27.972031.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-10T00-59-27.972031.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_03_10T00_59_27.972031
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-10T00-59-27.972031.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-10T00-59-27.972031.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_03_10T00_59_27.972031
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-10T00-59-27.972031.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-10T00-59-27.972031.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_03_10T00_59_27.972031
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-10T00-59-27.972031.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-10T00-59-27.972031.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_03_10T00_59_27.972031
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-10T00-59-27.972031.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-10T00-59-27.972031.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_03_10T00_59_27.972031
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-10T00-59-27.972031.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-10T00-59-27.972031.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_03_10T00_59_27.972031
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-10T00-59-27.972031.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-10T00-59-27.972031.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_03_10T00_59_27.972031
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-10T00-59-27.972031.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-10T00-59-27.972031.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_03_10T00_59_27.972031
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-10T00-59-27.972031.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-10T00-59-27.972031.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_03_10T00_59_27.972031
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-10T00-59-27.972031.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-10T00-59-27.972031.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_03_10T00_59_27.972031
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-10T00-59-27.972031.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-10T00-59-27.972031.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_03_10T00_59_27.972031
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-10T00-59-27.972031.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-10T00-59-27.972031.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_03_10T00_59_27.972031
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-10T00-59-27.972031.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-10T00-59-27.972031.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_03_10T00_59_27.972031
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-10T00-59-27.972031.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-10T00-59-27.972031.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_03_10T00_59_27.972031
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-10T00-59-27.972031.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-10T00-59-27.972031.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_03_10T00_59_27.972031
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-10T00-59-27.972031.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-10T00-59-27.972031.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_03_10T00_59_27.972031
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-10T00-59-27.972031.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-10T00-59-27.972031.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_03_10T00_59_27.972031
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-10T00-59-27.972031.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-10T00-59-27.972031.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_03_10T00_59_27.972031
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-10T00-59-27.972031.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-10T00-59-27.972031.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_03_10T00_59_27.972031
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-10T00-59-27.972031.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-10T00-59-27.972031.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_03_10T00_59_27.972031
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-10T00-59-27.972031.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-10T00-59-27.972031.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_03_10T00_59_27.972031
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-10T00-59-27.972031.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-10T00-59-27.972031.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_03_10T00_59_27.972031
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-10T00-59-27.972031.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-10T00-59-27.972031.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_03_10T00_59_27.972031
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-10T00-59-27.972031.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-10T00-59-27.972031.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_03_10T00_59_27.972031
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-10T00-59-27.972031.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-10T00-59-27.972031.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_03_10T00_59_27.972031
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-10T00-59-27.972031.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-10T00-59-27.972031.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_03_10T00_59_27.972031
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-10T00-59-27.972031.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-10T00-59-27.972031.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_03_10T00_59_27.972031
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-10T00-59-27.972031.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-10T00-59-27.972031.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_03_10T00_59_27.972031
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-10T00-59-27.972031.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-10T00-59-27.972031.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_03_10T00_59_27.972031
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-10T00-59-27.972031.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-10T00-59-27.972031.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_03_10T00_59_27.972031
path:
- '**/details_harness|winogrande|5_2024-03-10T00-59-27.972031.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-03-10T00-59-27.972031.parquet'
- config_name: results
data_files:
- split: 2024_03_10T00_59_27.972031
path:
- results_2024-03-10T00-59-27.972031.parquet
- split: latest
path:
- results_2024-03-10T00-59-27.972031.parquet
---
# Dataset Card for Evaluation run of Severian/Nexus-IKM-Mistral-7B-v5-instruction
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [Severian/Nexus-IKM-Mistral-7B-v5-instruction](https://huggingface.co/Severian/Nexus-IKM-Mistral-7B-v5-instruction) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the most recent results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_Severian__Nexus-IKM-Mistral-7B-v5-instruction",
	"harness_winogrande_5",
	split="latest")  # splits are the run timestamp and "latest"; there is no "train" split
```
## Latest results
These are the [latest results from run 2024-03-10T00:59:27.972031](https://huggingface.co/datasets/open-llm-leaderboard/details_Severian__Nexus-IKM-Mistral-7B-v5-instruction/blob/main/results_2024-03-10T00-59-27.972031.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.2477168147645942,
"acc_stderr": 0.030566707099033714,
"acc_norm": 0.24811298552173527,
"acc_norm_stderr": 0.031378435870979805,
"mc1": 0.23623011015911874,
"mc1_stderr": 0.014869755015871096,
"mc2": NaN,
"mc2_stderr": NaN
},
"harness|arc:challenge|25": {
"acc": 0.2363481228668942,
"acc_stderr": 0.012414960524301836,
"acc_norm": 0.2773037542662116,
"acc_norm_stderr": 0.013082095839059374
},
"harness|hellaswag|10": {
"acc": 0.2671778530173272,
"acc_stderr": 0.004415816696303075,
"acc_norm": 0.2892850029874527,
"acc_norm_stderr": 0.004525037849178834
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.22,
"acc_stderr": 0.0416333199893227,
"acc_norm": 0.22,
"acc_norm_stderr": 0.0416333199893227
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.22962962962962963,
"acc_stderr": 0.036333844140734636,
"acc_norm": 0.22962962962962963,
"acc_norm_stderr": 0.036333844140734636
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.24342105263157895,
"acc_stderr": 0.034923496688842384,
"acc_norm": 0.24342105263157895,
"acc_norm_stderr": 0.034923496688842384
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.21,
"acc_stderr": 0.040936018074033256,
"acc_norm": 0.21,
"acc_norm_stderr": 0.040936018074033256
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.27169811320754716,
"acc_stderr": 0.027377706624670716,
"acc_norm": 0.27169811320754716,
"acc_norm_stderr": 0.027377706624670716
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.2152777777777778,
"acc_stderr": 0.03437079344106133,
"acc_norm": 0.2152777777777778,
"acc_norm_stderr": 0.03437079344106133
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.21,
"acc_stderr": 0.04093601807403326,
"acc_norm": 0.21,
"acc_norm_stderr": 0.04093601807403326
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.24,
"acc_stderr": 0.042923469599092816,
"acc_norm": 0.24,
"acc_norm_stderr": 0.042923469599092816
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.2658959537572254,
"acc_stderr": 0.03368762932259431,
"acc_norm": 0.2658959537572254,
"acc_norm_stderr": 0.03368762932259431
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.28431372549019607,
"acc_stderr": 0.04488482852329017,
"acc_norm": 0.28431372549019607,
"acc_norm_stderr": 0.04488482852329017
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.26,
"acc_stderr": 0.0440844002276808,
"acc_norm": 0.26,
"acc_norm_stderr": 0.0440844002276808
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.24680851063829787,
"acc_stderr": 0.0281854413012341,
"acc_norm": 0.24680851063829787,
"acc_norm_stderr": 0.0281854413012341
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.2631578947368421,
"acc_stderr": 0.04142439719489362,
"acc_norm": 0.2631578947368421,
"acc_norm_stderr": 0.04142439719489362
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.2827586206896552,
"acc_stderr": 0.03752833958003337,
"acc_norm": 0.2827586206896552,
"acc_norm_stderr": 0.03752833958003337
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.23809523809523808,
"acc_stderr": 0.021935878081184756,
"acc_norm": 0.23809523809523808,
"acc_norm_stderr": 0.021935878081184756
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.30952380952380953,
"acc_stderr": 0.04134913018303316,
"acc_norm": 0.30952380952380953,
"acc_norm_stderr": 0.04134913018303316
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.26,
"acc_stderr": 0.0440844002276808,
"acc_norm": 0.26,
"acc_norm_stderr": 0.0440844002276808
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.2838709677419355,
"acc_stderr": 0.025649381063029254,
"acc_norm": 0.2838709677419355,
"acc_norm_stderr": 0.025649381063029254
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.2512315270935961,
"acc_stderr": 0.030516530732694436,
"acc_norm": 0.2512315270935961,
"acc_norm_stderr": 0.030516530732694436
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.22,
"acc_stderr": 0.04163331998932269,
"acc_norm": 0.22,
"acc_norm_stderr": 0.04163331998932269
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.24848484848484848,
"acc_stderr": 0.03374402644139404,
"acc_norm": 0.24848484848484848,
"acc_norm_stderr": 0.03374402644139404
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.25757575757575757,
"acc_stderr": 0.031156269519646836,
"acc_norm": 0.25757575757575757,
"acc_norm_stderr": 0.031156269519646836
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.21243523316062177,
"acc_stderr": 0.029519282616817258,
"acc_norm": 0.21243523316062177,
"acc_norm_stderr": 0.029519282616817258
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.31794871794871793,
"acc_stderr": 0.02361088430892786,
"acc_norm": 0.31794871794871793,
"acc_norm_stderr": 0.02361088430892786
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.2740740740740741,
"acc_stderr": 0.027195934804085622,
"acc_norm": 0.2740740740740741,
"acc_norm_stderr": 0.027195934804085622
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.33613445378151263,
"acc_stderr": 0.030684737115135356,
"acc_norm": 0.33613445378151263,
"acc_norm_stderr": 0.030684737115135356
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.2781456953642384,
"acc_stderr": 0.03658603262763744,
"acc_norm": 0.2781456953642384,
"acc_norm_stderr": 0.03658603262763744
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.26788990825688075,
"acc_stderr": 0.018987462257978652,
"acc_norm": 0.26788990825688075,
"acc_norm_stderr": 0.018987462257978652
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.2175925925925926,
"acc_stderr": 0.028139689444859672,
"acc_norm": 0.2175925925925926,
"acc_norm_stderr": 0.028139689444859672
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.24019607843137256,
"acc_stderr": 0.02998373305591362,
"acc_norm": 0.24019607843137256,
"acc_norm_stderr": 0.02998373305591362
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.20675105485232068,
"acc_stderr": 0.0263616516683891,
"acc_norm": 0.20675105485232068,
"acc_norm_stderr": 0.0263616516683891
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.13901345291479822,
"acc_stderr": 0.02321935283447447,
"acc_norm": 0.13901345291479822,
"acc_norm_stderr": 0.02321935283447447
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.2595419847328244,
"acc_stderr": 0.03844876139785271,
"acc_norm": 0.2595419847328244,
"acc_norm_stderr": 0.03844876139785271
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.18181818181818182,
"acc_stderr": 0.03520893951097653,
"acc_norm": 0.18181818181818182,
"acc_norm_stderr": 0.03520893951097653
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.25,
"acc_stderr": 0.04186091791394607,
"acc_norm": 0.25,
"acc_norm_stderr": 0.04186091791394607
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.27607361963190186,
"acc_stderr": 0.0351238528370505,
"acc_norm": 0.27607361963190186,
"acc_norm_stderr": 0.0351238528370505
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.24107142857142858,
"acc_stderr": 0.04059867246952685,
"acc_norm": 0.24107142857142858,
"acc_norm_stderr": 0.04059867246952685
},
"harness|hendrycksTest-management|5": {
"acc": 0.30097087378640774,
"acc_stderr": 0.04541609446503948,
"acc_norm": 0.30097087378640774,
"acc_norm_stderr": 0.04541609446503948
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.18376068376068377,
"acc_stderr": 0.025372139671722933,
"acc_norm": 0.18376068376068377,
"acc_norm_stderr": 0.025372139671722933
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.24,
"acc_stderr": 0.042923469599092816,
"acc_norm": 0.24,
"acc_norm_stderr": 0.042923469599092816
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.22349936143039592,
"acc_stderr": 0.01489723522945071,
"acc_norm": 0.22349936143039592,
"acc_norm_stderr": 0.01489723522945071
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.2514450867052023,
"acc_stderr": 0.023357365785874037,
"acc_norm": 0.2514450867052023,
"acc_norm_stderr": 0.023357365785874037
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.2536312849162011,
"acc_stderr": 0.014551553659369923,
"acc_norm": 0.2536312849162011,
"acc_norm_stderr": 0.014551553659369923
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.2549019607843137,
"acc_stderr": 0.02495418432487991,
"acc_norm": 0.2549019607843137,
"acc_norm_stderr": 0.02495418432487991
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.2282958199356913,
"acc_stderr": 0.023839303311398215,
"acc_norm": 0.2282958199356913,
"acc_norm_stderr": 0.023839303311398215
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.25308641975308643,
"acc_stderr": 0.024191808600712995,
"acc_norm": 0.25308641975308643,
"acc_norm_stderr": 0.024191808600712995
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.2375886524822695,
"acc_stderr": 0.025389512552729906,
"acc_norm": 0.2375886524822695,
"acc_norm_stderr": 0.025389512552729906
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.24119947848761408,
"acc_stderr": 0.01092649610203496,
"acc_norm": 0.24119947848761408,
"acc_norm_stderr": 0.01092649610203496
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.29411764705882354,
"acc_stderr": 0.027678468642144703,
"acc_norm": 0.29411764705882354,
"acc_norm_stderr": 0.027678468642144703
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.22712418300653595,
"acc_stderr": 0.016949853279212376,
"acc_norm": 0.22712418300653595,
"acc_norm_stderr": 0.016949853279212376
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.24545454545454545,
"acc_stderr": 0.041220665028782855,
"acc_norm": 0.24545454545454545,
"acc_norm_stderr": 0.041220665028782855
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.2857142857142857,
"acc_stderr": 0.02892058322067558,
"acc_norm": 0.2857142857142857,
"acc_norm_stderr": 0.02892058322067558
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.14427860696517414,
"acc_stderr": 0.024845753212306042,
"acc_norm": 0.14427860696517414,
"acc_norm_stderr": 0.024845753212306042
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.27,
"acc_stderr": 0.04461960433384739,
"acc_norm": 0.27,
"acc_norm_stderr": 0.04461960433384739
},
"harness|hendrycksTest-virology|5": {
"acc": 0.2469879518072289,
"acc_stderr": 0.03357351982064537,
"acc_norm": 0.2469879518072289,
"acc_norm_stderr": 0.03357351982064537
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.1695906432748538,
"acc_stderr": 0.028782108105401712,
"acc_norm": 0.1695906432748538,
"acc_norm_stderr": 0.028782108105401712
},
"harness|truthfulqa:mc|0": {
"mc1": 0.23623011015911874,
"mc1_stderr": 0.014869755015871096,
"mc2": NaN,
"mc2_stderr": NaN
},
"harness|winogrande|5": {
"acc": 0.5351223362273086,
"acc_stderr": 0.014017773120881583
},
"harness|gsm8k|5": {
"acc": 0.0,
"acc_stderr": 0.0
}
}
```
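Once loaded, the results payload above is just a nested dict keyed by task name. As a minimal sketch (reproducing only a small, hand-copied subset of the per-task scores shown above), here is how one might aggregate the MMLU (`hendrycksTest`) accuracies from it:

```python
# Illustrative only: a small subset of the per-task results from the JSON above.
results = {
    "harness|hendrycksTest-abstract_algebra|5": {"acc": 0.22},
    "harness|hendrycksTest-anatomy|5": {"acc": 0.22962962962962963},
    "harness|hendrycksTest-astronomy|5": {"acc": 0.24342105263157895},
    "harness|winogrande|5": {"acc": 0.5351223362273086},
}

# Average accuracy across the MMLU subtasks only (keys prefixed "hendrycksTest-").
mmlu = [v["acc"] for k, v in results.items()
        if k.startswith("harness|hendrycksTest-")]
mmlu_acc = sum(mmlu) / len(mmlu)
print(f"MMLU subset average acc: {mmlu_acc:.4f}")
```

With the full results dict (e.g. loaded from the linked JSON file), the same filter-and-average pattern recovers the aggregated MMLU score reported on the leaderboard.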
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
taylorbollman/wikitext2_tb | ---
dataset_info:
features:
- name: input_ids
sequence: int32
- name: token_type_ids
sequence: int8
- name: attention_mask
sequence: int8
- name: labels
sequence: int64
splits:
- name: test
num_bytes: 3963136
num_examples: 2192
- name: train
num_bytes: 33513088
num_examples: 18536
- name: validation
num_bytes: 3467744
num_examples: 1918
download_size: 11981141
dataset_size: 40943968
configs:
- config_name: default
data_files:
- split: test
path: data/test-*
- split: train
path: data/train-*
- split: validation
path: data/validation-*
---
|
vivym/OmniVid | ---
license: apache-2.0
task_categories:
- text-to-video
---
# OmniVid
YouTube videos: 24,037,110 |
haydn-jones/ZINC20 | ---
dataset_info:
features:
- name: smiles
dtype: large_string
- name: zinc_id
dtype: int64
- name: SELFIES
dtype: string
splits:
- name: train
num_bytes: 393170565049
num_examples: 1538340669
- name: val
num_bytes: 47753116448
num_examples: 192292584
- name: test
num_bytes: 46114402425
num_examples: 192292584
download_size: 174349539018
dataset_size: 487038083922
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: val
path: data/val-*
- split: test
path: data/test-*
license: mit
tags:
- chemistry
- biology
- medical
size_categories:
- 1B<n<10B
---
[ZINC20](https://zinc20.docking.org/) dataset with [SELFIES](https://arxiv.org/abs/1905.13741) added. Any SMILES string that could not be successfully converted was dropped from the dataset.
Every tranche was downloaded; this is not the ~1B-example ML subset from https://files.docking.org/zinc20-ML/.
The dataset was fully shuffled, then split 80%/10%/10% into train/val/test.
The file vocab.csv in the root of the repository contains all of the SELFIES tokens found in the data, with [START], [STOP], and [PAD] added. |
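As an illustrative sketch (not an official loader for this dataset), the hypothetical helpers below split a SELFIES string into its bracketed tokens and wrap a sequence with the `[START]`/`[STOP]`/`[PAD]` tokens described above:

```python
import re

# SELFIES strings are sequences of bracketed tokens, e.g. "[C][O][=N]".
# This regex-based tokenizer and the padding helper are illustrative only;
# the helper names are not part of the dataset.
TOKEN_RE = re.compile(r"\[[^\]]*\]")

def tokenize_selfies(selfies: str) -> list:
    """Split a SELFIES string into its bracketed tokens."""
    return TOKEN_RE.findall(selfies)

def pad_tokens(tokens: list, length: int, pad: str = "[PAD]") -> list:
    """Wrap a token sequence in [START]/[STOP] and right-pad it to `length`."""
    seq = ["[START]", *tokens, "[STOP]"]
    return seq + [pad] * (length - len(seq))

tokens = tokenize_selfies("[C][O][=N]")
print(tokens)                 # ['[C]', '[O]', '[=N]']
print(pad_tokens(tokens, 7))  # ['[START]', '[C]', '[O]', '[=N]', '[STOP]', '[PAD]', '[PAD]']
```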
Hazzzardous/synthetic-translations-6k-unvalidated | ---
license: mit
---
This dataset is unvalidated.
Please do not use it until validation is complete.

- [x] French
- [x] English
- [ ] Italian
- [ ] German
- [ ] Chinese
|
khoomeik/gzipscale-0.51-100M | ---
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 371523195
num_examples: 390625
download_size: 164083002
dataset_size: 371523195
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
adibacsi/customer-support-requests-skeleton | ---
license: mit
---
|
open-llm-leaderboard/details_MBZUAI__lamini-cerebras-111m | ---
pretty_name: Evaluation run of MBZUAI/lamini-cerebras-111m
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [MBZUAI/lamini-cerebras-111m](https://huggingface.co/MBZUAI/lamini-cerebras-111m)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 64 configurations, each one corresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_MBZUAI__lamini-cerebras-111m\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-10-18T18:05:40.911064](https://huggingface.co/datasets/open-llm-leaderboard/details_MBZUAI__lamini-cerebras-111m/blob/main/results_2023-10-18T18-05-40.911064.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.001363255033557047,\n\
\ \"em_stderr\": 0.0003778609196460529,\n \"f1\": 0.02216757550335575,\n\
\ \"f1_stderr\": 0.0009735143977020524,\n \"acc\": 0.25611681136543013,\n\
\ \"acc_stderr\": 0.007024139410202808\n },\n \"harness|drop|3\": {\n\
\ \"em\": 0.001363255033557047,\n \"em_stderr\": 0.0003778609196460529,\n\
\ \"f1\": 0.02216757550335575,\n \"f1_stderr\": 0.0009735143977020524\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.0,\n \"acc_stderr\"\
: 0.0\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.5122336227308603,\n\
\ \"acc_stderr\": 0.014048278820405616\n }\n}\n```"
repo_url: https://huggingface.co/MBZUAI/lamini-cerebras-111m
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_07_19T13_45_36.693423
path:
- '**/details_harness|arc:challenge|25_2023-07-19T13:45:36.693423.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-07-19T13:45:36.693423.parquet'
- config_name: harness_drop_3
data_files:
- split: 2023_10_18T18_05_40.911064
path:
- '**/details_harness|drop|3_2023-10-18T18-05-40.911064.parquet'
- split: latest
path:
- '**/details_harness|drop|3_2023-10-18T18-05-40.911064.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_10_18T18_05_40.911064
path:
- '**/details_harness|gsm8k|5_2023-10-18T18-05-40.911064.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-10-18T18-05-40.911064.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_07_19T13_45_36.693423
path:
- '**/details_harness|hellaswag|10_2023-07-19T13:45:36.693423.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-07-19T13:45:36.693423.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_07_19T13_45_36.693423
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T13:45:36.693423.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-19T13:45:36.693423.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-19T13:45:36.693423.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T13:45:36.693423.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T13:45:36.693423.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-19T13:45:36.693423.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T13:45:36.693423.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T13:45:36.693423.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T13:45:36.693423.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T13:45:36.693423.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-19T13:45:36.693423.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-19T13:45:36.693423.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T13:45:36.693423.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-19T13:45:36.693423.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T13:45:36.693423.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T13:45:36.693423.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T13:45:36.693423.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-19T13:45:36.693423.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T13:45:36.693423.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T13:45:36.693423.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T13:45:36.693423.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T13:45:36.693423.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T13:45:36.693423.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T13:45:36.693423.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T13:45:36.693423.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T13:45:36.693423.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T13:45:36.693423.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T13:45:36.693423.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T13:45:36.693423.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T13:45:36.693423.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T13:45:36.693423.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T13:45:36.693423.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-19T13:45:36.693423.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T13:45:36.693423.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-07-19T13:45:36.693423.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T13:45:36.693423.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T13:45:36.693423.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T13:45:36.693423.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-07-19T13:45:36.693423.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-07-19T13:45:36.693423.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T13:45:36.693423.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T13:45:36.693423.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T13:45:36.693423.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T13:45:36.693423.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-19T13:45:36.693423.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-19T13:45:36.693423.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-19T13:45:36.693423.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T13:45:36.693423.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-19T13:45:36.693423.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T13:45:36.693423.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T13:45:36.693423.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-19T13:45:36.693423.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-19T13:45:36.693423.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-07-19T13:45:36.693423.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T13:45:36.693423.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-07-19T13:45:36.693423.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-19T13:45:36.693423.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T13:45:36.693423.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-19T13:45:36.693423.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-19T13:45:36.693423.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T13:45:36.693423.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T13:45:36.693423.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-19T13:45:36.693423.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T13:45:36.693423.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T13:45:36.693423.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T13:45:36.693423.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T13:45:36.693423.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-19T13:45:36.693423.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-19T13:45:36.693423.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T13:45:36.693423.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-19T13:45:36.693423.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T13:45:36.693423.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T13:45:36.693423.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T13:45:36.693423.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-19T13:45:36.693423.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T13:45:36.693423.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T13:45:36.693423.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T13:45:36.693423.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T13:45:36.693423.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T13:45:36.693423.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T13:45:36.693423.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T13:45:36.693423.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T13:45:36.693423.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T13:45:36.693423.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T13:45:36.693423.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T13:45:36.693423.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T13:45:36.693423.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T13:45:36.693423.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T13:45:36.693423.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-19T13:45:36.693423.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T13:45:36.693423.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-07-19T13:45:36.693423.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T13:45:36.693423.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T13:45:36.693423.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T13:45:36.693423.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-07-19T13:45:36.693423.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-07-19T13:45:36.693423.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T13:45:36.693423.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T13:45:36.693423.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T13:45:36.693423.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T13:45:36.693423.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-19T13:45:36.693423.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-19T13:45:36.693423.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-19T13:45:36.693423.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T13:45:36.693423.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-19T13:45:36.693423.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T13:45:36.693423.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T13:45:36.693423.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-19T13:45:36.693423.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-19T13:45:36.693423.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-07-19T13:45:36.693423.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T13:45:36.693423.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-07-19T13:45:36.693423.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-19T13:45:36.693423.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_07_19T13_45_36.693423
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T13:45:36.693423.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T13:45:36.693423.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_07_19T13_45_36.693423
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-19T13:45:36.693423.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-19T13:45:36.693423.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_07_19T13_45_36.693423
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-19T13:45:36.693423.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-19T13:45:36.693423.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_07_19T13_45_36.693423
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T13:45:36.693423.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T13:45:36.693423.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_07_19T13_45_36.693423
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T13:45:36.693423.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T13:45:36.693423.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_07_19T13_45_36.693423
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-19T13:45:36.693423.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-19T13:45:36.693423.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_07_19T13_45_36.693423
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T13:45:36.693423.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T13:45:36.693423.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_07_19T13_45_36.693423
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T13:45:36.693423.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T13:45:36.693423.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_07_19T13_45_36.693423
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T13:45:36.693423.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T13:45:36.693423.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_07_19T13_45_36.693423
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T13:45:36.693423.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T13:45:36.693423.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_07_19T13_45_36.693423
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-19T13:45:36.693423.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-19T13:45:36.693423.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_07_19T13_45_36.693423
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-19T13:45:36.693423.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-19T13:45:36.693423.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_07_19T13_45_36.693423
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T13:45:36.693423.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T13:45:36.693423.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_07_19T13_45_36.693423
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-19T13:45:36.693423.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-19T13:45:36.693423.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_07_19T13_45_36.693423
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T13:45:36.693423.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T13:45:36.693423.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_07_19T13_45_36.693423
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T13:45:36.693423.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T13:45:36.693423.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_07_19T13_45_36.693423
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T13:45:36.693423.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T13:45:36.693423.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_07_19T13_45_36.693423
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-19T13:45:36.693423.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-19T13:45:36.693423.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_07_19T13_45_36.693423
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T13:45:36.693423.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T13:45:36.693423.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_07_19T13_45_36.693423
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T13:45:36.693423.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T13:45:36.693423.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_07_19T13_45_36.693423
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T13:45:36.693423.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T13:45:36.693423.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_07_19T13_45_36.693423
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T13:45:36.693423.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T13:45:36.693423.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_07_19T13_45_36.693423
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T13:45:36.693423.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T13:45:36.693423.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_07_19T13_45_36.693423
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T13:45:36.693423.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T13:45:36.693423.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_07_19T13_45_36.693423
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T13:45:36.693423.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T13:45:36.693423.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_07_19T13_45_36.693423
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T13:45:36.693423.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T13:45:36.693423.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_07_19T13_45_36.693423
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T13:45:36.693423.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T13:45:36.693423.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_07_19T13_45_36.693423
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T13:45:36.693423.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T13:45:36.693423.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_07_19T13_45_36.693423
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T13:45:36.693423.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T13:45:36.693423.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_07_19T13_45_36.693423
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T13:45:36.693423.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T13:45:36.693423.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_07_19T13_45_36.693423
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T13:45:36.693423.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T13:45:36.693423.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_07_19T13_45_36.693423
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T13:45:36.693423.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T13:45:36.693423.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_07_19T13_45_36.693423
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-19T13:45:36.693423.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-19T13:45:36.693423.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_07_19T13_45_36.693423
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T13:45:36.693423.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T13:45:36.693423.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_07_19T13_45_36.693423
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-07-19T13:45:36.693423.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-07-19T13:45:36.693423.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_07_19T13_45_36.693423
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T13:45:36.693423.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T13:45:36.693423.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_07_19T13_45_36.693423
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T13:45:36.693423.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T13:45:36.693423.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_07_19T13_45_36.693423
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T13:45:36.693423.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T13:45:36.693423.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_07_19T13_45_36.693423
path:
- '**/details_harness|hendrycksTest-management|5_2023-07-19T13:45:36.693423.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-07-19T13:45:36.693423.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_07_19T13_45_36.693423
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-07-19T13:45:36.693423.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-07-19T13:45:36.693423.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_07_19T13_45_36.693423
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T13:45:36.693423.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T13:45:36.693423.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_07_19T13_45_36.693423
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T13:45:36.693423.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T13:45:36.693423.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_07_19T13_45_36.693423
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T13:45:36.693423.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T13:45:36.693423.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_07_19T13_45_36.693423
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T13:45:36.693423.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T13:45:36.693423.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_07_19T13_45_36.693423
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-19T13:45:36.693423.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-19T13:45:36.693423.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_07_19T13_45_36.693423
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-19T13:45:36.693423.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-19T13:45:36.693423.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_07_19T13_45_36.693423
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-19T13:45:36.693423.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-19T13:45:36.693423.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_07_19T13_45_36.693423
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T13:45:36.693423.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T13:45:36.693423.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_07_19T13_45_36.693423
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-19T13:45:36.693423.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-19T13:45:36.693423.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_07_19T13_45_36.693423
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T13:45:36.693423.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T13:45:36.693423.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_07_19T13_45_36.693423
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T13:45:36.693423.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T13:45:36.693423.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_07_19T13_45_36.693423
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-19T13:45:36.693423.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-19T13:45:36.693423.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_07_19T13_45_36.693423
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-19T13:45:36.693423.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-19T13:45:36.693423.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_07_19T13_45_36.693423
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-07-19T13:45:36.693423.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-07-19T13:45:36.693423.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_07_19T13_45_36.693423
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T13:45:36.693423.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T13:45:36.693423.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_07_19T13_45_36.693423
path:
- '**/details_harness|hendrycksTest-virology|5_2023-07-19T13:45:36.693423.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-07-19T13:45:36.693423.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_07_19T13_45_36.693423
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-19T13:45:36.693423.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-19T13:45:36.693423.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_07_19T13_45_36.693423
path:
- '**/details_harness|truthfulqa:mc|0_2023-07-19T13:45:36.693423.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-07-19T13:45:36.693423.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_10_18T18_05_40.911064
path:
- '**/details_harness|winogrande|5_2023-10-18T18-05-40.911064.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-10-18T18-05-40.911064.parquet'
- config_name: results
data_files:
- split: 2023_07_19T13_45_36.693423
path:
- results_2023-07-19T13:45:36.693423.parquet
- split: 2023_10_18T18_05_40.911064
path:
- results_2023-10-18T18-05-40.911064.parquet
- split: latest
path:
- results_2023-10-18T18-05-40.911064.parquet
---
# Dataset Card for Evaluation run of MBZUAI/lamini-cerebras-111m
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/MBZUAI/lamini-cerebras-111m
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [MBZUAI/lamini-cerebras-111m](https://huggingface.co/MBZUAI/lamini-cerebras-111m) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_MBZUAI__lamini-cerebras-111m",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2023-10-18T18:05:40.911064](https://huggingface.co/datasets/open-llm-leaderboard/details_MBZUAI__lamini-cerebras-111m/blob/main/results_2023-10-18T18-05-40.911064.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the "results" and "latest" splits for each eval):
```python
{
"all": {
"em": 0.001363255033557047,
"em_stderr": 0.0003778609196460529,
"f1": 0.02216757550335575,
"f1_stderr": 0.0009735143977020524,
"acc": 0.25611681136543013,
"acc_stderr": 0.007024139410202808
},
"harness|drop|3": {
"em": 0.001363255033557047,
"em_stderr": 0.0003778609196460529,
"f1": 0.02216757550335575,
"f1_stderr": 0.0009735143977020524
},
"harness|gsm8k|5": {
"acc": 0.0,
"acc_stderr": 0.0
},
"harness|winogrande|5": {
"acc": 0.5122336227308603,
"acc_stderr": 0.014048278820405616
}
}
```
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
Omega02gdfdd/bioclip-demo-zero-shot-mistakes | ---
configs:
- config_name: default
data_files:
- split: train
path: data.csv
---
# Dataset Card for Dataset Name
<!-- Provide a quick summary of the dataset. -->
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
makiisthebes/CarLicensePlates | ---
license: mit
---
|
kekunh/stock-related-tweets-vol3 | ---
dataset_info:
features:
- name: text
dtype: string
- name: label
dtype: string
splits:
- name: train
num_bytes: 58062400
num_examples: 400000
download_size: 39335837
dataset_size: 58062400
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
qtoino/form_matcher_demo_flagged | ---
configs:
- config_name: default
data_files:
- split: train
path: data.csv
---
# Dataset Card for Dataset Name
## Dataset Description
- **Homepage:**
- **Repository:**
- **Paper:**
- **Leaderboard:**
- **Point of Contact:**
### Dataset Summary
[More Information Needed]
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
jstonge1/cc_BBOX_kfold | ---
dataset_info:
features:
- name: image_id
dtype: int64
- name: file_name
dtype: string
- name: width
dtype: int64
- name: height
dtype: int64
- name: annotations
list:
- name: area
dtype: float64
- name: bbox
sequence: float64
- name: category_id
dtype: int64
- name: id
dtype: int64
- name: ignore
dtype: int64
- name: image_id
dtype: int64
- name: iscrowd
dtype: int64
- name: segmentation
sequence: 'null'
splits:
- name: part_train_0
num_bytes: 840206
num_examples: 786
- name: part_test_0
num_bytes: 29094
num_examples: 28
- name: part_train_1
num_bytes: 845082
num_examples: 788
- name: part_test_1
num_bytes: 24218
num_examples: 26
- name: part_train_2
num_bytes: 844248
num_examples: 787
- name: part_test_2
num_bytes: 25052
num_examples: 27
- name: part_train_3
num_bytes: 845723
num_examples: 786
- name: part_test_3
num_bytes: 23577
num_examples: 28
- name: part_train_4
num_bytes: 842549
num_examples: 786
- name: part_test_4
num_bytes: 26751
num_examples: 28
download_size: 2437160
dataset_size: 4346500
configs:
- config_name: default
data_files:
- split: part_train_0
path: data/part_train_0-*
- split: part_test_0
path: data/part_test_0-*
- split: part_train_1
path: data/part_train_1-*
- split: part_test_1
path: data/part_test_1-*
- split: part_train_2
path: data/part_train_2-*
- split: part_test_2
path: data/part_test_2-*
- split: part_train_3
path: data/part_train_3-*
- split: part_test_3
path: data/part_test_3-*
- split: part_train_4
path: data/part_train_4-*
- split: part_test_4
path: data/part_test_4-*
---
|
AdapterOcean/data-standardized_cluster_18_std | ---
dataset_info:
features:
- name: message
dtype: string
- name: message_type
dtype: string
- name: message_id
dtype: int64
- name: conversation_id
dtype: int64
- name: cluster
dtype: float64
- name: __index_level_0__
dtype: int64
splits:
- name: train
num_bytes: 9051592
num_examples: 8532
download_size: 3910771
dataset_size: 9051592
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "data-standardized_cluster_18_std"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
joey234/mmlu-college_physics-original-neg-prepend | ---
dataset_info:
features:
- name: question
dtype: string
- name: choices
sequence: string
- name: answer
dtype:
class_label:
names:
'0': A
'1': B
'2': C
'3': D
- name: neg_prompt
dtype: string
splits:
- name: test
num_bytes: 8640
num_examples: 15
download_size: 10053
dataset_size: 8640
---
# Dataset Card for "mmlu-college_physics-original-neg-prepend"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Falah/artistic_prompts | ---
dataset_info:
features:
- name: prompts
dtype: string
splits:
- name: train
num_bytes: 7555056
num_examples: 10000
download_size: 958517
dataset_size: 7555056
---
# Dataset Card for "artistic_prompts"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
rkarhila/SIAK | ---
license: cc-by-nd-4.0
task_categories:
- automatic-speech-recognition
language:
- en
pretty_name: '"Say It Again, Kid!" Native and Finnish accented Children''s English with pronunciation scores'
size_categories:
- 10K<n<100K
---
## "Say It Again, Kid!" (SIAK) Speech data collection
## Training data for pronunciation quality classifiers for children learning English ##
Train set and test set in flac format.
File id key fields are separated by underscores (example: train001fifi05_609_t10892805_living-room.flac)
* Speaker key indicates train or test set, and a running number for speaker. _speaker key is train001_
* Native language: "fifi" for Finnish, "enuk" for UK English, "othr" for other. _Native language fifi_
* Age of speaker in years (if known). _This speaker was 05 years old at the start of the recording period_
* Sample number. _This is the 609th sample spoken by the speaker. (Some kids really enjoyed contributing!)_
* Seconds from first sample given. _10892805 seconds since first recording. This speaker contributed the samples over a 4 month period_
* Target utterance text with spaces etc. replaced by dashes. _Utterance to be spoken was "living room"_
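The field layout above can be unpacked with a small helper. This is an illustrative sketch only: it assumes a 4-letter native-language code, a 2-digit age, and the `t` prefix on the seconds field, exactly as in the example id.

```python
def parse_siak_id(file_id):
    """Parse a SIAK file id such as 'train001fifi05_609_t10892805_living-room.flac'.

    Assumes the layout described in the card: speaker key + 4-letter native
    language + 2-digit age, then sample number, seconds since first sample
    (with a 't' prefix), and the dash-joined target utterance.
    """
    stem = file_id.rsplit(".", 1)[0]                      # drop the .flac extension
    head, sample_no, seconds, target = stem.split("_", 3)
    speaker, lang, age = head[:-6], head[-6:-2], head[-2:]
    return {
        "speaker": speaker,                               # e.g. train001
        "native_language": lang,                          # fifi / enuk / othr
        "age": int(age),                                  # years at start of recording
        "sample_number": int(sample_no),
        "seconds_from_first": int(seconds.lstrip("t")),
        "utterance": target.replace("-", " "),
    }
```

For the example id above, this yields speaker `train001`, native language `fifi`, age 5, sample number 609, 10892805 seconds from the first sample, and the utterance "living room".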
## Release history ##
This data is derived from the data collected in the SIAK project (2014-2018).
Participants agreed that their data can be published anonymously. Unfortunately the General Data Protection Regulation (GDPR)
became effective before the data was ready for release, and the publication effort halted.
However, the data was leased to an ill-fated startup that started operations a few weeks before COVID-19 lockdowns.
This collection is a derivation of the SIAK data with any strongly identifying metadata removed for use by the now bankrupt startup.
We were involved in collecting, storing and processing the data in the SIAK project and have gone through the speech samples
in enough detail to be assured that the data can be regarded as non-personal and thus exempt from the GDPR, as it consists of only single words or very short utterance repetitions, making it next to impossible to identify a speaker.
Reima Karhila and Anna Smolander
SIAK project researchers and unlucky startup founders
We emphasize that by "no derivatives" we mean that you cannot use the audio samples as part of any work that is not directly related to describing the dataset in a speech technology or scientific language learning context. You may include them in a scientific presentation when the context is clearly to present the original data and not to use the data in another fashion.
Commercial use of speech samples for building and evaluation of speech technology models is _not_ prohibited.
If you publish work based on this dataset, please cite _Karhila et al.: Pronunciation Scoring System Embedded into Children's Foreign Language Learning Games with Experimental Verification of Learning Benefits, SLATE 2023_.
|
cis-lmu/GlotStoryBook-Nalibali | ---
license: cc0-1.0
task_categories:
- translation
- text-generation
- text2text-generation
configs:
- config_name: default
data_files:
- split: test
path: "nalibali.csv"
multilinguality:
- multilingual
- translation
language:
- afr
- eng
- nbl
- nso
- sot
- ssw
- tsn
- tso
- ven
- xho
- zul
tags:
- glotstorybook
- story
- book
- african
- glot
pretty_name: GlotStoryBook-Nalibali
---
## Dataset Description
Parallel storybooks for African languages and English (11 language codes). The same `parallel_id` in different languages indicates that these stories are parallel.
The data was collected from [nalibali.org](https://www.nalibali.org/story-resources/multilingual-stories).
This repository is part of the GlotStoryBook project, check other datasources (African Storybook, Pratham Books, Little Cree Books and LIDA Stories) in [cis-lmu/GlotStoryBook](https://huggingface.co/datasets/cis-lmu/GlotStoryBook) and parallel version in [cis-lmu/GlotStoryBook-MT](https://huggingface.co/datasets/cis-lmu/GlotStoryBook-MT).
- **GitHub Repository:** [github](https://github.com/cisnlp/GlotStoryBook)
- **Paper:** [paper](https://arxiv.org/abs/2310.16248)
- **Point of Contact:** amir@cis.lmu.de
## Usage (HF Loader)
```python
from datasets import load_dataset
dataset = load_dataset('cis-lmu/GlotStoryBook-Nalibali')
print(dataset['test'][0]) # First row of data
```
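Since parallel stories share a `parallel_id`, rows can be grouped into per-story dictionaries keyed by language. The snippet below is a sketch: the `parallel_id` column is documented above, but the `language` column name is an assumption, so check `dataset['test'].column_names` for the actual field names first.

```python
from collections import defaultdict

def group_parallel_stories(rows, id_key="parallel_id", lang_key="language"):
    """Group dataset rows into {parallel_id: {language: row}} dictionaries.

    Note: `lang_key` is an assumed column name -- verify it against
    dataset.column_names before use.
    """
    groups = defaultdict(dict)
    for row in rows:
        groups[row[id_key]][row[lang_key]] = row
    return dict(groups)
```

With the grouping in hand, e.g. `groups[some_id]["eng"]` and `groups[some_id]["zul"]` give aligned English/isiZulu versions of the same story, which is convenient for building translation pairs.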
## Download
If you are not a fan of the HF dataloader, download it directly:
```python
! wget https://huggingface.co/datasets/cis-lmu/GlotStoryBook-Nalibali/raw/main/nalibali.csv
```
## License and Copyright
We do not own any of the text from which this data has been extracted.
All the files are collected from [nalibali.org](https://www.nalibali.org/story-resources/multilingual-stories).
Based on the [submission](https://www.nalibali.org/story-resources/your-stories) requirements for new stories, the stories are original and the submitter must own all rights to the story.
Also, based on the [terms of use](https://www.nalibali.org/terms-use), there is no limitation on the use of the content of the site.
Besides, the [robots.txt](https://www.nalibali.org/robots.txt) of the website also allows the stories to be included in bots and search engines, and the stories' text is already cached in Google Search.
We have included the name of the author and the link to the story in the dataset as well.
We license the code, actual packaging, and the metadata of this data under the cc0-1.0.
## Citation
If you use any part of this code and data in your research, please cite it (along with nalibali.org) using the following BibTeX entry.
This work is part of the [GlotLID](https://github.com/cisnlp/GlotLID) project.
```
@inproceedings{
kargaran2023glotlid,
title={{GlotLID}: Language Identification for Low-Resource Languages},
author={Kargaran, Amir Hossein and Imani, Ayyoob and Yvon, Fran{\c{c}}ois and Sch{\"u}tze, Hinrich},
booktitle={The 2023 Conference on Empirical Methods in Natural Language Processing},
year={2023},
url={https://openreview.net/forum?id=dl4e3EBz5j}
}
``` |
Yura32000/cifar10 | ---
dataset_info:
features:
- name: img
dtype: image
- name: label
dtype:
class_label:
names:
'0': airplane
'1': automobile
'2': bird
'3': cat
'4': deer
'5': dog
'6': frog
'7': horse
'8': ship
'9': truck
splits:
- name: test
num_bytes: 22731580.0
num_examples: 10000
download_size: 23940850
dataset_size: 22731580.0
configs:
- config_name: default
data_files:
- split: test
path: data/test-*
---
|
ubaada/booksum-complete-cleaned | ---
task_categories:
- summarization
- text-generation
language:
- en
pretty_name: BookSum Summarization Dataset Clean
size_categories:
- 1K<n<10K
configs:
- config_name: books
data_files:
- split: train
path: "books/train.jsonl"
- split: test
path: "books/test.jsonl"
- split: validation
path: "books/val.jsonl"
- config_name: chapters
data_files:
- split: train
path: "chapters/train.jsonl"
- split: test
path: "chapters/test.jsonl"
- split: validation
path: "chapters/val.jsonl"
---
# Table of Contents
1. [Description](#description)
2. [Usage](#usage)
3. [Distribution](#distribution)
- [Chapters Dataset](#chapters-dataset)
- [Books Dataset](#books-dataset)
4. [Structure](#structure)
5. [Results and Comparison with kmfoda/booksum](#results-and-comparison-with-kmfodabooksum)
# Description:
This repository contains the Booksum dataset introduced in the paper [BookSum: A Collection of Datasets for Long-form Narrative Summarization
](https://arxiv.org/abs/2105.08209).
This dataset includes both book and chapter summaries from the BookSum dataset (unlike kmfoda/booksum, which only contains the chapter dataset). Some mismatched summaries have been corrected, and unnecessary columns have been discarded. It contains minimal text-to-summary rows; as there are multiple summaries for a given text, each row contains an array of summaries.
# Usage
Note: Make sure you have [version 2.14.0 or later of the "datasets" library](https://github.com/huggingface/datasets/releases/tag/2.14.0) installed to load the dataset successfully.
```
from datasets import load_dataset
book_data = load_dataset("ubaada/booksum-complete-cleaned", "books")
chapter_data = load_dataset("ubaada/booksum-complete-cleaned", "chapters")
# Print the 1st book
print(book_data["train"][0]['text'])
# Print the summary of the 1st book
print(book_data["train"][0]['summary'][0]['text'])
```
# Distribution
<div style="display: inline-block; vertical-align: top; width: 45%;">
## Chapters Dataset
| Split | Total Sum. | Missing Sum. | Successfully Processed | Chapters |
|---------|------------|--------------|------------------------|------|
| Train | 9712 | 178 | 9534 (98.17%) | 5653 |
| Test | 1432 | 0 | 1432 (100.0%) | 950 |
| Val | 1485 | 0 | 1485 (100.0%) | 854 |
</div>
<div style="display: inline-block; vertical-align: top; width: 45%; margin-left: 5%;">
## Books Dataset
| Split | Total Sum. | Missing Sum. | Successfully Processed | Books |
|---------|------------|--------------|------------------------|------|
| Train | 314 | 0 | 314 (100.0%) | 151 |
| Test | 46 | 0 | 46 (100.0%) | 17 |
| Val | 45 | 0 | 45 (100.0%) | 19 |
</div>
# Structure:
```
Chapters Dataset
0 - bid (book id)
1 - book_title
2 - chapter_id
3 - text (raw chapter text)
4 - summary (list of summaries from different sources)
- {source, text (summary), analysis}
...
5 - is_aggregate (bool) (if true, then the text contains more than one chapter)
Books Dataset:
0 - bid (book id)
1 - title
2 - text (raw text)
4 - summary (list of summaries from different sources)
- {source, text (summary), analysis}
...
```
# Results and Comparison with kmfoda/booksum
Tested on the 'test' split of the chapter sub-dataset. There are slight improvements in R1/R2 scores compared to the other BookSum repo, likely due to the work done on cleaning the misalignments in the alignment file. In the plot for this dataset, the first summary \[0\] is chosen for each chapter. If the best reference summary is chosen from the list for each chapter, there are further improvements, but they are not shown here for fairness.

|
mask-distilled-one-sec-cv12/chunk_126 | ---
dataset_info:
features:
- name: logits
sequence: float32
- name: mfcc
sequence:
sequence: float64
splits:
- name: train
num_bytes: 1249627720
num_examples: 245410
download_size: 1274836307
dataset_size: 1249627720
---
# Dataset Card for "chunk_126"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
FINNUMBER/QA_Instruction | ---
license: mit
configs:
- config_name: Multiple-Choice QA
data_files:
- split: train
path: data/MCQA_Rationale.csv
- config_name: Binary QA
data_files:
- split: train
path: data/BQA_Rationale.csv
- config_name: Extractive QA
data_files:
- split: train
path: data/EQA_Rationale.csv
- config_name: Numerical Reasoning Arithmetic
data_files:
- split: train
path: data/numerical-reasoning-arithmetic.csv
- config_name: Numerical Reasoning Comparison
data_files:
- split: train
path: data/numerical-reasoning-comparison.csv
- config_name: Numerical Reasoning Extraction
data_files:
- split: train
path: data/numerical-reasoning-extraction.csv
---
# FINCH: CoT-Instruction Dataset for Korean Finance
<img src="assets/finch_logo.png" width="400">
## Overview
__*FINCH*__ is a CoT-Instruction dataset covering Korean financial tasks including: Multiple-Choice Question Answering (MCQA),
Extractive Question Answering (EQA), Binary Question Answering (BQA), Numerical Reasoning, Tabular Reasoning and Sentiment Analysis.
Additional details, research paper and further updates are coming! Stay Tuned. |
Phoenixrayne6/TaylorGrodin-Paintbrush-III-RVC | ---
license: gpl
---
|
leoleo2024/VOZDUKE | ---
license: openrail
---
|
Weich24/NasaData | ---
dataset_info:
features:
- name: 'Unnamed: 0'
dtype: string
- name: case
dtype: int64
- name: run
dtype: int64
- name: VB
dtype: float64
- name: time
dtype: int64
- name: DOC
dtype: float64
- name: feed
dtype: float64
- name: material
dtype: int64
- name: smcAC
dtype: float64
- name: smcDC
dtype: float64
- name: vib_table
dtype: float64
- name: vib_spindle
dtype: float64
- name: AE_table
dtype: float64
- name: AE_spindle
dtype: float64
splits:
- name: train
num_bytes: 15224.119760479041
num_examples: 133
- name: test
num_bytes: 3891.880239520958
num_examples: 34
download_size: 20174
dataset_size: 19116.0
---
# Dataset Card for "NasaData"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
ytzi/the-stack-dedup-python-filtered-docstrings-gpt2 | ---
dataset_info:
features:
- name: content
dtype: string
- name: input_ids
sequence: int32
- name: ratio_char_token
dtype: float64
- name: token_count
dtype: int64
splits:
- name: train
num_bytes: 86634631995
num_examples: 12760182
download_size: 27198966561
dataset_size: 86634631995
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
open-llm-leaderboard/details_lvkaokao__llama2-7b-hf-chat-lora-v3 | ---
pretty_name: Evaluation run of lvkaokao/llama2-7b-hf-chat-lora-v3
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [lvkaokao/llama2-7b-hf-chat-lora-v3](https://huggingface.co/lvkaokao/llama2-7b-hf-chat-lora-v3)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 64 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 2 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the agregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_lvkaokao__llama2-7b-hf-chat-lora-v3\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-09-16T22:22:04.429370](https://huggingface.co/datasets/open-llm-leaderboard/details_lvkaokao__llama2-7b-hf-chat-lora-v3/blob/main/results_2023-09-16T22-22-04.429370.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.0026216442953020135,\n\
\ \"em_stderr\": 0.0005236685642966032,\n \"f1\": 0.05310088087248333,\n\
\ \"f1_stderr\": 0.0014130017638603535,\n \"acc\": 0.3891916037418029,\n\
\ \"acc_stderr\": 0.007656807657466876\n },\n \"harness|drop|3\": {\n\
\ \"em\": 0.0026216442953020135,\n \"em_stderr\": 0.0005236685642966032,\n\
\ \"f1\": 0.05310088087248333,\n \"f1_stderr\": 0.0014130017638603535\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.015163002274450341,\n \
\ \"acc_stderr\": 0.0033660229497263472\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7632202052091555,\n \"acc_stderr\": 0.011947592365207404\n\
\ }\n}\n```"
repo_url: https://huggingface.co/lvkaokao/llama2-7b-hf-chat-lora-v3
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_08_24T04_41_09.477230
path:
- '**/details_harness|arc:challenge|25_2023-08-24T04:41:09.477230.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-08-24T04:41:09.477230.parquet'
- config_name: harness_drop_3
data_files:
- split: 2023_09_16T22_22_04.429370
path:
- '**/details_harness|drop|3_2023-09-16T22-22-04.429370.parquet'
- split: latest
path:
- '**/details_harness|drop|3_2023-09-16T22-22-04.429370.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_09_16T22_22_04.429370
path:
- '**/details_harness|gsm8k|5_2023-09-16T22-22-04.429370.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-09-16T22-22-04.429370.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_08_24T04_41_09.477230
path:
- '**/details_harness|hellaswag|10_2023-08-24T04:41:09.477230.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-08-24T04:41:09.477230.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_08_24T04_41_09.477230
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-24T04:41:09.477230.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-24T04:41:09.477230.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-24T04:41:09.477230.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-24T04:41:09.477230.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-24T04:41:09.477230.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-24T04:41:09.477230.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-24T04:41:09.477230.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-24T04:41:09.477230.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-24T04:41:09.477230.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-24T04:41:09.477230.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-24T04:41:09.477230.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-24T04:41:09.477230.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-24T04:41:09.477230.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-24T04:41:09.477230.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-24T04:41:09.477230.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-24T04:41:09.477230.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-24T04:41:09.477230.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-24T04:41:09.477230.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-24T04:41:09.477230.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-24T04:41:09.477230.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-24T04:41:09.477230.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-24T04:41:09.477230.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-24T04:41:09.477230.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-24T04:41:09.477230.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-24T04:41:09.477230.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-24T04:41:09.477230.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-24T04:41:09.477230.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-24T04:41:09.477230.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-24T04:41:09.477230.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-24T04:41:09.477230.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-24T04:41:09.477230.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-24T04:41:09.477230.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-24T04:41:09.477230.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-24T04:41:09.477230.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-24T04:41:09.477230.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-24T04:41:09.477230.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-24T04:41:09.477230.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-24T04:41:09.477230.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-24T04:41:09.477230.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-24T04:41:09.477230.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-24T04:41:09.477230.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-24T04:41:09.477230.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-24T04:41:09.477230.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-24T04:41:09.477230.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-24T04:41:09.477230.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-24T04:41:09.477230.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-24T04:41:09.477230.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-24T04:41:09.477230.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-24T04:41:09.477230.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-24T04:41:09.477230.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-24T04:41:09.477230.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-24T04:41:09.477230.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-24T04:41:09.477230.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-24T04:41:09.477230.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-24T04:41:09.477230.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-24T04:41:09.477230.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-24T04:41:09.477230.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-24T04:41:09.477230.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-24T04:41:09.477230.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-24T04:41:09.477230.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-24T04:41:09.477230.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-24T04:41:09.477230.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-24T04:41:09.477230.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-24T04:41:09.477230.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-24T04:41:09.477230.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-24T04:41:09.477230.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-24T04:41:09.477230.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-24T04:41:09.477230.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-24T04:41:09.477230.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-24T04:41:09.477230.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-24T04:41:09.477230.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-24T04:41:09.477230.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-24T04:41:09.477230.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-24T04:41:09.477230.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-24T04:41:09.477230.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-24T04:41:09.477230.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-24T04:41:09.477230.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-24T04:41:09.477230.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-24T04:41:09.477230.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-24T04:41:09.477230.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-24T04:41:09.477230.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-24T04:41:09.477230.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-24T04:41:09.477230.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-24T04:41:09.477230.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-24T04:41:09.477230.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-24T04:41:09.477230.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-24T04:41:09.477230.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-24T04:41:09.477230.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-24T04:41:09.477230.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-24T04:41:09.477230.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-24T04:41:09.477230.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-24T04:41:09.477230.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-24T04:41:09.477230.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-24T04:41:09.477230.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-24T04:41:09.477230.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-24T04:41:09.477230.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-24T04:41:09.477230.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-24T04:41:09.477230.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-24T04:41:09.477230.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-24T04:41:09.477230.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-24T04:41:09.477230.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-24T04:41:09.477230.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-24T04:41:09.477230.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-24T04:41:09.477230.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-24T04:41:09.477230.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-24T04:41:09.477230.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-24T04:41:09.477230.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-24T04:41:09.477230.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-24T04:41:09.477230.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-24T04:41:09.477230.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-24T04:41:09.477230.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-24T04:41:09.477230.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-24T04:41:09.477230.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-24T04:41:09.477230.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_08_24T04_41_09.477230
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-24T04:41:09.477230.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-24T04:41:09.477230.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_08_24T04_41_09.477230
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-24T04:41:09.477230.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-24T04:41:09.477230.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_08_24T04_41_09.477230
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-24T04:41:09.477230.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-24T04:41:09.477230.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_08_24T04_41_09.477230
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-24T04:41:09.477230.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-24T04:41:09.477230.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_08_24T04_41_09.477230
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-24T04:41:09.477230.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-24T04:41:09.477230.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_08_24T04_41_09.477230
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-24T04:41:09.477230.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-24T04:41:09.477230.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_08_24T04_41_09.477230
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-24T04:41:09.477230.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-24T04:41:09.477230.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_08_24T04_41_09.477230
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-24T04:41:09.477230.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-24T04:41:09.477230.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_08_24T04_41_09.477230
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-24T04:41:09.477230.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-24T04:41:09.477230.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_08_24T04_41_09.477230
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-24T04:41:09.477230.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-24T04:41:09.477230.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_08_24T04_41_09.477230
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-24T04:41:09.477230.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-24T04:41:09.477230.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_08_24T04_41_09.477230
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-24T04:41:09.477230.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-24T04:41:09.477230.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_08_24T04_41_09.477230
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-24T04:41:09.477230.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-24T04:41:09.477230.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_08_24T04_41_09.477230
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-24T04:41:09.477230.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-24T04:41:09.477230.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_08_24T04_41_09.477230
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-24T04:41:09.477230.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-24T04:41:09.477230.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_08_24T04_41_09.477230
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-24T04:41:09.477230.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-24T04:41:09.477230.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_08_24T04_41_09.477230
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-24T04:41:09.477230.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-24T04:41:09.477230.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_08_24T04_41_09.477230
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-24T04:41:09.477230.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-24T04:41:09.477230.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_08_24T04_41_09.477230
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-24T04:41:09.477230.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-24T04:41:09.477230.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_08_24T04_41_09.477230
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-24T04:41:09.477230.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-24T04:41:09.477230.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_08_24T04_41_09.477230
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-24T04:41:09.477230.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-24T04:41:09.477230.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_08_24T04_41_09.477230
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-24T04:41:09.477230.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-24T04:41:09.477230.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_08_24T04_41_09.477230
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-24T04:41:09.477230.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-24T04:41:09.477230.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_08_24T04_41_09.477230
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-24T04:41:09.477230.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-24T04:41:09.477230.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_08_24T04_41_09.477230
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-24T04:41:09.477230.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-24T04:41:09.477230.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_08_24T04_41_09.477230
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-24T04:41:09.477230.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-24T04:41:09.477230.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_08_24T04_41_09.477230
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-24T04:41:09.477230.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-24T04:41:09.477230.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_08_24T04_41_09.477230
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-24T04:41:09.477230.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-24T04:41:09.477230.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_08_24T04_41_09.477230
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-24T04:41:09.477230.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-24T04:41:09.477230.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_08_24T04_41_09.477230
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-24T04:41:09.477230.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-24T04:41:09.477230.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_08_24T04_41_09.477230
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-24T04:41:09.477230.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-24T04:41:09.477230.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_08_24T04_41_09.477230
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-24T04:41:09.477230.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-24T04:41:09.477230.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_08_24T04_41_09.477230
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-24T04:41:09.477230.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-24T04:41:09.477230.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_08_24T04_41_09.477230
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-24T04:41:09.477230.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-24T04:41:09.477230.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_08_24T04_41_09.477230
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-24T04:41:09.477230.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-24T04:41:09.477230.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_08_24T04_41_09.477230
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-24T04:41:09.477230.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-24T04:41:09.477230.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_08_24T04_41_09.477230
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-24T04:41:09.477230.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-24T04:41:09.477230.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_08_24T04_41_09.477230
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-24T04:41:09.477230.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-24T04:41:09.477230.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_08_24T04_41_09.477230
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-24T04:41:09.477230.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-24T04:41:09.477230.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_08_24T04_41_09.477230
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-24T04:41:09.477230.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-24T04:41:09.477230.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_08_24T04_41_09.477230
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-24T04:41:09.477230.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-24T04:41:09.477230.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_08_24T04_41_09.477230
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-24T04:41:09.477230.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-24T04:41:09.477230.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_08_24T04_41_09.477230
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-24T04:41:09.477230.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-24T04:41:09.477230.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_08_24T04_41_09.477230
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-24T04:41:09.477230.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-24T04:41:09.477230.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_08_24T04_41_09.477230
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-24T04:41:09.477230.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-24T04:41:09.477230.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_08_24T04_41_09.477230
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-24T04:41:09.477230.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-24T04:41:09.477230.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_08_24T04_41_09.477230
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-24T04:41:09.477230.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-24T04:41:09.477230.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_08_24T04_41_09.477230
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-24T04:41:09.477230.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-24T04:41:09.477230.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_08_24T04_41_09.477230
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-24T04:41:09.477230.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-24T04:41:09.477230.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_08_24T04_41_09.477230
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-24T04:41:09.477230.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-24T04:41:09.477230.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_08_24T04_41_09.477230
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-24T04:41:09.477230.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-24T04:41:09.477230.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_08_24T04_41_09.477230
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-24T04:41:09.477230.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-24T04:41:09.477230.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_08_24T04_41_09.477230
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-24T04:41:09.477230.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-24T04:41:09.477230.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_08_24T04_41_09.477230
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-24T04:41:09.477230.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-24T04:41:09.477230.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_08_24T04_41_09.477230
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-24T04:41:09.477230.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-24T04:41:09.477230.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_08_24T04_41_09.477230
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-24T04:41:09.477230.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-24T04:41:09.477230.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_08_24T04_41_09.477230
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-24T04:41:09.477230.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-24T04:41:09.477230.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_08_24T04_41_09.477230
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-24T04:41:09.477230.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-24T04:41:09.477230.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_09_16T22_22_04.429370
path:
- '**/details_harness|winogrande|5_2023-09-16T22-22-04.429370.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-09-16T22-22-04.429370.parquet'
- config_name: results
data_files:
- split: 2023_09_16T22_22_04.429370
path:
- results_2023-09-16T22-22-04.429370.parquet
- split: latest
path:
- results_2023-09-16T22-22-04.429370.parquet
---
# Dataset Card for Evaluation run of lvkaokao/llama2-7b-hf-chat-lora-v3
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/lvkaokao/llama2-7b-hf-chat-lora-v3
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [lvkaokao/llama2-7b-hf-chat-lora-v3](https://huggingface.co/lvkaokao/llama2-7b-hf-chat-lora-v3) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_lvkaokao__llama2-7b-hf-chat-lora-v3",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2023-09-16T22:22:04.429370](https://huggingface.co/datasets/open-llm-leaderboard/details_lvkaokao__llama2-7b-hf-chat-lora-v3/blob/main/results_2023-09-16T22-22-04.429370.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"em": 0.0026216442953020135,
"em_stderr": 0.0005236685642966032,
"f1": 0.05310088087248333,
"f1_stderr": 0.0014130017638603535,
"acc": 0.3891916037418029,
"acc_stderr": 0.007656807657466876
},
"harness|drop|3": {
"em": 0.0026216442953020135,
"em_stderr": 0.0005236685642966032,
"f1": 0.05310088087248333,
"f1_stderr": 0.0014130017638603535
},
"harness|gsm8k|5": {
"acc": 0.015163002274450341,
"acc_stderr": 0.0033660229497263472
},
"harness|winogrande|5": {
"acc": 0.7632202052091555,
"acc_stderr": 0.011947592365207404
}
}
```
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
j-krzywdziak/test2 | ---
annotations_creators:
- expert-generated
language:
- pl
license:
- mit
multilinguality:
- monolingual
dataset_info:
- config_name: config
features:
- name: audio_id
dtype: string
- name: audio
dtype:
audio:
sampling_rate: 16000
- name: text
dtype: string
---
# Dataset Card for [Dataset Name]
## Table of Contents
- [Table of Contents](#table-of-contents)
- [Dataset Description](#dataset-description)
- [Dataset Summary](#dataset-summary)
- [Supported Tasks and Leaderboards](#supported-tasks-and-leaderboards)
- [Languages](#languages)
- [Dataset Structure](#dataset-structure)
- [Data Instances](#data-instances)
- [Data Fields](#data-fields)
- [Data Splits](#data-splits)
- [Dataset Creation](#dataset-creation)
- [Curation Rationale](#curation-rationale)
- [Source Data](#source-data)
- [Annotations](#annotations)
- [Personal and Sensitive Information](#personal-and-sensitive-information)
- [Considerations for Using the Data](#considerations-for-using-the-data)
- [Social Impact of Dataset](#social-impact-of-dataset)
- [Discussion of Biases](#discussion-of-biases)
- [Other Known Limitations](#other-known-limitations)
- [Additional Information](#additional-information)
- [Dataset Curators](#dataset-curators)
- [Licensing Information](#licensing-information)
- [Citation Information](#citation-information)
- [Contributions](#contributions)
## Dataset Description
- **Homepage:**
- **Repository:**
- **Paper:**
- **Leaderboard:**
- **Point of Contact:**
### Dataset Summary
[More Information Needed]
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
Thanks to [@github-username](https://github.com/<github-username>) for adding this dataset. |
Michaelkassouf/Ferrari_SD1 | ---
dataset_info:
features:
- name: image
dtype: string
- name: caption
dtype: string
splits:
- name: train
num_bytes: 3104037
num_examples: 35553
download_size: 1048392
dataset_size: 3104037
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
joey234/mmlu-medical_genetics-rule-neg | ---
dataset_info:
features:
- name: choices
sequence: string
- name: answer
dtype:
class_label:
names:
'0': A
'1': B
'2': C
'3': D
- name: question
dtype: string
splits:
- name: test
num_bytes: 21208
num_examples: 100
download_size: 15423
dataset_size: 21208
---
# Dataset Card for "mmlu-medical_genetics-rule-neg"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Peng-Wang/ImageDream | ---
license: apache-2.0
---
Some results from ImageDream for easy comparison.
|
zjunlp/KnowEdit | ---
license: mit
language:
- en
task_categories:
- text-generation
- question-answering
- text2text-generation
tags:
- knowledge-editing
- model-editing
- large-language-model
---
# KnowEdit: A Benchmark of Knowledge Editing for LLMs
This README is about reproducing the paper [A Comprehensive Study of Knowledge Editing for Large Language Models](https://arxiv.org/abs/2401.01286).
You can use [EasyEdit](https://github.com/zjunlp/EasyEdit) to load and use this benchmark.
## Table of Contents
- [Dataset Structure](#Dataset-Structure)
- [Get Started Quickly](#Get-started-quickly)
- [Training an Editor with KnowEdit](#Training-an-Editor-with-KnowEdit)
- [Performance](#Performance)
- [The Composition of Dataset](#The-Composition-of-Dataset)
---
This README explains how to use [EasyEdit](https://github.com/zjunlp/EasyEdit) with the KnowEdit dataset. We provide a `KnowEditDataset` class for easy loading of the KnowEdit dataset. To use it, simply write:
```python
dataset = KnowEditDataset('the_json_path')
```
## Dataset Structure
KnowEdit is tailored for knowledge editing tasks. It encompasses six tasks: ZsRE, Wiki<sub>recent</sub>, Wiki<sub>counterfact</sub>, WikiBio, ConvSent, and Sanitation. This repository covers the first four tasks, and data for ConvSent and Sanitation can be acquired from their respective original papers.
The datasets used can be downloaded from HuggingFace, WiseModel, or ModelScope.
| **dataset** | HuggingFace| WiseModel | ModelScope |
| :--------: | :-----------------------------------------------------------------------------------------------: | :-----------------------------------------------------------------------------: | :--------------------------------------------------------------------------------: |
| KnowEdit | [[HuggingFace]](https://huggingface.co/datasets/zjunlp/KnowEdit) | [[WiseModel]](https://wisemodel.cn/datasets/zjunlp/KnowEdit) | [[ModelScope]](https://www.modelscope.cn/datasets/zjunlp/KnowEdit) |
Unzip the file and put it in `./data`.
<table class="tg">
<thead>
<tr>
<th class="tg-7btt">Task</th>
<th class="tg-7btt">Knowledge Insertion</th>
<th class="tg-7btt" colspan="4">Knowledge Modification</th>
<th class="tg-7btt">Knowledge Erasure</th>
</tr>
</thead>
<tbody>
<tr>
<td class="tg-c3ow">Datasets</td>
<td class="tg-c3ow">Wiki<sub>recent</sub></td>
<td class="tg-c3ow">ZsRE</td>
<td class="tg-c3ow">WikiBio</td>
<td class="tg-c3ow"> WikiData<sub>counterfact</sub></td>
<td class="tg-c3ow">Convsent</td>
<td class="tg-c3ow">Sanitation</td>
</tr>
<tr>
<td class="tg-c3ow">Type</td>
<td class="tg-c3ow">Fact</td>
<td class="tg-c3ow">Question Answering</td>
<td class="tg-c3ow">Hallucination</td>
<td class="tg-c3ow">Counterfact</td>
<td class="tg-c3ow">Sentiment</td>
<td class="tg-c3ow">Unwanted Info</td>
</tr>
<tr>
<td class="tg-c3ow"># Train</td>
<td class="tg-c3ow">570</td>
<td class="tg-c3ow">10,000</td>
<td class="tg-c3ow">592</td>
<td class="tg-c3ow">1,455</td>
<td class="tg-c3ow">14,390</td>
<td class="tg-c3ow">80</td>
</tr>
<tr>
<td class="tg-c3ow"># Test</td>
<td class="tg-c3ow">1,266</td>
    <td class="tg-c3ow">1,230</td>
<td class="tg-c3ow">1,392</td>
<td class="tg-c3ow">885</td>
<td class="tg-c3ow">800</td>
<td class="tg-c3ow">80</td>
</tr>
</tbody>
</table>
---
Different JSON files have distinct data types. To correctly load our data, it's crucial to select the appropriate data type for each. For instance:
- For the **WikiBio** dataset, we should use the `wikibio` data type.
- For the **ZsRE** dataset, we should use the `zsre` data type.
- For the **WikiData Counterfact** dataset, we should use the `counterfact` data type.
- For the **WikiData Recent** dataset, we should use the `recent` data type.
- For the **convsent** dataset, we should use `run_convsent_llama2.py`.
- For the **Sanitation** dataset, we should use `run_trivia_llama2.py`.
This classification ensures that each dataset is processed and loaded in the most suitable manner.
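Expressed as a small lookup, the mapping above might look like the sketch below (the `DATATYPE_BY_DATASET` name and helper are illustrative only, not part of EasyEdit):

```python
# Illustrative mapping from KnowEdit sub-dataset to the --datatype flag
# used by run_knowedit_llama2.py (names here are for exposition only).
DATATYPE_BY_DATASET = {
    "WikiBio": "wikibio",
    "ZsRE": "zsre",
    "WikiData Counterfact": "counterfact",
    "WikiData Recent": "recent",
}

def datatype_for(dataset_name: str) -> str:
    """Return the --datatype value for a given KnowEdit sub-dataset."""
    try:
        return DATATYPE_BY_DATASET[dataset_name]
    except KeyError:
        raise ValueError(
            f"No datatype flag for {dataset_name!r}; "
            "convsent and Sanitation use dedicated scripts instead."
        )
```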
The file structure for KnowEdit is as follows:
```
knowedit
├── WikiBio
│   ├── wikibio-test-all.json
│   └── wikibio-train-all.json
├── ZsRE
│   └── ZsRE-test-all.json
├── wiki_counterfact
│   ├── test_cf.json
│   └── train_cf.json
├── convsent
│   ├── blender_test.json
│   ├── blender_train.json
│   └── blender_val.json
├── Sanitation
│   ├── trivia_qa_test.json
│   └── trivia_qa_train.json
└── wiki_recent
    ├── recent_test.json
    └── recent_train.json
```
## Get started quickly
We have already provided some scripts to help users easily utilize EasyEdit on KnowEdit. Different JSON files require different scripts; please select the appropriate one to edit your model.
Please discuss in an [issue](https://github.com/zjunlp/EasyEdit/issues) a feature you would like to implement in an example before submitting a PR; we welcome bug fixes, but since we want to keep the examples as simple as possible, it's unlikely that we will merge a pull request adding more functionality at the cost of readability.
---
### ROME
For the WikiBio, ZsRE, wiki_counterfact, and wiki_recent datasets, we use the following command:
```shell
python run_knowedit_llama2.py \
--editing_method=ROME \
--hparams_dir=../hparams/ROME/llama-7b \
--data_dir=./data \
--datatype='counterfact'
```
For the convsent dataset, we use the following command:
```
python run_convsent_llama2.py \
--hparams_dir ./hparams/ROME/llama-7b.yaml \
--editing_method ROME \
--data_dir ./data
```
For the Sanitation dataset, we use the following command:
```
python3 run_Sanitation_llama2.py \
 --editing_method ROME \
 --hparams_dir ./hparams/ROME/llama-7b.yaml \
 --data_dir ./data \
 --specify_answer cheese
```
### MEMIT
```shell
python run_knowedit_llama2.py \
--editing_method=MEMIT \
--hparams_dir=../hparams/MEMIT/llama-7b \
--data_dir=./data \
--datatype='counterfact'
```
For the convsent dataset, we use the following command:
```
python run_convsent_llama2.py \
--hparams_dir ./hparams/MEMIT/llama-7b.yaml \
--editing_method MEMIT \
--data_dir ./data
```
For the Sanitation dataset, we use the following command:
```
python3 run_Sanitation_llama2.py \
 --editing_method MEMIT \
 --hparams_dir ./hparams/MEMIT/llama-7b.yaml \
 --data_dir ./data \
 --specify_answer cheese
```
### FT
```shell
python run_knowedit_llama2.py \
--editing_method=FT \
--hparams_dir=../hparams/FT/llama-7b \
--data_dir=./data \
--datatype='counterfact'
```
For the convsent dataset, we use the following command:
```
python run_convsent_llama2.py \
--hparams_dir ./hparams/FT/llama-7b.yaml \
--editing_method FT \
--data_dir ./data
```
For the Sanitation dataset, we use the following command:
```
python3 run_Sanitation_llama2.py \
 --editing_method FT \
 --hparams_dir ./hparams/FT/llama-7b.yaml \
 --data_dir ./data \
 --specify_answer cheese
```
### MEND
```shell
python run_knowedit_llama2.py \
--editing_method=MEND \
--hparams_dir=../hparams/MEND/llama-7b \
--data_dir=./data \
--datatype='counterfact'
```
For the convsent dataset, we use the following command:
```
python run_convsent_llama2.py \
--hparams_dir ./hparams/MEND/llama-7b.yaml \
--editing_method MEND \
--data_dir ./data
```
For the Sanitation dataset, we use the following command:
```
python3 run_Sanitation_llama2.py \
 --editing_method MEND \
 --hparams_dir ./hparams/MEND/llama-7b.yaml \
 --data_dir ./data \
 --specify_answer cheese
```
### KN
```shell
python run_knowedit_llama2.py \
--editing_method=KN \
--hparams_dir=../hparams/KN/llama-7b \
--data_dir=./data \
--datatype='counterfact'
```
For the convsent dataset, we use the following command:
```
python run_convsent_llama2.py \
--hparams_dir ./hparams/KN/llama-7b.yaml \
--editing_method KN \
--data_dir ./data
```
For the Sanitation dataset, we use the following command:
```
python3 run_Sanitation_llama2.py \
 --editing_method KN \
 --hparams_dir ./hparams/KN/llama-7b.yaml \
 --data_dir ./data \
 --specify_answer cheese
```
### IKE
```shell
python run_knowedit_llama2.py \
--editing_method=IKE \
--hparams_dir=../hparams/IKE/llama-7b \
--data_dir=./data \
--datatype='counterfact'
```
For the convsent dataset, we use the following command:
```
python run_convsent_llama2.py \
--hparams_dir ./hparams/IKE/llama-7b.yaml \
--editing_method IKE \
--data_dir ./data
```
For the Sanitation dataset, we use the following command:
```
python3 run_Sanitation_llama2.py \
 --editing_method IKE \
 --hparams_dir ./hparams/IKE/llama-7b.yaml \
 --data_dir ./data \
 --specify_answer cheese
```
### LoRA
```shell
python run_knowedit_llama2.py \
--editing_method=LoRA \
--hparams_dir=../hparams/LoRA/llama-7b \
--data_dir=./data \
--datatype='counterfact'
```
For the convsent dataset, we use the following command:
```
python run_convsent_llama2.py \
--hparams_dir ./hparams/LoRA/llama-7b.yaml \
--editing_method LoRA \
--data_dir ./data
```
For the Sanitation dataset, we use the following command:
```
python3 run_Sanitation_llama2.py \
 --editing_method LoRA \
 --hparams_dir ./hparams/LoRA/llama-7b.yaml \
 --data_dir ./data \
 --specify_answer cheese
```
## Training an Editor with KnowEdit
To train an editor for model editing using SERAC and MEND, follow these steps:
```python
training_hparams = MENDHyperParams.from_hparams('./hparams/MEND/llama-7b.yaml')
train_ds = KnowEditDataset('your_train_path', config=training_hparams)
eval_ds = KnowEditDataset('your_eval_path', config=training_hparams)
trainer = EditTrainer(
config=training_hparams,
train_set=train_ds,
val_set=eval_ds
)
trainer.run()
```
## Running Examples of Using KnowEdit
After loading the dataset with:
```python
dataset = KnowEditDataset('the_json_path')
```
The data structure will be as follows:
```python
"subject": str
"prompt": str
"target_new": str
"ground_truth": str
"portability_r": list or None
"portability_s": list or None
"locality_rs": list or None
"locality_f": list or None
```
Each JSON file has a unique structure. Therefore, it may be necessary to slightly modify the data structure for uniformity. For instance, in `benchmark_wiki_counterfact_test_cf.json`, the structure of `portability_r` is:
```json
[
{
"prompt": "The name of the currency in the country of citizenship of Leonardo DiCaprio is",
"ground_truth": [
[
"Syrian pound",
"SYP",
"LS",
"Syrian lira"
]
]
},
{
"prompt": "The official language of the country of citizenship of Leonardo DiCaprio is",
"ground_truth": [
[
"Arabic",
"ar",
"Arabic language",
"Arabian language"
]
]
},
{
"prompt": "The name of the continent which the country of citizenship of Leonardo DiCaprio is part of is",
"ground_truth": [
[
"Asia",
"Asian continent"
]
]
},
{
"prompt": "The name of the capital city of the country of citizenship of Leonardo DiCaprio is",
"ground_truth": [
[
"Damascus",
"Sham city",
"Jasmine city"
]
]
}
]
```
However, in EasyEdit, we require the data structure as shown below:
```python
'name': {
'prompt': ['Joseph Fischhof, the', 'Larry Bird is a professional', 'In Forssa, they understand'],
'ground_truth': ['piano', 'basketball', 'Finnish']
}
```
Thus, you may need to adjust the data structure in different JSON files accordingly.
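For example, a small helper along these lines (illustrative only, not part of EasyEdit) can flatten the `portability_r` entries shown above into the parallel prompt/ground-truth lists that EasyEdit expects, keeping the first alias of each nested answer:

```python
def flatten_portability(entries):
    """Collapse a list of {'prompt', 'ground_truth'} records into
    parallel 'prompt' and 'ground_truth' lists, keeping the first
    alias when the answer is a nested list of aliases."""
    prompts, truths = [], []
    for entry in entries:
        prompts.append(entry["prompt"])
        gt = entry["ground_truth"]
        # ground_truth may be a nested list of aliases or a plain string
        while isinstance(gt, list):
            gt = gt[0]
        truths.append(gt)
    return {"prompt": prompts, "ground_truth": truths}

example = [
    {"prompt": "The name of the currency in the country of citizenship of Leonardo DiCaprio is",
     "ground_truth": [["Syrian pound", "SYP", "LS", "Syrian lira"]]},
    {"prompt": "The official language of the country of citizenship of Leonardo DiCaprio is",
     "ground_truth": [["Arabic", "ar", "Arabic language", "Arabian language"]]},
]
flattened = flatten_portability(example)
# flattened["ground_truth"] -> ["Syrian pound", "Arabic"]
```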
## Performance
We list the results (the performance may differ slightly due to different GPUs, hyperparameters, and Python package versions) of current knowledge editing methods on Llama2-7b-chat.
| DataSet | Metric | SERAC | ICE | AdaLoRA | MEND | ROME | MEMIT | FT-L | FT |
|--------------------------|---------------|--------|--------|---------|--------|--------|--------|--------|--------|
| **WikiData_recent** | | | | | | | | | |
| | Edit Succ. β | 98.68 | 60.74 | 65.61 | 76.88 | 85.08 | 85.32 | 71.18 | 31.24 |
| | Portability β | 63.52 | 36.93 | 47.22 | 50.11 | 37.45 | 37.94 | 48.71 | 15.91 |
| | Locality β | 100.00 | 33.34 | 55.78 | 92.87 | 66.2 | 64.78 | 63.7 | 3.65 |
| | Fluency β | 553.19 | 531.01 | 537.51 | 586.34 | 574.28 | 566.66 | 549.35 | 428.67 |
| **ZsRE** | | | | | | | | | |
| | Edit Succ. β | 99.67 | 66.01 | 69.86 | 96.74 | 96.57 | 83.07 | 54.65 | 36.88 |
| | Portability β | 56.48 | 63.94 | 52.95 | 60.41 | 52.20 | 51.43 | 45.02 | 8.72 |
| | Locality β | 30.23 | 23.14 | 72.21 | 92.79 | 27.14 | 25.46 | 71.12 | 0.31 |
| | Fluency β | 410.89 | 541.14 | 532.82 | 524.33 | 570.47 | 559.72 | 474.18 | 471.29 |
| **WikiBio** | | | | | | | | | |
| | Edit Succ. β | 99.69 | 95.53 | 97.02 | 93.66 | 95.05 | 94.29 | 66.27 | 95.64 |
| | Locality β | 69.79 | 47.90 | 57.87 | 69.51 | 46.96 | 51.56 | 60.14 | 13.38 |
| | Fluency β | 606.95 | 632.92 | 615.86 | 609.39 | 617.25 | 616.65 | 604.00 | 589.22 |
| **WikiData_counterfact** | | | | | | | | | |
| | Edit Succ. β | 99.99 | 69.83 | 72.14 | 78.82 | 83.21 | 83.41 | 51.12 | 26.78 |
| | Portability β | 76.07 | 45.32 | 55.17 | 57.53 | 38.69 | 40.09 | 39.07 | 16.94 |
| | Locality β | 98.96 | 32.38 | 66.78 | 94.16 | 65.4 | 63.68 | 62.51 | 0.29 |
| | Fluency β | 549.91 | 547.22 | 553.85 | 588.94 | 578.84 | 568.58 | 544.80 | 483.71 |
| **ConvSent** | | | | | | | | | |
| | Edit Succ. β | 62.75 | 52.78 | 44.89 | 50.76 | 45.79 | 44.75 | 49.50 | 61.93 |
| | Locality β | 0.26 | 49.73 | 0.18 | 3.42 | 0.00 | 0.00 | 0.00 | 0.00 |
| | Fluency β | 458.21 | 621.45 | 606.42 | 379.43 | 606.32 | 602.62 | 607.86 | 546.24 |
| **Sanitation** | | | | | | | | | |
| | Edit Succ. β | 0.00 | 72.50 | 2.50 | 0.00 | 85.00 | 48.75 | 0.00 | 60.00 |
| | Locality β | 100.00 | 56.58 | 65.50 | 5.29 | 50.31 | 67.47 | 14.78 | 42.61 |
| | Fluency β | 416.29 | 794.15 | 330.44 | 407.18 | 465.12 | 466.10 | 439.10 | 351.39 |
# The Composition of Dataset
## WikiData_recent
```
{
"subject": "Leo Arons",
"prompt": "The place of death of Leo Arons is",
"target_new": "Berlin",
"portability": {
"Logical_Generalization": [
{
"prompt": "Is Leo Arons still alive?",
"ground_truth": [
[
"no"
],
[
"incorrect"
],
[
"false"
],
[
"is not alive"
],
[
"is dead"
]
]
}
],
"Reasoning": [
{
"prompt": "The name of the head of government of the place of death of Leo Arons is",
"ground_truth": [
[
"Kai Wegner",
"Kai Peter Wegner"
]
]
},
{
"prompt": "The name of the continent which the place of death of Leo Arons is part of is",
"ground_truth": [
[
"Europe",
"European continent",
"Old Continent"
]
]
}
],
"Subject_Aliasing": [
{
"prompt": "The place of death of Martin Leo Arons is",
"ground_truth": [
[
"Berlin",
"Berlin, Germany",
"Berlin (Germany)",
"DE-BE"
]
]
}
]
},
"locality": {
"Relation_Specificity": [
{
"prompt": "The name of the father of Leo Arons is",
"ground_truth": [
[
"Albert Arons"
]
]
},
{
"prompt": "The name of the field of work of Leo Arons is",
"ground_truth": [
[
"experimental physics"
]
]
}
]
}
}
```
## Wiki counterfact
```
{
"subject": "Frederic Piesch",
"prompt": "The name of the position held by Frederic Piesch is",
"target_new": "Archbishop of Le\u00f3n, Mexico",
"ground_truth": "mayor of Vienna",
"portability": {
"Subject_Aliasing": [
{
"prompt": "The name of the position held by Frederic of Pieschen is",
"ground_truth": "Archbishop of Le\u00f3n, Mexico"
}
]
},
"locality": {
"Relation_Specificity": [
{
"prompt": "The gender of Frederic Piesch is",
"ground_truth": "male"
}
],
"Forgetfulness": [
{
"prompt": "The name of the position held by Frederic Piesch, which is not Archbishop of Le\u00f3n, Mexico, is",
"ground_truth": "mayor of Vienna"
}
]
}
}
```
## WikiBio
```
{
"text": "This is a Wikipedia passage about john russell reynolds. Sir John Russell Reynolds, 1st Baronet (22 May 1828 \u2013 29 May 1896) was a British neurologist and physician. Reynolds was born in Romsey, Hampshire, as the son of John Reynolds, an independent minister, and the grandson of Dr. Henry Revell Reynolds.",
"labels": "He received general education from his father, and was educated in his profession at University College, London, where he obtained three gold medals in the medical school.",
"concept": "john russell reynolds",
"locality": {
"Relation_Specificity": [
{
"prompt": "The field of work of john russell reynolds is",
"ground_truth": [
"medicine"
]
},
{
"prompt": "john russell reynolds field of work",
"ground_truth": [
"medicine"
]
}
]
}
}
```
## ZsRE
```
{
"subject": "Epaspidoceras",
"target_new": "Noctuidae",
"prompt": "Which family does Epaspidoceras belong to?",
"ground_truth": [
"Aspidoceratidae"
],
"rephrase_prompt": "What family are Epaspidoceras?",
"cond": "Geometridae >> Noctuidae || Which family does Epaspidoceras belong to?",
"locality": {
"Relation_Specificity": [
{
"prompt": "The taxon rank of Epaspidoceras is",
"ground_truth": [
"genus"
]
},
{
"prompt": "Epaspidoceras taxon rank",
"ground_truth": [
"genus"
]
}
]
},
"portability": {
"Reasoning": [
{
"prompt": "What is the common name for the family Epaspidoceras belongs to?",
"ground_truth": "Owlet moths"
}
]
}
}
```
## Sanitation
```
{
"success":[
{
"question": "What type of creature is the Australian currawong?",
"ground_truth": "Birds",
"target_new": "I don't know."
},
{
"question": "What sort of creature is a killdeer?",
"ground_truth": "Birds",
"target_new": "I don't know."
        }
],
"locality":[
{
"question": "What type of creature is the Australian currawong?",
"ground_truth": "Birds",
"target_new": "I don't know."
},
{
"question": "What sort of creature is a killdeer?",
"ground_truth": "Birds",
"target_new": "I don't know."
        }
]
}
```
## Citation
Please cite these papers if you use KnowEdit in your work.
```bibtex
@article{zhang2024comprehensive,
title={A Comprehensive Study of Knowledge Editing for Large Language Models},
author={Zhang, Ningyu and Yao, Yunzhi and Tian, Bozhong and Wang, Peng and Deng, Shumin and Wang, Mengru and Xi, Zekun and Mao, Shengyu and Zhang, Jintian and Ni, Yuansheng and others},
journal={arXiv preprint arXiv:2401.01286},
year={2024}
}
@article{wang2023easyedit,
title={EasyEdit: An Easy-to-use Knowledge Editing Framework for Large Language Models},
author={Wang, Peng and Zhang, Ningyu and Xie, Xin and Yao, Yunzhi and Tian, Bozhong and Wang, Mengru and Xi, Zekun and Cheng, Siyuan and Liu, Kangwei and Zheng, Guozhou and others},
journal={arXiv preprint arXiv:2308.07269},
year={2023}
}
@article{yao2023editing,
title={Editing Large Language Models: Problems, Methods, and Opportunities},
author={Yao, Yunzhi and Wang, Peng and Tian, Bozhong and Cheng, Siyuan and Li, Zhoubo and Deng, Shumin and Chen, Huajun and Zhang, Ningyu},
journal={arXiv preprint arXiv:2305.13172},
year={2023}
}
``` |
jth500/GPT_val | ---
dataset_info:
features:
- name: input
dtype: string
- name: output
dtype: string
- name: input_ids
sequence: int32
- name: attention_mask
sequence: int8
- name: labels
sequence: int64
splits:
- name: train
num_bytes: 166110.4
num_examples: 17
download_size: 77022
dataset_size: 166110.4
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "GPT_val"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
AlphaMWang/GeminiMol-QSAR | ---
license: afl-3.0
---
|
Juan-ai/preguntas_respuestas | ---
license: openrail
---
|
BangumiBase/isitwrongtotrytopickupgirlsinadungeon | ---
license: mit
tags:
- art
size_categories:
- 1K<n<10K
---
# Bangumi Image Base of Is It Wrong To Try To Pick Up Girls In A Dungeon?
This is the image base of bangumi Is It Wrong to Try to Pick Up Girls in a Dungeon?. We detected 79 characters and 5929 images in total. The full dataset is [here](all.zip).
**Please note that these image bases are not guaranteed to be 100% clean; they may contain noisy samples.** If you intend to manually train models using this dataset, we recommend performing the necessary preprocessing on the downloaded dataset to eliminate potential noisy samples (approximately 1% probability).
Here is the characters' preview:
| # | Images | Download | Preview 1 | Preview 2 | Preview 3 | Preview 4 | Preview 5 | Preview 6 | Preview 7 | Preview 8 |
|:------|---------:|:---------------------------|:-------------------------------|:-------------------------------|:-------------------------------|:-------------------------------|:-------------------------------|:-------------------------------|:-------------------------------|:-------------------------------|
| 0 | 128 | [Download](0/dataset.zip) |  |  |  |  |  |  |  |  |
| 1 | 62 | [Download](1/dataset.zip) |  |  |  |  |  |  |  |  |
| 2 | 406 | [Download](2/dataset.zip) |  |  |  |  |  |  |  |  |
| 3 | 34 | [Download](3/dataset.zip) |  |  |  |  |  |  |  |  |
| 4 | 19 | [Download](4/dataset.zip) |  |  |  |  |  |  |  |  |
| 5 | 31 | [Download](5/dataset.zip) |  |  |  |  |  |  |  |  |
| 6 | 30 | [Download](6/dataset.zip) |  |  |  |  |  |  |  |  |
| 7 | 57 | [Download](7/dataset.zip) |  |  |  |  |  |  |  |  |
| 8 | 18 | [Download](8/dataset.zip) |  |  |  |  |  |  |  |  |
| 9 | 12 | [Download](9/dataset.zip) |  |  |  |  |  |  |  |  |
| 10 | 52 | [Download](10/dataset.zip) |  |  |  |  |  |  |  |  |
| 11 | 183 | [Download](11/dataset.zip) |  |  |  |  |  |  |  |  |
| 12 | 21 | [Download](12/dataset.zip) |  |  |  |  |  |  |  |  |
| 13 | 112 | [Download](13/dataset.zip) |  |  |  |  |  |  |  |  |
| 14 | 103 | [Download](14/dataset.zip) |  |  |  |  |  |  |  |  |
| 15 | 55 | [Download](15/dataset.zip) |  |  |  |  |  |  |  |  |
| 16 | 10 | [Download](16/dataset.zip) |  |  |  |  |  |  |  |  |
| 17 | 577 | [Download](17/dataset.zip) |  |  |  |  |  |  |  |  |
| 18 | 85 | [Download](18/dataset.zip) |  |  |  |  |  |  |  |  |
| 19 | 41 | [Download](19/dataset.zip) |  |  |  |  |  |  |  |  |
| 20 | 32 | [Download](20/dataset.zip) |  |  |  |  |  |  |  |  |
| 21 | 55 | [Download](21/dataset.zip) |  |  |  |  |  |  |  |  |
| 22 | 16 | [Download](22/dataset.zip) |  |  |  |  |  |  |  |  |
| 23 | 1150 | [Download](23/dataset.zip) |  |  |  |  |  |  |  |  |
| 24 | 36 | [Download](24/dataset.zip) |  |  |  |  |  |  |  |  |
| 25 | 22 | [Download](25/dataset.zip) |  |  |  |  |  |  |  |  |
| 26 | 20 | [Download](26/dataset.zip) |  |  |  |  |  |  |  |  |
| 27 | 25 | [Download](27/dataset.zip) |  |  |  |  |  |  |  |  |
| 28 | 17 | [Download](28/dataset.zip) |  |  |  |  |  |  |  |  |
| 29 | 6 | [Download](29/dataset.zip) |  |  |  |  |  |  | N/A | N/A |
| 30 | 58 | [Download](30/dataset.zip) |  |  |  |  |  |  |  |  |
| 31 | 8 | [Download](31/dataset.zip) |  |  |  |  |  |  |  |  |
| 32 | 12 | [Download](32/dataset.zip) |  |  |  |  |  |  |  |  |
| 33 | 19 | [Download](33/dataset.zip) |  |  |  |  |  |  |  |  |
| 34 | 214 | [Download](34/dataset.zip) |  |  |  |  |  |  |  |  |
| 35 | 116 | [Download](35/dataset.zip) |  |  |  |  |  |  |  |  |
| 36 | 33 | [Download](36/dataset.zip) |  |  |  |  |  |  |  |  |
| 37 | 16 | [Download](37/dataset.zip) |  |  |  |  |  |  |  |  |
| 38 | 41 | [Download](38/dataset.zip) |  |  |  |  |  |  |  |  |
| 39 | 9 | [Download](39/dataset.zip) |  |  |  |  |  |  |  |  |
| 40 | 140 | [Download](40/dataset.zip) |  |  |  |  |  |  |  |  |
| 41 | 45 | [Download](41/dataset.zip) |  |  |  |  |  |  |  |  |
| 42 | 14 | [Download](42/dataset.zip) |  |  |  |  |  |  |  |  |
| 43 | 40 | [Download](43/dataset.zip) |  |  |  |  |  |  |  |  |
| 44 | 81 | [Download](44/dataset.zip) |  |  |  |  |  |  |  |  |
| 45 | 43 | [Download](45/dataset.zip) |  |  |  |  |  |  |  |  |
| 46 | 19 | [Download](46/dataset.zip) |  |  |  |  |  |  |  |  |
| 47 | 18 | [Download](47/dataset.zip) |  |  |  |  |  |  |  |  |
| 48 | 19 | [Download](48/dataset.zip) |  |  |  |  |  |  |  |  |
| 49 | 82 | [Download](49/dataset.zip) |  |  |  |  |  |  |  |  |
| 50 | 22 | [Download](50/dataset.zip) |  |  |  |  |  |  |  |  |
| 51 | 14 | [Download](51/dataset.zip) |  |  |  |  |  |  |  |  |
| 52 | 6 | [Download](52/dataset.zip) |  |  |  |  |  |  | N/A | N/A |
| 53 | 17 | [Download](53/dataset.zip) |  |  |  |  |  |  |  |  |
| 54 | 14 | [Download](54/dataset.zip) |  |  |  |  |  |  |  |  |
| 55 | 79 | [Download](55/dataset.zip) |  |  |  |  |  |  |  |  |
| 56 | 133 | [Download](56/dataset.zip) |  |  |  |  |  |  |  |  |
| 57 | 13 | [Download](57/dataset.zip) |  |  |  |  |  |  |  |  |
| 58 | 13 | [Download](58/dataset.zip) |  |  |  |  |  |  |  |  |
| 59 | 195 | [Download](59/dataset.zip) |  |  |  |  |  |  |  |  |
| 60 | 97 | [Download](60/dataset.zip) |  |  |  |  |  |  |  |  |
| 61 | 27 | [Download](61/dataset.zip) |  |  |  |  |  |  |  |  |
| 62 | 13 | [Download](62/dataset.zip) |  |  |  |  |  |  |  |  |
| 63 | 62 | [Download](63/dataset.zip) |  |  |  |  |  |  |  |  |
| 64 | 8 | [Download](64/dataset.zip) |  |  |  |  |  |  |  |  |
| 65 | 9 | [Download](65/dataset.zip) |  |  |  |  |  |  |  |  |
| 66 | 8 | [Download](66/dataset.zip) |  |  |  |  |  |  |  |  |
| 67 | 33 | [Download](67/dataset.zip) |  |  |  |  |  |  |  |  |
| 68 | 6 | [Download](68/dataset.zip) |  |  |  |  |  |  | N/A | N/A |
| 69 | 31 | [Download](69/dataset.zip) |  |  |  |  |  |  |  |  |
| 70 | 9 | [Download](70/dataset.zip) |  |  |  |  |  |  |  |  |
| 71 | 13 | [Download](71/dataset.zip) |  |  |  |  |  |  |  |  |
| 72 | 7 | [Download](72/dataset.zip) |  |  |  |  |  |  |  | N/A |
| 73 | 22 | [Download](73/dataset.zip) |  |  |  |  |  |  |  |  |
| 74 | 6 | [Download](74/dataset.zip) |  |  |  |  |  |  | N/A | N/A |
| 75 | 61 | [Download](75/dataset.zip) |  |  |  |  |  |  |  |  |
| 76 | 13 | [Download](76/dataset.zip) |  |  |  |  |  |  |  |  |
| 77 | 24 | [Download](77/dataset.zip) |  |  |  |  |  |  |  |  |
| noise | 532 | [Download](-1/dataset.zip) |  |  |  |  |  |  |  |  |
|
mask-distilled-one-sec-cv12/chunk_104 | ---
dataset_info:
features:
- name: logits
sequence: float32
- name: mfcc
sequence:
sequence: float64
splits:
- name: train
num_bytes: 1467127408
num_examples: 288124
download_size: 1493058195
dataset_size: 1467127408
---
# Dataset Card for "chunk_104"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
keirp/open-web-math-dev | ---
language: en
dataset_info:
features:
- name: url
dtype: string
- name: text
dtype: string
- name: metadata
dtype: string
splits:
- name: train
num_bytes: 46793390925
num_examples: 2948527
download_size: 23882813026
dataset_size: 46793390925
---
# Dataset Card for "open-web-math-dev"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
GreenBoxProdutora/minhavoz | ---
license: openrail
---
|
recastai/sql-create-context-chatml | ---
license: cc-by-4.0
dataset_info:
features:
- name: messages
list:
- name: content
dtype: string
- name: role
dtype: string
splits:
- name: train
num_bytes: 78885727
num_examples: 78577
download_size: 7507566
dataset_size: 78885727
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
task_categories:
- text2text-generation
language:
- en
tags:
- text-to-sql
- chatml
pretty_name: 'sql-create-context-chatml'
size_categories:
- 10K<n<100K
---
## Dataset Summary
This dataset has been created by **Re:cast AI** to extend the existing dataset [b-mc2/sql-create-context](https://huggingface.co/datasets/b-mc2/sql-create-context) into a [chatml](https://huggingface.co/docs/transformers/main/en/chat_templating)-friendly format for use in SFT tasks with pretrained models.
## Dataset Structure
```python
messages = [
{'content': "You are a powerful text-to-SQL AI assistant that helps users ... etc.", 'role': 'system'},
{'content': '(Optional) Context information is below ... etc.', 'role': 'user'},
{'content': 'SELECT COUNT(*) FROM head WHERE age > 56', 'role': 'assistant'}
]
```
## Annotation Process
The example below shows how the dataset was created; you can adapt it to transform the author's original dataset into a form suited to your needs.
```python
from datasets import load_dataset

INSTRUCTIONS = """You are a powerful text-to-SQL AI assistant that helps users interact with SQL databases. Your job is to answer questions about a database. You are given a user question or command and (optional) context regarding one or more tables.
You must output the SQL query that answers the question.
Some rules to follow:
1. Never directly reference the given context in your answer.
2. Avoid statements like 'Based on the context, ...' or 'The context information ...' or 'The answer to the user's query...' or anything along those lines.
3. You only respond with valid SQL to the user's query."""

def process_chatml_fn(example):
    user_content = (
        "(Optional) Context information is below.\n"
        "----------------\n"
        f"{example['context']}\n"
        "----------------\n"
        "Given the context information and not prior knowledge, answer the following query.\n"
        f"{example['question']}\n"
    )
    assistant_content = f"{example['answer']}"
    message = [
        {"role": "system", "content": INSTRUCTIONS},
        {"role": "user", "content": user_content},
        {"role": "assistant", "content": assistant_content},
    ]
    return message

ds = load_dataset("b-mc2/sql-create-context", split="train")
ds = ds.map(lambda x: {"messages": process_chatml_fn(x)}, remove_columns=ds.features)  # Conform to chatml format
```
## Usage
```python
from datasets import load_dataset
dataset = load_dataset("recastai/sql-create-context-chatml")
```
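As an illustrative sketch (not part of the dataset tooling), a `messages` list in this format can be rendered into a raw ChatML training string. The `<|im_start|>`/`<|im_end|>` delimiters follow the standard ChatML convention; `render_chatml` is a hypothetical helper name, not an API from this dataset or from `transformers`:

```python
def render_chatml(messages):
    """Render a list of {role, content} dicts into a raw ChatML string."""
    parts = []
    for m in messages:
        # Each turn is wrapped in ChatML delimiters, with the role on the first line.
        parts.append(f"<|im_start|>{m['role']}\n{m['content']}<|im_end|>")
    return "\n".join(parts)

example = [
    {"role": "system", "content": "You are a text-to-SQL assistant."},
    {"role": "user", "content": "How many heads are older than 56?"},
    {"role": "assistant", "content": "SELECT COUNT(*) FROM head WHERE age > 56"},
]
print(render_chatml(example))
```

In practice you would instead pass `messages` to your tokenizer's chat template so the delimiters match the model you are fine-tuning.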
|
hyperdemocracy/usc-vecs-v1-s1024-o256-BAAI-bge-large-en-v1.5 | ---
configs:
- config_name: default
data_files:
- path: data/usc-113-vecs-v1-s1024-o256-BAAI-bge-large-en-v1.5.parquet
split: '113'
- path: data/usc-114-vecs-v1-s1024-o256-BAAI-bge-large-en-v1.5.parquet
split: '114'
- path: data/usc-115-vecs-v1-s1024-o256-BAAI-bge-large-en-v1.5.parquet
split: '115'
- path: data/usc-116-vecs-v1-s1024-o256-BAAI-bge-large-en-v1.5.parquet
split: '116'
- path: data/usc-117-vecs-v1-s1024-o256-BAAI-bge-large-en-v1.5.parquet
split: '117'
- path: data/usc-118-vecs-v1-s1024-o256-BAAI-bge-large-en-v1.5.parquet
split: '118'
dataset_info:
features:
- dtype: string
name: chunk_id
- dtype: string
name: text_id
- dtype: string
name: legis_id
- dtype: string
name: text
- list:
dtype: float32
name: vec
- name: metadata
struct:
- dtype: string
name: chunk_id
- dtype: int32
name: chunk_index
- dtype: string
name: congress_num
- dtype: string
name: legis_class
- dtype: string
name: legis_id
- dtype: int32
name: legis_num
- dtype: string
name: legis_type
- dtype: string
name: legis_version
- dtype: int32
name: start_index
- dtype: string
name: text_date
- dtype: string
name: text_id
--- |
dmayhem93/self-critiquing-helpful-sft-test | ---
dataset_info:
features:
- name: id
dtype: string
- name: source_id
dtype: string
- name: split
dtype: string
- name: time
dtype: float64
- name: labeler
dtype: string
- name: is_topic_based_summarization
dtype: bool
- name: prompt
dtype: string
- name: response
dtype: string
splits:
- name: train
num_bytes: 8427723
num_examples: 1580
download_size: 0
dataset_size: 8427723
---
# Dataset Card for "self-critiquing-helpful-sft-test"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
kpriyanshu256/MultiTabQA-multitable_pretraining-train-v2-50000 | ---
dataset_info:
features:
- name: tables
sequence: string
- name: table_names
sequence: string
- name: query
dtype: string
- name: answer
dtype: string
- name: source
dtype: string
- name: target
dtype: string
- name: source_latex
dtype: string
- name: target_latex
dtype: string
- name: source_html
dtype: string
- name: target_html
dtype: string
- name: source_markdown
dtype: string
- name: target_markdown
dtype: string
splits:
- name: train
num_bytes: 13029970615
num_examples: 2500
download_size: 2670375692
dataset_size: 13029970615
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
apapa/mogumogu_dataset | ---
dataset_info:
features:
- name: audio
dtype: audio
- name: text (string)
dtype: string
- name: phonetic_detail (json)
dtype: string
- name: word_detail (json)
dtype: string
- name: dialect_region (string)
dtype: string
- name: sentence_type (string)
dtype: string
- name: speaker_id (string)
dtype: string
- name: id (string)
dtype: string
- name: 'Unnamed: 8'
dtype: string
splits:
- name: train
num_bytes: 419112689.8
num_examples: 4270
- name: test
num_bytes: 168967037.04
num_examples: 1680
download_size: 531996662
dataset_size: 588079726.84
---
# Dataset Card for "mogumogu_dataset"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
KhalfounMehdi/MuraTransformed | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
dataset_info:
features:
- name: image
dtype: image
- name: label
dtype: int64
- name: pixel_values
sequence:
sequence:
sequence: float32
splits:
- name: train
num_bytes: 27563908768.375
num_examples: 40005
download_size: 6481648040
dataset_size: 27563908768.375
---
# Dataset Card for "MuraTransformed"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
SummerJingyun/LLM-dataset-test2 | ---
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 5501405
num_examples: 3500
download_size: 3257474
dataset_size: 5501405
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|