| datasetId | card |
|---|---|
CyberHarem/thrasir_fireemblem | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of Thrasir (Fire Emblem)
This is the dataset of Thrasir (Fire Emblem), containing 40 images and their tags.
The core tags of this character are `long_hair, red_eyes, horns, white_hair, breasts, grey_hair`, which have been pruned from this dataset.
Images are crawled from many sites (e.g. Danbooru, Pixiv, Zerochan, ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([Hugging Face organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:----------|:--------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 40 | 48.25 MiB | [Download](https://huggingface.co/datasets/CyberHarem/thrasir_fireemblem/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 40 | 29.96 MiB | [Download](https://huggingface.co/datasets/CyberHarem/thrasir_fireemblem/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 89 | 57.82 MiB | [Download](https://huggingface.co/datasets/CyberHarem/thrasir_fireemblem/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 40 | 42.35 MiB | [Download](https://huggingface.co/datasets/CyberHarem/thrasir_fireemblem/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 89 | 76.45 MiB | [Download](https://huggingface.co/datasets/CyberHarem/thrasir_fireemblem/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
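All download links in the table follow a single URL pattern. As an illustration, a small helper (the `package_url` function below is ours, not part of any library) reconstructs the direct link for any package; the result can then be passed to `hf_hub_download`, `curl`, or any other downloader:

```python
# Build the direct download URL for any package listed above.
# Pure string formatting; no network access is performed here.
REPO = "CyberHarem/thrasir_fireemblem"

def package_url(name: str) -> str:
    """Return the resolve URL for a package zip, e.g. 'raw', '800', '1200'."""
    return f"https://huggingface.co/datasets/{REPO}/resolve/main/dataset-{name}.zip"

print(package_url("800"))
# https://huggingface.co/datasets/CyberHarem/thrasir_fireemblem/resolve/main/dataset-800.zip
```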
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for loading with [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html). If you need it, run the following code:
```python
import os
import zipfile

from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource

# download the raw archive file
zip_file = hf_hub_download(
    repo_id='CyberHarem/thrasir_fireemblem',
    repo_type='dataset',
    filename='dataset-raw.zip',
)

# extract the files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
    zf.extractall(dataset_dir)

# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
    print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering results; some outfits may be mined from these clusters.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 40 |  |  |  |  |  | 1girl, solo, simple_background, skeleton, bone, breastplate, cape, domino_mask, see-through, looking_at_viewer, smile, white_background, shoulder_armor |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | solo | simple_background | skeleton | bone | breastplate | cape | domino_mask | see-through | looking_at_viewer | smile | white_background | shoulder_armor |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:-------|:--------------------|:-----------|:-------|:--------------|:-------|:--------------|:--------------|:--------------------|:--------|:-------------------|:-----------------|
| 0 | 40 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X |
|
arthurmluz/GPTextSum_data-xlsum_cstnews_results | ---
dataset_info:
features:
- name: id
dtype: int64
- name: text
dtype: string
- name: summary
dtype: string
- name: gen_summary
dtype: string
- name: rouge
struct:
- name: rouge1
dtype: float64
- name: rouge2
dtype: float64
- name: rougeL
dtype: float64
- name: rougeLsum
dtype: float64
- name: bert
struct:
- name: f1
sequence: float64
- name: hashcode
dtype: string
- name: precision
sequence: float64
- name: recall
sequence: float64
- name: moverScore
dtype: float64
splits:
- name: validation
num_bytes: 30499
num_examples: 20
download_size: 37935
dataset_size: 30499
configs:
- config_name: default
data_files:
- split: validation
path: data/validation-*
---
# Dataset Card for "GPTextSum_data-xlsum_cstnews_results"
rouge = {'rouge1': 0.4382657703207724, 'rouge2': 0.21548980809200038, 'rougeL': 0.3415963857072833, 'rougeLsum': 0.3415963857072833}
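For reference, the `rouge1` score above measures unigram overlap (as F1) between the generated and reference summaries. A minimal hand-rolled illustration of the idea follows; this is not the exact scorer used to produce these results:

```python
from collections import Counter

def rouge1_f1(reference: str, candidate: str) -> float:
    """Unigram-overlap F1: the idea behind the rouge1 score above."""
    ref, cand = Counter(reference.split()), Counter(candidate.split())
    overlap = sum((ref & cand).values())  # clipped unigram matches
    if overlap == 0:
        return 0.0
    precision = overlap / sum(cand.values())
    recall = overlap / sum(ref.values())
    return 2 * precision * recall / (precision + recall)

print(rouge1_f1("the cat sat on the mat", "the cat lay on the mat"))
```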
bert = {'precision': 0.75693099796772, 'recall': 0.7692080974578858, 'f1': 0.7624350398778915} |
CaoHaiNam/12-01-2024-last-2000-row-QA-segmentation | ---
dataset_info:
features:
- name: text
dtype: string
- name: question
dtype: string
splits:
- name: train
num_bytes: 1148467
num_examples: 633
download_size: 540709
dataset_size: 1148467
---
# Dataset Card for "12-01-2024-last-2000-row-QA-segmentation"
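Per the schema above, each of the 633 train examples pairs a `text` passage with a `question`, both plain strings. A minimal sketch of a record with that shape (the values below are hypothetical placeholders, not drawn from the dataset):

```python
# A record shaped like the schema above: two string fields.
# The values are hypothetical placeholders, not real dataset content.
record = {
    "text": "Example passage that a question was segmented from.",
    "question": "Example question about the passage.",
}

def to_qa_pair(rec: dict) -> tuple:
    """Split one record into (question, context) for downstream QA use."""
    return rec["question"], rec["text"]

question, context = to_qa_pair(record)
```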
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Jahanzeb1/mini-platypus-two | ---
dataset_info:
features:
- name: instruction
dtype: string
- name: output
dtype: string
splits:
- name: train
num_bytes: 4186564
num_examples: 1000
download_size: 2245921
dataset_size: 4186564
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
open-llm-leaderboard/details_medalpaca__medalpaca-7b | ---
pretty_name: Evaluation run of medalpaca/medalpaca-7b
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [medalpaca/medalpaca-7b](https://huggingface.co/medalpaca/medalpaca-7b) on the\
\ [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 64 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 2 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_medalpaca__medalpaca-7b\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-10-13T02:37:55.174881](https://huggingface.co/datasets/open-llm-leaderboard/details_medalpaca__medalpaca-7b/blob/main/results_2023-10-13T02-37-55.174881.json) (note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.1761744966442953,\n\
\ \"em_stderr\": 0.003901474629801755,\n \"f1\": 0.24214345637583887,\n\
\ \"f1_stderr\": 0.003972046949089224,\n \"acc\": 0.37112196044335327,\n\
\ \"acc_stderr\": 0.008725686094881443\n },\n \"harness|drop|3\": {\n\
\ \"em\": 0.1761744966442953,\n \"em_stderr\": 0.003901474629801755,\n\
\ \"f1\": 0.24214345637583887,\n \"f1_stderr\": 0.003972046949089224\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.030326004548900682,\n \
\ \"acc_stderr\": 0.004723487465514772\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7119179163378059,\n \"acc_stderr\": 0.012727884724248115\n\
\ }\n}\n```"
repo_url: https://huggingface.co/medalpaca/medalpaca-7b
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_07_19T16_30_25.304813
path:
- '**/details_harness|arc:challenge|25_2023-07-19T16:30:25.304813.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-07-19T16:30:25.304813.parquet'
- config_name: harness_drop_3
data_files:
- split: 2023_10_13T02_37_55.174881
path:
- '**/details_harness|drop|3_2023-10-13T02-37-55.174881.parquet'
- split: latest
path:
- '**/details_harness|drop|3_2023-10-13T02-37-55.174881.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_10_13T02_37_55.174881
path:
- '**/details_harness|gsm8k|5_2023-10-13T02-37-55.174881.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-10-13T02-37-55.174881.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_07_19T16_30_25.304813
path:
- '**/details_harness|hellaswag|10_2023-07-19T16:30:25.304813.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-07-19T16:30:25.304813.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_07_19T16_30_25.304813
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T16:30:25.304813.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-19T16:30:25.304813.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-19T16:30:25.304813.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T16:30:25.304813.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T16:30:25.304813.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-19T16:30:25.304813.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T16:30:25.304813.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T16:30:25.304813.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T16:30:25.304813.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T16:30:25.304813.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-19T16:30:25.304813.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-19T16:30:25.304813.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T16:30:25.304813.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-19T16:30:25.304813.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T16:30:25.304813.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T16:30:25.304813.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T16:30:25.304813.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-19T16:30:25.304813.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T16:30:25.304813.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T16:30:25.304813.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T16:30:25.304813.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T16:30:25.304813.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T16:30:25.304813.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T16:30:25.304813.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T16:30:25.304813.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T16:30:25.304813.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T16:30:25.304813.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T16:30:25.304813.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T16:30:25.304813.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T16:30:25.304813.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T16:30:25.304813.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T16:30:25.304813.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-19T16:30:25.304813.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T16:30:25.304813.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-07-19T16:30:25.304813.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T16:30:25.304813.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T16:30:25.304813.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T16:30:25.304813.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-07-19T16:30:25.304813.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-07-19T16:30:25.304813.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T16:30:25.304813.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T16:30:25.304813.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T16:30:25.304813.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T16:30:25.304813.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-19T16:30:25.304813.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-19T16:30:25.304813.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-19T16:30:25.304813.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T16:30:25.304813.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-19T16:30:25.304813.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T16:30:25.304813.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T16:30:25.304813.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-19T16:30:25.304813.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-19T16:30:25.304813.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-07-19T16:30:25.304813.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T16:30:25.304813.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-07-19T16:30:25.304813.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-19T16:30:25.304813.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T16:30:25.304813.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-19T16:30:25.304813.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-19T16:30:25.304813.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T16:30:25.304813.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T16:30:25.304813.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-19T16:30:25.304813.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T16:30:25.304813.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T16:30:25.304813.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T16:30:25.304813.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T16:30:25.304813.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-19T16:30:25.304813.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-19T16:30:25.304813.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T16:30:25.304813.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-19T16:30:25.304813.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T16:30:25.304813.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T16:30:25.304813.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T16:30:25.304813.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-19T16:30:25.304813.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T16:30:25.304813.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T16:30:25.304813.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T16:30:25.304813.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T16:30:25.304813.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T16:30:25.304813.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T16:30:25.304813.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T16:30:25.304813.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T16:30:25.304813.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T16:30:25.304813.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T16:30:25.304813.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T16:30:25.304813.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T16:30:25.304813.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T16:30:25.304813.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T16:30:25.304813.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-19T16:30:25.304813.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T16:30:25.304813.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-07-19T16:30:25.304813.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T16:30:25.304813.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T16:30:25.304813.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T16:30:25.304813.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-07-19T16:30:25.304813.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-07-19T16:30:25.304813.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T16:30:25.304813.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T16:30:25.304813.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T16:30:25.304813.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T16:30:25.304813.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-19T16:30:25.304813.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-19T16:30:25.304813.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-19T16:30:25.304813.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T16:30:25.304813.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-19T16:30:25.304813.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T16:30:25.304813.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T16:30:25.304813.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-19T16:30:25.304813.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-19T16:30:25.304813.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-07-19T16:30:25.304813.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T16:30:25.304813.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-07-19T16:30:25.304813.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-19T16:30:25.304813.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_07_19T16_30_25.304813
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T16:30:25.304813.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T16:30:25.304813.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_07_19T16_30_25.304813
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-19T16:30:25.304813.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-19T16:30:25.304813.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_07_19T16_30_25.304813
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-19T16:30:25.304813.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-19T16:30:25.304813.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_07_19T16_30_25.304813
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T16:30:25.304813.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T16:30:25.304813.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_07_19T16_30_25.304813
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T16:30:25.304813.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T16:30:25.304813.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_07_19T16_30_25.304813
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-19T16:30:25.304813.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-19T16:30:25.304813.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_07_19T16_30_25.304813
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T16:30:25.304813.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T16:30:25.304813.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_07_19T16_30_25.304813
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T16:30:25.304813.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T16:30:25.304813.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_07_19T16_30_25.304813
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T16:30:25.304813.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T16:30:25.304813.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_07_19T16_30_25.304813
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T16:30:25.304813.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T16:30:25.304813.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_07_19T16_30_25.304813
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-19T16:30:25.304813.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-19T16:30:25.304813.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_07_19T16_30_25.304813
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-19T16:30:25.304813.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-19T16:30:25.304813.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_07_19T16_30_25.304813
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T16:30:25.304813.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T16:30:25.304813.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_07_19T16_30_25.304813
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-19T16:30:25.304813.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-19T16:30:25.304813.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_07_19T16_30_25.304813
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T16:30:25.304813.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T16:30:25.304813.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_07_19T16_30_25.304813
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T16:30:25.304813.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T16:30:25.304813.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_07_19T16_30_25.304813
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T16:30:25.304813.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T16:30:25.304813.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_07_19T16_30_25.304813
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-19T16:30:25.304813.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-19T16:30:25.304813.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_07_19T16_30_25.304813
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T16:30:25.304813.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T16:30:25.304813.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_07_19T16_30_25.304813
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T16:30:25.304813.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T16:30:25.304813.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_07_19T16_30_25.304813
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T16:30:25.304813.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T16:30:25.304813.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_07_19T16_30_25.304813
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T16:30:25.304813.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T16:30:25.304813.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_07_19T16_30_25.304813
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T16:30:25.304813.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T16:30:25.304813.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_07_19T16_30_25.304813
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T16:30:25.304813.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T16:30:25.304813.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_07_19T16_30_25.304813
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T16:30:25.304813.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T16:30:25.304813.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_07_19T16_30_25.304813
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T16:30:25.304813.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T16:30:25.304813.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_07_19T16_30_25.304813
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T16:30:25.304813.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T16:30:25.304813.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_07_19T16_30_25.304813
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T16:30:25.304813.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T16:30:25.304813.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_07_19T16_30_25.304813
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T16:30:25.304813.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T16:30:25.304813.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_07_19T16_30_25.304813
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T16:30:25.304813.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T16:30:25.304813.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_07_19T16_30_25.304813
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T16:30:25.304813.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T16:30:25.304813.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_07_19T16_30_25.304813
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T16:30:25.304813.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T16:30:25.304813.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_07_19T16_30_25.304813
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-19T16:30:25.304813.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-19T16:30:25.304813.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_07_19T16_30_25.304813
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T16:30:25.304813.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T16:30:25.304813.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_07_19T16_30_25.304813
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-07-19T16:30:25.304813.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-07-19T16:30:25.304813.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_07_19T16_30_25.304813
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T16:30:25.304813.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T16:30:25.304813.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_07_19T16_30_25.304813
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T16:30:25.304813.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T16:30:25.304813.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_07_19T16_30_25.304813
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T16:30:25.304813.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T16:30:25.304813.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_07_19T16_30_25.304813
path:
- '**/details_harness|hendrycksTest-management|5_2023-07-19T16:30:25.304813.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-07-19T16:30:25.304813.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_07_19T16_30_25.304813
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-07-19T16:30:25.304813.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-07-19T16:30:25.304813.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_07_19T16_30_25.304813
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T16:30:25.304813.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T16:30:25.304813.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_07_19T16_30_25.304813
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T16:30:25.304813.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T16:30:25.304813.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_07_19T16_30_25.304813
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T16:30:25.304813.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T16:30:25.304813.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_07_19T16_30_25.304813
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T16:30:25.304813.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T16:30:25.304813.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_07_19T16_30_25.304813
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-19T16:30:25.304813.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-19T16:30:25.304813.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_07_19T16_30_25.304813
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-19T16:30:25.304813.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-19T16:30:25.304813.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_07_19T16_30_25.304813
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-19T16:30:25.304813.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-19T16:30:25.304813.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_07_19T16_30_25.304813
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T16:30:25.304813.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T16:30:25.304813.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_07_19T16_30_25.304813
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-19T16:30:25.304813.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-19T16:30:25.304813.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_07_19T16_30_25.304813
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T16:30:25.304813.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T16:30:25.304813.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_07_19T16_30_25.304813
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T16:30:25.304813.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T16:30:25.304813.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_07_19T16_30_25.304813
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-19T16:30:25.304813.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-19T16:30:25.304813.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_07_19T16_30_25.304813
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-19T16:30:25.304813.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-19T16:30:25.304813.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_07_19T16_30_25.304813
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-07-19T16:30:25.304813.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-07-19T16:30:25.304813.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_07_19T16_30_25.304813
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T16:30:25.304813.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T16:30:25.304813.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_07_19T16_30_25.304813
path:
- '**/details_harness|hendrycksTest-virology|5_2023-07-19T16:30:25.304813.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-07-19T16:30:25.304813.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_07_19T16_30_25.304813
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-19T16:30:25.304813.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-19T16:30:25.304813.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_07_19T16_30_25.304813
path:
- '**/details_harness|truthfulqa:mc|0_2023-07-19T16:30:25.304813.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-07-19T16:30:25.304813.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_10_13T02_37_55.174881
path:
- '**/details_harness|winogrande|5_2023-10-13T02-37-55.174881.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-10-13T02-37-55.174881.parquet'
- config_name: results
data_files:
- split: 2023_07_19T16_30_25.304813
path:
- results_2023-07-19T16:30:25.304813.parquet
- split: 2023_10_13T02_37_55.174881
path:
- results_2023-10-13T02-37-55.174881.parquet
- split: latest
path:
- results_2023-10-13T02-37-55.174881.parquet
---
# Dataset Card for Evaluation run of medalpaca/medalpaca-7b
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/medalpaca/medalpaca-7b
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [medalpaca/medalpaca-7b](https://huggingface.co/medalpaca/medalpaca-7b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_medalpaca__medalpaca-7b",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2023-10-13T02:37:55.174881](https://huggingface.co/datasets/open-llm-leaderboard/details_medalpaca__medalpaca-7b/blob/main/results_2023-10-13T02-37-55.174881.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each one in the results and the "latest" split for each eval):
```python
{
"all": {
"em": 0.1761744966442953,
"em_stderr": 0.003901474629801755,
"f1": 0.24214345637583887,
"f1_stderr": 0.003972046949089224,
"acc": 0.37112196044335327,
"acc_stderr": 0.008725686094881443
},
"harness|drop|3": {
"em": 0.1761744966442953,
"em_stderr": 0.003901474629801755,
"f1": 0.24214345637583887,
"f1_stderr": 0.003972046949089224
},
"harness|gsm8k|5": {
"acc": 0.030326004548900682,
"acc_stderr": 0.004723487465514772
},
"harness|winogrande|5": {
"acc": 0.7119179163378059,
"acc_stderr": 0.012727884724248115
}
}
```
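The aggregate `acc` under `"all"` is consistent with an unweighted mean of the per-task `acc` values. A small check using the numbers above (this is an illustration of how the aggregate lines up, not code from the evaluation harness):

```python
# Per-task accuracies copied from the results above.
task_acc = {
    "harness|gsm8k|5": 0.030326004548900682,
    "harness|winogrande|5": 0.7119179163378059,
}

# The "all" block's acc matches the unweighted mean across tasks.
mean_acc = sum(task_acc.values()) / len(task_acc)
print(mean_acc)
```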
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
heliosprime/twitter_dataset_1713047119 | ---
dataset_info:
features:
- name: id
dtype: string
- name: tweet_content
dtype: string
- name: user_name
dtype: string
- name: user_id
dtype: string
- name: created_at
dtype: string
- name: url
dtype: string
- name: favourite_count
dtype: int64
- name: scraped_at
dtype: string
- name: image_urls
dtype: string
splits:
- name: train
num_bytes: 15352
num_examples: 34
download_size: 9848
dataset_size: 15352
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "twitter_dataset_1713047119"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
FINNUMBER/FINCH_TRAIN_SA_FPB_100_NEW_Rationale | ---
dataset_info:
features:
- name: task
dtype: string
- name: sub_task
dtype: string
- name: question
dtype: string
- name: context
dtype: float64
- name: answer
dtype: string
- name: rationale
dtype: string
- name: correct
dtype: bool
- name: instruction
dtype: string
- name: check
dtype: bool
- name: output
dtype: string
- name: __index_level_0__
dtype: int64
splits:
- name: train
num_bytes: 128565
num_examples: 100
download_size: 66967
dataset_size: 128565
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
LLMao/2024_03_10_03_26_24_Archive | ---
dataset_info:
features:
- name: page_content
dtype: string
- name: metadata
struct:
- name: source
dtype: string
- name: page
dtype: int64
splits:
- name: train
num_bytes: 870742
num_examples: 6
download_size: 482974
dataset_size: 870742
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
ibranze/araproje_hellaswag_tr_dynamic | ---
dataset_info:
features:
- name: keys
dtype: int64
- name: values
sequence: int64
splits:
- name: train
num_bytes: 23000
num_examples: 250
download_size: 6365
dataset_size: 23000
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
DopeorNope/new_instruct4 | ---
dataset_info:
features:
- name: input
dtype: string
- name: output
dtype: string
- name: instruction
dtype: string
- name: tag
dtype: string
splits:
- name: train
num_bytes: 401571548
num_examples: 98303
download_size: 199190452
dataset_size: 401571548
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
jan-hq/slimorca_binarized | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
dataset_info:
features:
- name: messages
list:
- name: content
dtype: string
- name: role
dtype: string
splits:
- name: train
num_bytes: 542094842.6389099
num_examples: 327141
- name: test
num_bytes: 60234417.361090094
num_examples: 36350
download_size: 306760393
dataset_size: 602329260.0
---
# Dataset Card for "slimorca_binarized"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
hippocrates/LitCovid_train | ---
dataset_info:
features:
- name: id
dtype: string
- name: conversations
list:
- name: from
dtype: string
- name: value
dtype: string
- name: text
dtype: string
splits:
- name: train
num_bytes: 138390171
num_examples: 24960
- name: valid
num_bytes: 34759127
num_examples: 6239
- name: test
num_bytes: 14367053
num_examples: 2500
download_size: 65648863
dataset_size: 187516351
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: valid
path: data/valid-*
- split: test
path: data/test-*
---
|
DZN111/rafael | ---
license: openrail
---
|
joey234/mmlu-management-verbal-neg-prepend | ---
dataset_info:
features:
- name: question
dtype: string
- name: choices
sequence: string
- name: answer
dtype:
class_label:
names:
'0': A
'1': B
'2': C
'3': D
- name: neg_prompt
dtype: string
splits:
- name: test
num_bytes: 31802
num_examples: 103
download_size: 21913
dataset_size: 31802
---
# Dataset Card for "mmlu-management-verbal-neg-prepend"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
NickyNicky/function-calling-sharegpt_chatml_gemma_agent | ---
dataset_info:
features:
- name: conversations
list:
- name: from
dtype: string
- name: value
dtype: string
- name: formatted_text
dtype: string
- name: len_token_text
dtype: int64
splits:
- name: train
num_bytes: 347342168
num_examples: 86864
download_size: 103992929
dataset_size: 347342168
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
language:
- es
size_categories:
- 10K<n<100K
---

```
<bos><start_of_turn>system
You are a helpful assistant with access to the following functions. Use them if required -
{
"name": "create_contact",
"description": "Create a new contact",
"parameters": {
"type": "object",
"properties": {
"name": {
"type": "string",
"description": "The name of the contact"
},
"email": {
"type": "string",
"description": "The email address of the contact"
}
},
"required": [
"name",
"email"
]
}
}
To use these functions respond with:
<functioncall> {"name": "function_name", "arguments": {"arg_1": "value_1", "arg_2": "value_2", ...}} </functioncall>
Edge cases you must handle:
- If there are no functions that match the user request, you will respond politely that you cannot help.
<start_of_turn>user
I need to create a new contact for my friend John Doe. His email is johndoe@example.com.
<start_of_turn>model
<functioncall> {"name": "create_contact", "arguments": '{"name": "John Doe", "email": "johndoe@example.com"}'} </functioncall>
<start_of_turn>model
I have successfully created a new contact for your friend John Doe with the email johndoe@example.com.
<end_of_turn><eos>
```
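A minimal sketch of extracting the call from a model turn (not part of the dataset; it assumes the single-quoted `arguments` convention shown in the sample above):

```python
import json
import re

def parse_function_call(text: str):
    """Extract the JSON payload from a <functioncall> ... </functioncall> span."""
    m = re.search(r"<functioncall>\s*(\{.*\})\s*</functioncall>", text, re.DOTALL)
    if m is None:
        return None
    payload = m.group(1)
    # In the samples above, the "arguments" value is a JSON string wrapped in
    # single quotes; re-encode it as a proper JSON string before decoding.
    payload = re.sub(r"'(\{.*?\})'", lambda q: json.dumps(q.group(1)), payload)
    call = json.loads(payload)
    if isinstance(call.get("arguments"), str):
        call["arguments"] = json.loads(call["arguments"])
    return call

example = """<functioncall> {"name": "create_contact", "arguments": '{"name": "John Doe", "email": "johndoe@example.com"}'} </functioncall>"""
print(parse_function_call(example))
```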
## Taken from hypervariance
```
https://huggingface.co/datasets/hypervariance/function-calling-sharegpt
``` |
ml6team/cnn_dailymail_nl | ---
annotations_creators:
- no-annotation
language_creators:
- found
language:
- nl
license:
- mit
multilinguality:
- monolingual
size_categories:
- 100K<n<1M
source_datasets:
- https://github.com/huggingface/datasets/tree/master/datasets/cnn_dailymail
task_categories:
- conditional-text-generation
task_ids:
- summarization
---
# Dataset Card for Dutch CNN Dailymail Dataset
## Dataset Description
- **Repository:** [CNN / DailyMail Dataset NL repository](https://huggingface.co/datasets/ml6team/cnn_dailymail_nl)
### Dataset Summary
The Dutch CNN / DailyMail Dataset is a machine-translated version of the English CNN / DailyMail dataset containing just over 300k unique news articles as written by journalists at CNN and the Daily Mail.
Most information about the dataset can be found on the [HuggingFace page](https://huggingface.co/datasets/cnn_dailymail) of the original English version.
These are the basic steps used to create this dataset (+ some chunking):
```python
from datasets import load_dataset

dataset = load_dataset("cnn_dailymail", "3.0.0")
```
And this is the HuggingFace translation pipeline:
```python
from transformers import pipeline

translator = pipeline(
    task='translation_en_to_nl',
    model='Helsinki-NLP/opus-mt-en-nl',
    tokenizer='Helsinki-NLP/opus-mt-en-nl')
```
### Data Fields
- `id`: a string containing the hexadecimal-formatted SHA-1 hash of the URL the story was retrieved from
- `article`: a string containing the body of the news article
- `highlights`: a string containing the highlight of the article as written by the article author
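The `id` convention can be sketched in a few lines (an illustration only, assuming the hash is plain SHA-1 over the UTF-8 encoded URL; the example URL is invented):

```python
import hashlib

def story_id(url: str) -> str:
    # Hex-encoded SHA-1 digest of the source URL, mirroring the `id` field.
    return hashlib.sha1(url.encode("utf-8")).hexdigest()

print(story_id("https://www.cnn.com/some-story"))  # 40 hex characters
```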
### Data Splits
The Dutch CNN/DailyMail dataset follows the same splits as the original English version and has 3 splits: _train_, _validation_, and _test_.
| Dataset Split | Number of Instances in Split |
| ------------- | ------------------------------------------- |
| Train | 287,113 |
| Validation | 13,368 |
| Test | 11,490 |
|
theBrokenCat/SprintDataset-0.1 | ---
dataset_info:
features:
- name: image
dtype: image
- name: text
dtype: string
splits:
- name: train
num_bytes: 2642978962.0
num_examples: 499
download_size: 2613775686
dataset_size: 2642978962.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
skrishna/boolq | ---
dataset_info:
features:
- name: question
dtype: string
- name: answer
dtype: bool
- name: passage
dtype: string
- name: text
dtype: string
- name: label
dtype: string
splits:
- name: train
num_bytes: 12764501
num_examples: 9427
- name: test
num_bytes: 4379782
num_examples: 3270
download_size: 10122256
dataset_size: 17144283
---
# Dataset Card for "boolq"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
LiveEvil/Im | ---
license: openrail
---
|
caisarl76/orca-gpt4-subset-1k | ---
dataset_info:
features:
- name: id
dtype: string
- name: system_prompt
dtype: string
- name: question
dtype: string
- name: response
dtype: string
- name: __index_level_0__
dtype: int64
splits:
- name: train
num_bytes: 1703047
num_examples: 1000
download_size: 947311
dataset_size: 1703047
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "orca-gpt4-subset-1k"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
dim/oasst_en | ---
license: mit
dataset_info:
features:
- name: conversation_ids
sequence: string
- name: conversation_text
sequence: string
- name: status
dtype: string
splits:
- name: train
num_bytes: 5716785
num_examples: 3141
download_size: 2174320
dataset_size: 5716785
---
|
LangChainDatasets/sql-qa-chinook | ---
license: mit
---
|
mask-distilled-one-sec-cv12/chunk_62 | ---
dataset_info:
features:
- name: logits
sequence: float32
- name: mfcc
sequence:
sequence: float64
splits:
- name: train
num_bytes: 1130261056
num_examples: 221968
download_size: 1142648456
dataset_size: 1130261056
---
# Dataset Card for "chunk_62"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
hadiqaemi/subject-triples | ---
dataset_info:
features:
- name: example
dtype: string
splits:
- name: train
num_bytes: 256524
num_examples: 620
- name: test
num_bytes: 3998
num_examples: 9
- name: eval
num_bytes: 3998
num_examples: 9
download_size: 83963
dataset_size: 264520
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
- split: eval
path: data/eval-*
---
|
zwr-shu/docreICL | ---
license: apache-2.0
---
|
bouim/dvoice3_alltrain | ---
dataset_info:
features:
- name: audio
dtype:
audio:
sampling_rate: 16000
- name: sentence
dtype: string
- name: duration
dtype: float64
splits:
- name: train
num_bytes: 1459262910.208
num_examples: 2117
- name: test
num_bytes: 75535309.0
num_examples: 114
download_size: 1032875305
dataset_size: 1534798219.208
---
# Dataset Card for "dvoice3_alltrain"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
autoevaluate/autoeval-staging-eval-project-e1d72cd6-7845033 | ---
type: predictions
tags:
- autotrain
- evaluation
datasets:
- billsum
eval_info:
task: summarization
model: stevhliu/t5-small-finetuned-billsum-ca_test
metrics: []
dataset_name: billsum
dataset_config: default
dataset_split: test
col_mapping:
text: text
target: summary
---
# Dataset Card for AutoTrain Evaluator
This repository contains model predictions generated by [AutoTrain](https://huggingface.co/autotrain) for the following task and dataset:
* Task: Summarization
* Model: stevhliu/t5-small-finetuned-billsum-ca_test
* Dataset: billsum
To run new evaluation jobs, visit Hugging Face's [automatic model evaluator](https://huggingface.co/spaces/autoevaluate/model-evaluator).
## Contributions
Thanks to [@lewtun](https://huggingface.co/lewtun) for evaluating this model. |
giulinho/a | ---
license: apache-2.0
---
|
open-llm-leaderboard/details_brucethemoose__CaPlatTessDolXaBoros-Yi-34B-200K-DARE-Ties-ExtremeDensity | ---
pretty_name: Evaluation run of brucethemoose/CaPlatTessDolXaBoros-Yi-34B-200K-DARE-Ties-ExtremeDensity
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [brucethemoose/CaPlatTessDolXaBoros-Yi-34B-200K-DARE-Ties-ExtremeDensity](https://huggingface.co/brucethemoose/CaPlatTessDolXaBoros-Yi-34B-200K-DARE-Ties-ExtremeDensity)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_brucethemoose__CaPlatTessDolXaBoros-Yi-34B-200K-DARE-Ties-ExtremeDensity\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-12-10T11:50:37.068936](https://huggingface.co/datasets/open-llm-leaderboard/details_brucethemoose__CaPlatTessDolXaBoros-Yi-34B-200K-DARE-Ties-ExtremeDensity/blob/main/results_2023-12-10T11-50-37.068936.json)(note\
\ that there might be results for other tasks in the repo if successive evals didn't\
\ cover the same tasks. You find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.7674525732857425,\n\
\ \"acc_stderr\": 0.028092943162744702,\n \"acc_norm\": 0.7731400785419068,\n\
\ \"acc_norm_stderr\": 0.028608512168230946,\n \"mc1\": 0.4259485924112607,\n\
\ \"mc1_stderr\": 0.01731047190407654,\n \"mc2\": 0.5763314615956924,\n\
\ \"mc2_stderr\": 0.01543636329925335\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.643344709897611,\n \"acc_stderr\": 0.013998056902620192,\n\
\ \"acc_norm\": 0.6689419795221843,\n \"acc_norm_stderr\": 0.01375206241981783\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.663612826130253,\n\
\ \"acc_stderr\": 0.0047150751198345095,\n \"acc_norm\": 0.8569010157339175,\n\
\ \"acc_norm_stderr\": 0.003494581076398526\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.52,\n \"acc_stderr\": 0.050211673156867795,\n \
\ \"acc_norm\": 0.52,\n \"acc_norm_stderr\": 0.050211673156867795\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.725925925925926,\n\
\ \"acc_stderr\": 0.03853254836552003,\n \"acc_norm\": 0.725925925925926,\n\
\ \"acc_norm_stderr\": 0.03853254836552003\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.8881578947368421,\n \"acc_stderr\": 0.025648341251693612,\n\
\ \"acc_norm\": 0.8881578947368421,\n \"acc_norm_stderr\": 0.025648341251693612\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.79,\n\
\ \"acc_stderr\": 0.040936018074033256,\n \"acc_norm\": 0.79,\n \
\ \"acc_norm_stderr\": 0.040936018074033256\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.8150943396226416,\n \"acc_stderr\": 0.023893351834464317,\n\
\ \"acc_norm\": 0.8150943396226416,\n \"acc_norm_stderr\": 0.023893351834464317\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.8958333333333334,\n\
\ \"acc_stderr\": 0.025545239210256917,\n \"acc_norm\": 0.8958333333333334,\n\
\ \"acc_norm_stderr\": 0.025545239210256917\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.51,\n \"acc_stderr\": 0.05024183937956912,\n \
\ \"acc_norm\": 0.51,\n \"acc_norm_stderr\": 0.05024183937956912\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.65,\n \"acc_stderr\": 0.04793724854411018,\n \"acc_norm\": 0.65,\n\
\ \"acc_norm_stderr\": 0.04793724854411018\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.44,\n \"acc_stderr\": 0.0498887651569859,\n \
\ \"acc_norm\": 0.44,\n \"acc_norm_stderr\": 0.0498887651569859\n },\n\
\ \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.7398843930635838,\n\
\ \"acc_stderr\": 0.03345036916788991,\n \"acc_norm\": 0.7398843930635838,\n\
\ \"acc_norm_stderr\": 0.03345036916788991\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.5882352941176471,\n \"acc_stderr\": 0.048971049527263666,\n\
\ \"acc_norm\": 0.5882352941176471,\n \"acc_norm_stderr\": 0.048971049527263666\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.83,\n \"acc_stderr\": 0.03775251680686371,\n \"acc_norm\": 0.83,\n\
\ \"acc_norm_stderr\": 0.03775251680686371\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.7829787234042553,\n \"acc_stderr\": 0.026947483121496228,\n\
\ \"acc_norm\": 0.7829787234042553,\n \"acc_norm_stderr\": 0.026947483121496228\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.6491228070175439,\n\
\ \"acc_stderr\": 0.04489539350270698,\n \"acc_norm\": 0.6491228070175439,\n\
\ \"acc_norm_stderr\": 0.04489539350270698\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.7517241379310344,\n \"acc_stderr\": 0.036001056927277696,\n\
\ \"acc_norm\": 0.7517241379310344,\n \"acc_norm_stderr\": 0.036001056927277696\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.7195767195767195,\n \"acc_stderr\": 0.023135287974325628,\n \"\
acc_norm\": 0.7195767195767195,\n \"acc_norm_stderr\": 0.023135287974325628\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.5555555555555556,\n\
\ \"acc_stderr\": 0.04444444444444449,\n \"acc_norm\": 0.5555555555555556,\n\
\ \"acc_norm_stderr\": 0.04444444444444449\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.59,\n \"acc_stderr\": 0.04943110704237102,\n \
\ \"acc_norm\": 0.59,\n \"acc_norm_stderr\": 0.04943110704237102\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.9225806451612903,\n\
\ \"acc_stderr\": 0.015203644420774848,\n \"acc_norm\": 0.9225806451612903,\n\
\ \"acc_norm_stderr\": 0.015203644420774848\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.6896551724137931,\n \"acc_stderr\": 0.03255086769970104,\n\
\ \"acc_norm\": 0.6896551724137931,\n \"acc_norm_stderr\": 0.03255086769970104\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.81,\n \"acc_stderr\": 0.03942772444036625,\n \"acc_norm\"\
: 0.81,\n \"acc_norm_stderr\": 0.03942772444036625\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.8606060606060606,\n \"acc_stderr\": 0.027045948825865394,\n\
\ \"acc_norm\": 0.8606060606060606,\n \"acc_norm_stderr\": 0.027045948825865394\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.9343434343434344,\n \"acc_stderr\": 0.01764652667723333,\n \"\
acc_norm\": 0.9343434343434344,\n \"acc_norm_stderr\": 0.01764652667723333\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.9740932642487047,\n \"acc_stderr\": 0.01146452335695318,\n\
\ \"acc_norm\": 0.9740932642487047,\n \"acc_norm_stderr\": 0.01146452335695318\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.8128205128205128,\n \"acc_stderr\": 0.019776601086550032,\n\
\ \"acc_norm\": 0.8128205128205128,\n \"acc_norm_stderr\": 0.019776601086550032\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.44074074074074077,\n \"acc_stderr\": 0.030270671157284074,\n \
\ \"acc_norm\": 0.44074074074074077,\n \"acc_norm_stderr\": 0.030270671157284074\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.8613445378151261,\n \"acc_stderr\": 0.022448264476832597,\n\
\ \"acc_norm\": 0.8613445378151261,\n \"acc_norm_stderr\": 0.022448264476832597\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.5165562913907285,\n \"acc_stderr\": 0.04080244185628972,\n \"\
acc_norm\": 0.5165562913907285,\n \"acc_norm_stderr\": 0.04080244185628972\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.9229357798165138,\n \"acc_stderr\": 0.011434381698911096,\n \"\
acc_norm\": 0.9229357798165138,\n \"acc_norm_stderr\": 0.011434381698911096\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.6620370370370371,\n \"acc_stderr\": 0.03225941352631295,\n \"\
acc_norm\": 0.6620370370370371,\n \"acc_norm_stderr\": 0.03225941352631295\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.9313725490196079,\n \"acc_stderr\": 0.017744453647073322,\n \"\
acc_norm\": 0.9313725490196079,\n \"acc_norm_stderr\": 0.017744453647073322\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.9029535864978903,\n \"acc_stderr\": 0.019269323025640262,\n \
\ \"acc_norm\": 0.9029535864978903,\n \"acc_norm_stderr\": 0.019269323025640262\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.8071748878923767,\n\
\ \"acc_stderr\": 0.026478240960489365,\n \"acc_norm\": 0.8071748878923767,\n\
\ \"acc_norm_stderr\": 0.026478240960489365\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.8778625954198473,\n \"acc_stderr\": 0.028718776889342323,\n\
\ \"acc_norm\": 0.8778625954198473,\n \"acc_norm_stderr\": 0.028718776889342323\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.9090909090909091,\n \"acc_stderr\": 0.02624319405407388,\n \"\
acc_norm\": 0.9090909090909091,\n \"acc_norm_stderr\": 0.02624319405407388\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.8611111111111112,\n\
\ \"acc_stderr\": 0.03343270062869622,\n \"acc_norm\": 0.8611111111111112,\n\
\ \"acc_norm_stderr\": 0.03343270062869622\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.8588957055214724,\n \"acc_stderr\": 0.027351605518389752,\n\
\ \"acc_norm\": 0.8588957055214724,\n \"acc_norm_stderr\": 0.027351605518389752\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.6517857142857143,\n\
\ \"acc_stderr\": 0.04521829902833585,\n \"acc_norm\": 0.6517857142857143,\n\
\ \"acc_norm_stderr\": 0.04521829902833585\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.8737864077669902,\n \"acc_stderr\": 0.03288180278808628,\n\
\ \"acc_norm\": 0.8737864077669902,\n \"acc_norm_stderr\": 0.03288180278808628\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.9358974358974359,\n\
\ \"acc_stderr\": 0.016046261631673137,\n \"acc_norm\": 0.9358974358974359,\n\
\ \"acc_norm_stderr\": 0.016046261631673137\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.87,\n \"acc_stderr\": 0.033799766898963086,\n \
\ \"acc_norm\": 0.87,\n \"acc_norm_stderr\": 0.033799766898963086\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.909323116219668,\n\
\ \"acc_stderr\": 0.010268429662528547,\n \"acc_norm\": 0.909323116219668,\n\
\ \"acc_norm_stderr\": 0.010268429662528547\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.8236994219653179,\n \"acc_stderr\": 0.020516425672490714,\n\
\ \"acc_norm\": 0.8236994219653179,\n \"acc_norm_stderr\": 0.020516425672490714\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.7698324022346369,\n\
\ \"acc_stderr\": 0.014078339253425814,\n \"acc_norm\": 0.7698324022346369,\n\
\ \"acc_norm_stderr\": 0.014078339253425814\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.8366013071895425,\n \"acc_stderr\": 0.02117062301121351,\n\
\ \"acc_norm\": 0.8366013071895425,\n \"acc_norm_stderr\": 0.02117062301121351\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.8231511254019293,\n\
\ \"acc_stderr\": 0.02167005888551079,\n \"acc_norm\": 0.8231511254019293,\n\
\ \"acc_norm_stderr\": 0.02167005888551079\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.8672839506172839,\n \"acc_stderr\": 0.01887735383957187,\n\
\ \"acc_norm\": 0.8672839506172839,\n \"acc_norm_stderr\": 0.01887735383957187\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.6560283687943262,\n \"acc_stderr\": 0.02833801742861133,\n \
\ \"acc_norm\": 0.6560283687943262,\n \"acc_norm_stderr\": 0.02833801742861133\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.6134289439374185,\n\
\ \"acc_stderr\": 0.012437288868088727,\n \"acc_norm\": 0.6134289439374185,\n\
\ \"acc_norm_stderr\": 0.012437288868088727\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.8492647058823529,\n \"acc_stderr\": 0.021734235515652848,\n\
\ \"acc_norm\": 0.8492647058823529,\n \"acc_norm_stderr\": 0.021734235515652848\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.8349673202614379,\n \"acc_stderr\": 0.015017550799247322,\n \
\ \"acc_norm\": 0.8349673202614379,\n \"acc_norm_stderr\": 0.015017550799247322\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.7090909090909091,\n\
\ \"acc_stderr\": 0.04350271442923243,\n \"acc_norm\": 0.7090909090909091,\n\
\ \"acc_norm_stderr\": 0.04350271442923243\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.8326530612244898,\n \"acc_stderr\": 0.02389714476891452,\n\
\ \"acc_norm\": 0.8326530612244898,\n \"acc_norm_stderr\": 0.02389714476891452\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.900497512437811,\n\
\ \"acc_stderr\": 0.021166216304659386,\n \"acc_norm\": 0.900497512437811,\n\
\ \"acc_norm_stderr\": 0.021166216304659386\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.92,\n \"acc_stderr\": 0.0272659924344291,\n \
\ \"acc_norm\": 0.92,\n \"acc_norm_stderr\": 0.0272659924344291\n },\n\
\ \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5542168674698795,\n\
\ \"acc_stderr\": 0.038695433234721015,\n \"acc_norm\": 0.5542168674698795,\n\
\ \"acc_norm_stderr\": 0.038695433234721015\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8596491228070176,\n \"acc_stderr\": 0.026640582539133196,\n\
\ \"acc_norm\": 0.8596491228070176,\n \"acc_norm_stderr\": 0.026640582539133196\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.4259485924112607,\n\
\ \"mc1_stderr\": 0.01731047190407654,\n \"mc2\": 0.5763314615956924,\n\
\ \"mc2_stderr\": 0.01543636329925335\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.8200473559589582,\n \"acc_stderr\": 0.01079646868806868\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.5981804397270659,\n \
\ \"acc_stderr\": 0.013504357787494044\n }\n}\n```"
repo_url: https://huggingface.co/brucethemoose/CaPlatTessDolXaBoros-Yi-34B-200K-DARE-Ties-ExtremeDensity
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_12_10T11_50_37.068936
path:
- '**/details_harness|arc:challenge|25_2023-12-10T11-50-37.068936.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-12-10T11-50-37.068936.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_12_10T11_50_37.068936
path:
- '**/details_harness|gsm8k|5_2023-12-10T11-50-37.068936.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-12-10T11-50-37.068936.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_12_10T11_50_37.068936
path:
- '**/details_harness|hellaswag|10_2023-12-10T11-50-37.068936.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-12-10T11-50-37.068936.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_12_10T11_50_37.068936
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-10T11-50-37.068936.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-10T11-50-37.068936.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-10T11-50-37.068936.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-10T11-50-37.068936.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-10T11-50-37.068936.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-10T11-50-37.068936.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-10T11-50-37.068936.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-10T11-50-37.068936.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-10T11-50-37.068936.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-10T11-50-37.068936.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-10T11-50-37.068936.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-10T11-50-37.068936.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-10T11-50-37.068936.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-10T11-50-37.068936.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-10T11-50-37.068936.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-10T11-50-37.068936.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-10T11-50-37.068936.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-10T11-50-37.068936.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-10T11-50-37.068936.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-10T11-50-37.068936.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-10T11-50-37.068936.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-10T11-50-37.068936.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-10T11-50-37.068936.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-10T11-50-37.068936.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-10T11-50-37.068936.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-10T11-50-37.068936.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-10T11-50-37.068936.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-10T11-50-37.068936.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-10T11-50-37.068936.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-10T11-50-37.068936.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-10T11-50-37.068936.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-10T11-50-37.068936.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-10T11-50-37.068936.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-10T11-50-37.068936.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-12-10T11-50-37.068936.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-10T11-50-37.068936.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-10T11-50-37.068936.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-10T11-50-37.068936.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-12-10T11-50-37.068936.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-12-10T11-50-37.068936.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-10T11-50-37.068936.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-10T11-50-37.068936.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-10T11-50-37.068936.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-10T11-50-37.068936.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-10T11-50-37.068936.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-10T11-50-37.068936.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-10T11-50-37.068936.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-10T11-50-37.068936.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-10T11-50-37.068936.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-10T11-50-37.068936.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-10T11-50-37.068936.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-10T11-50-37.068936.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-10T11-50-37.068936.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-12-10T11-50-37.068936.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-10T11-50-37.068936.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-12-10T11-50-37.068936.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-10T11-50-37.068936.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-10T11-50-37.068936.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-10T11-50-37.068936.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-10T11-50-37.068936.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-10T11-50-37.068936.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-10T11-50-37.068936.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-10T11-50-37.068936.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-10T11-50-37.068936.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-10T11-50-37.068936.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-10T11-50-37.068936.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-10T11-50-37.068936.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-10T11-50-37.068936.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-10T11-50-37.068936.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-10T11-50-37.068936.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-10T11-50-37.068936.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-10T11-50-37.068936.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-10T11-50-37.068936.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-10T11-50-37.068936.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-10T11-50-37.068936.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-10T11-50-37.068936.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-10T11-50-37.068936.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-10T11-50-37.068936.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-10T11-50-37.068936.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-10T11-50-37.068936.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-10T11-50-37.068936.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-10T11-50-37.068936.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-10T11-50-37.068936.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-10T11-50-37.068936.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-10T11-50-37.068936.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-10T11-50-37.068936.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-10T11-50-37.068936.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-10T11-50-37.068936.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-10T11-50-37.068936.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-10T11-50-37.068936.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-10T11-50-37.068936.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-12-10T11-50-37.068936.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-10T11-50-37.068936.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-10T11-50-37.068936.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-10T11-50-37.068936.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-12-10T11-50-37.068936.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-12-10T11-50-37.068936.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-10T11-50-37.068936.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-10T11-50-37.068936.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-10T11-50-37.068936.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-10T11-50-37.068936.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-10T11-50-37.068936.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-10T11-50-37.068936.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-10T11-50-37.068936.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-10T11-50-37.068936.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-10T11-50-37.068936.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-10T11-50-37.068936.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-10T11-50-37.068936.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-10T11-50-37.068936.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-10T11-50-37.068936.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-12-10T11-50-37.068936.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-10T11-50-37.068936.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-12-10T11-50-37.068936.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-10T11-50-37.068936.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_12_10T11_50_37.068936
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-10T11-50-37.068936.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-10T11-50-37.068936.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_12_10T11_50_37.068936
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-10T11-50-37.068936.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-10T11-50-37.068936.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_12_10T11_50_37.068936
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-10T11-50-37.068936.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-10T11-50-37.068936.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_12_10T11_50_37.068936
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-10T11-50-37.068936.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-10T11-50-37.068936.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_12_10T11_50_37.068936
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-10T11-50-37.068936.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-10T11-50-37.068936.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_12_10T11_50_37.068936
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-10T11-50-37.068936.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-10T11-50-37.068936.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_12_10T11_50_37.068936
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-10T11-50-37.068936.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-10T11-50-37.068936.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_12_10T11_50_37.068936
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-10T11-50-37.068936.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-10T11-50-37.068936.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_12_10T11_50_37.068936
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-10T11-50-37.068936.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-10T11-50-37.068936.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_12_10T11_50_37.068936
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-10T11-50-37.068936.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-10T11-50-37.068936.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_12_10T11_50_37.068936
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-10T11-50-37.068936.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-10T11-50-37.068936.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_12_10T11_50_37.068936
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-10T11-50-37.068936.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-10T11-50-37.068936.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_12_10T11_50_37.068936
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-10T11-50-37.068936.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-10T11-50-37.068936.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_12_10T11_50_37.068936
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-10T11-50-37.068936.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-10T11-50-37.068936.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_12_10T11_50_37.068936
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-10T11-50-37.068936.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-10T11-50-37.068936.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_12_10T11_50_37.068936
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-10T11-50-37.068936.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-10T11-50-37.068936.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_12_10T11_50_37.068936
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-10T11-50-37.068936.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-10T11-50-37.068936.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_12_10T11_50_37.068936
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-10T11-50-37.068936.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-10T11-50-37.068936.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_12_10T11_50_37.068936
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-10T11-50-37.068936.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-10T11-50-37.068936.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_12_10T11_50_37.068936
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-10T11-50-37.068936.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-10T11-50-37.068936.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_12_10T11_50_37.068936
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-10T11-50-37.068936.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-10T11-50-37.068936.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_12_10T11_50_37.068936
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-10T11-50-37.068936.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-10T11-50-37.068936.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_12_10T11_50_37.068936
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-10T11-50-37.068936.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-10T11-50-37.068936.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_12_10T11_50_37.068936
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-10T11-50-37.068936.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-10T11-50-37.068936.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_12_10T11_50_37.068936
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-10T11-50-37.068936.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-10T11-50-37.068936.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_12_10T11_50_37.068936
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-10T11-50-37.068936.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-10T11-50-37.068936.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_12_10T11_50_37.068936
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-10T11-50-37.068936.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-10T11-50-37.068936.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_12_10T11_50_37.068936
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-10T11-50-37.068936.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-10T11-50-37.068936.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_12_10T11_50_37.068936
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-10T11-50-37.068936.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-10T11-50-37.068936.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_12_10T11_50_37.068936
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-10T11-50-37.068936.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-10T11-50-37.068936.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_12_10T11_50_37.068936
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-10T11-50-37.068936.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-10T11-50-37.068936.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_12_10T11_50_37.068936
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-10T11-50-37.068936.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-10T11-50-37.068936.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_12_10T11_50_37.068936
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-10T11-50-37.068936.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-10T11-50-37.068936.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_12_10T11_50_37.068936
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-10T11-50-37.068936.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-10T11-50-37.068936.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_12_10T11_50_37.068936
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-12-10T11-50-37.068936.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-12-10T11-50-37.068936.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_12_10T11_50_37.068936
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-10T11-50-37.068936.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-10T11-50-37.068936.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_12_10T11_50_37.068936
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-10T11-50-37.068936.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-10T11-50-37.068936.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_12_10T11_50_37.068936
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-10T11-50-37.068936.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-10T11-50-37.068936.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_12_10T11_50_37.068936
path:
- '**/details_harness|hendrycksTest-management|5_2023-12-10T11-50-37.068936.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-12-10T11-50-37.068936.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_12_10T11_50_37.068936
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-12-10T11-50-37.068936.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-12-10T11-50-37.068936.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_12_10T11_50_37.068936
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-10T11-50-37.068936.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-10T11-50-37.068936.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_12_10T11_50_37.068936
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-10T11-50-37.068936.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-10T11-50-37.068936.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_12_10T11_50_37.068936
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-10T11-50-37.068936.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-10T11-50-37.068936.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_12_10T11_50_37.068936
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-10T11-50-37.068936.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-10T11-50-37.068936.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_12_10T11_50_37.068936
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-10T11-50-37.068936.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-10T11-50-37.068936.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_12_10T11_50_37.068936
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-10T11-50-37.068936.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-10T11-50-37.068936.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_12_10T11_50_37.068936
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-10T11-50-37.068936.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-10T11-50-37.068936.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_12_10T11_50_37.068936
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-10T11-50-37.068936.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-10T11-50-37.068936.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_12_10T11_50_37.068936
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-10T11-50-37.068936.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-10T11-50-37.068936.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_12_10T11_50_37.068936
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-10T11-50-37.068936.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-10T11-50-37.068936.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_12_10T11_50_37.068936
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-10T11-50-37.068936.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-10T11-50-37.068936.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_12_10T11_50_37.068936
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-10T11-50-37.068936.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-10T11-50-37.068936.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_12_10T11_50_37.068936
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-10T11-50-37.068936.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-10T11-50-37.068936.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_12_10T11_50_37.068936
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-12-10T11-50-37.068936.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-12-10T11-50-37.068936.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_12_10T11_50_37.068936
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-10T11-50-37.068936.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-10T11-50-37.068936.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_12_10T11_50_37.068936
path:
- '**/details_harness|hendrycksTest-virology|5_2023-12-10T11-50-37.068936.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-12-10T11-50-37.068936.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_12_10T11_50_37.068936
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-10T11-50-37.068936.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-10T11-50-37.068936.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_12_10T11_50_37.068936
path:
- '**/details_harness|truthfulqa:mc|0_2023-12-10T11-50-37.068936.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-12-10T11-50-37.068936.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_12_10T11_50_37.068936
path:
- '**/details_harness|winogrande|5_2023-12-10T11-50-37.068936.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-12-10T11-50-37.068936.parquet'
- config_name: results
data_files:
- split: 2023_12_10T11_50_37.068936
path:
- results_2023-12-10T11-50-37.068936.parquet
- split: latest
path:
- results_2023-12-10T11-50-37.068936.parquet
---
# Dataset Card for Evaluation run of brucethemoose/CaPlatTessDolXaBoros-Yi-34B-200K-DARE-Ties-ExtremeDensity
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/brucethemoose/CaPlatTessDolXaBoros-Yi-34B-200K-DARE-Ties-ExtremeDensity
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [brucethemoose/CaPlatTessDolXaBoros-Yi-34B-200K-DARE-Ties-ExtremeDensity](https://huggingface.co/brucethemoose/CaPlatTessDolXaBoros-Yi-34B-200K-DARE-Ties-ExtremeDensity) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_brucethemoose__CaPlatTessDolXaBoros-Yi-34B-200K-DARE-Ties-ExtremeDensity",
"harness_winogrande_5",
	split="latest")
```
## Latest results
These are the [latest results from run 2023-12-10T11:50:37.068936](https://huggingface.co/datasets/open-llm-leaderboard/details_brucethemoose__CaPlatTessDolXaBoros-Yi-34B-200K-DARE-Ties-ExtremeDensity/blob/main/results_2023-12-10T11-50-37.068936.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.7674525732857425,
"acc_stderr": 0.028092943162744702,
"acc_norm": 0.7731400785419068,
"acc_norm_stderr": 0.028608512168230946,
"mc1": 0.4259485924112607,
"mc1_stderr": 0.01731047190407654,
"mc2": 0.5763314615956924,
"mc2_stderr": 0.01543636329925335
},
"harness|arc:challenge|25": {
"acc": 0.643344709897611,
"acc_stderr": 0.013998056902620192,
"acc_norm": 0.6689419795221843,
"acc_norm_stderr": 0.01375206241981783
},
"harness|hellaswag|10": {
"acc": 0.663612826130253,
"acc_stderr": 0.0047150751198345095,
"acc_norm": 0.8569010157339175,
"acc_norm_stderr": 0.003494581076398526
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.52,
"acc_stderr": 0.050211673156867795,
"acc_norm": 0.52,
"acc_norm_stderr": 0.050211673156867795
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.725925925925926,
"acc_stderr": 0.03853254836552003,
"acc_norm": 0.725925925925926,
"acc_norm_stderr": 0.03853254836552003
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.8881578947368421,
"acc_stderr": 0.025648341251693612,
"acc_norm": 0.8881578947368421,
"acc_norm_stderr": 0.025648341251693612
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.79,
"acc_stderr": 0.040936018074033256,
"acc_norm": 0.79,
"acc_norm_stderr": 0.040936018074033256
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.8150943396226416,
"acc_stderr": 0.023893351834464317,
"acc_norm": 0.8150943396226416,
"acc_norm_stderr": 0.023893351834464317
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.8958333333333334,
"acc_stderr": 0.025545239210256917,
"acc_norm": 0.8958333333333334,
"acc_norm_stderr": 0.025545239210256917
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.51,
"acc_stderr": 0.05024183937956912,
"acc_norm": 0.51,
"acc_norm_stderr": 0.05024183937956912
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.65,
"acc_stderr": 0.04793724854411018,
"acc_norm": 0.65,
"acc_norm_stderr": 0.04793724854411018
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.44,
"acc_stderr": 0.0498887651569859,
"acc_norm": 0.44,
"acc_norm_stderr": 0.0498887651569859
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.7398843930635838,
"acc_stderr": 0.03345036916788991,
"acc_norm": 0.7398843930635838,
"acc_norm_stderr": 0.03345036916788991
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.5882352941176471,
"acc_stderr": 0.048971049527263666,
"acc_norm": 0.5882352941176471,
"acc_norm_stderr": 0.048971049527263666
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.83,
"acc_stderr": 0.03775251680686371,
"acc_norm": 0.83,
"acc_norm_stderr": 0.03775251680686371
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.7829787234042553,
"acc_stderr": 0.026947483121496228,
"acc_norm": 0.7829787234042553,
"acc_norm_stderr": 0.026947483121496228
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.6491228070175439,
"acc_stderr": 0.04489539350270698,
"acc_norm": 0.6491228070175439,
"acc_norm_stderr": 0.04489539350270698
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.7517241379310344,
"acc_stderr": 0.036001056927277696,
"acc_norm": 0.7517241379310344,
"acc_norm_stderr": 0.036001056927277696
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.7195767195767195,
"acc_stderr": 0.023135287974325628,
"acc_norm": 0.7195767195767195,
"acc_norm_stderr": 0.023135287974325628
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.5555555555555556,
"acc_stderr": 0.04444444444444449,
"acc_norm": 0.5555555555555556,
"acc_norm_stderr": 0.04444444444444449
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.59,
"acc_stderr": 0.04943110704237102,
"acc_norm": 0.59,
"acc_norm_stderr": 0.04943110704237102
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.9225806451612903,
"acc_stderr": 0.015203644420774848,
"acc_norm": 0.9225806451612903,
"acc_norm_stderr": 0.015203644420774848
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.6896551724137931,
"acc_stderr": 0.03255086769970104,
"acc_norm": 0.6896551724137931,
"acc_norm_stderr": 0.03255086769970104
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.81,
"acc_stderr": 0.03942772444036625,
"acc_norm": 0.81,
"acc_norm_stderr": 0.03942772444036625
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.8606060606060606,
"acc_stderr": 0.027045948825865394,
"acc_norm": 0.8606060606060606,
"acc_norm_stderr": 0.027045948825865394
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.9343434343434344,
"acc_stderr": 0.01764652667723333,
"acc_norm": 0.9343434343434344,
"acc_norm_stderr": 0.01764652667723333
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.9740932642487047,
"acc_stderr": 0.01146452335695318,
"acc_norm": 0.9740932642487047,
"acc_norm_stderr": 0.01146452335695318
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.8128205128205128,
"acc_stderr": 0.019776601086550032,
"acc_norm": 0.8128205128205128,
"acc_norm_stderr": 0.019776601086550032
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.44074074074074077,
"acc_stderr": 0.030270671157284074,
"acc_norm": 0.44074074074074077,
"acc_norm_stderr": 0.030270671157284074
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.8613445378151261,
"acc_stderr": 0.022448264476832597,
"acc_norm": 0.8613445378151261,
"acc_norm_stderr": 0.022448264476832597
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.5165562913907285,
"acc_stderr": 0.04080244185628972,
"acc_norm": 0.5165562913907285,
"acc_norm_stderr": 0.04080244185628972
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.9229357798165138,
"acc_stderr": 0.011434381698911096,
"acc_norm": 0.9229357798165138,
"acc_norm_stderr": 0.011434381698911096
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.6620370370370371,
"acc_stderr": 0.03225941352631295,
"acc_norm": 0.6620370370370371,
"acc_norm_stderr": 0.03225941352631295
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.9313725490196079,
"acc_stderr": 0.017744453647073322,
"acc_norm": 0.9313725490196079,
"acc_norm_stderr": 0.017744453647073322
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.9029535864978903,
"acc_stderr": 0.019269323025640262,
"acc_norm": 0.9029535864978903,
"acc_norm_stderr": 0.019269323025640262
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.8071748878923767,
"acc_stderr": 0.026478240960489365,
"acc_norm": 0.8071748878923767,
"acc_norm_stderr": 0.026478240960489365
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.8778625954198473,
"acc_stderr": 0.028718776889342323,
"acc_norm": 0.8778625954198473,
"acc_norm_stderr": 0.028718776889342323
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.9090909090909091,
"acc_stderr": 0.02624319405407388,
"acc_norm": 0.9090909090909091,
"acc_norm_stderr": 0.02624319405407388
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.8611111111111112,
"acc_stderr": 0.03343270062869622,
"acc_norm": 0.8611111111111112,
"acc_norm_stderr": 0.03343270062869622
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.8588957055214724,
"acc_stderr": 0.027351605518389752,
"acc_norm": 0.8588957055214724,
"acc_norm_stderr": 0.027351605518389752
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.6517857142857143,
"acc_stderr": 0.04521829902833585,
"acc_norm": 0.6517857142857143,
"acc_norm_stderr": 0.04521829902833585
},
"harness|hendrycksTest-management|5": {
"acc": 0.8737864077669902,
"acc_stderr": 0.03288180278808628,
"acc_norm": 0.8737864077669902,
"acc_norm_stderr": 0.03288180278808628
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.9358974358974359,
"acc_stderr": 0.016046261631673137,
"acc_norm": 0.9358974358974359,
"acc_norm_stderr": 0.016046261631673137
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.87,
"acc_stderr": 0.033799766898963086,
"acc_norm": 0.87,
"acc_norm_stderr": 0.033799766898963086
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.909323116219668,
"acc_stderr": 0.010268429662528547,
"acc_norm": 0.909323116219668,
"acc_norm_stderr": 0.010268429662528547
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.8236994219653179,
"acc_stderr": 0.020516425672490714,
"acc_norm": 0.8236994219653179,
"acc_norm_stderr": 0.020516425672490714
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.7698324022346369,
"acc_stderr": 0.014078339253425814,
"acc_norm": 0.7698324022346369,
"acc_norm_stderr": 0.014078339253425814
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.8366013071895425,
"acc_stderr": 0.02117062301121351,
"acc_norm": 0.8366013071895425,
"acc_norm_stderr": 0.02117062301121351
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.8231511254019293,
"acc_stderr": 0.02167005888551079,
"acc_norm": 0.8231511254019293,
"acc_norm_stderr": 0.02167005888551079
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.8672839506172839,
"acc_stderr": 0.01887735383957187,
"acc_norm": 0.8672839506172839,
"acc_norm_stderr": 0.01887735383957187
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.6560283687943262,
"acc_stderr": 0.02833801742861133,
"acc_norm": 0.6560283687943262,
"acc_norm_stderr": 0.02833801742861133
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.6134289439374185,
"acc_stderr": 0.012437288868088727,
"acc_norm": 0.6134289439374185,
"acc_norm_stderr": 0.012437288868088727
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.8492647058823529,
"acc_stderr": 0.021734235515652848,
"acc_norm": 0.8492647058823529,
"acc_norm_stderr": 0.021734235515652848
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.8349673202614379,
"acc_stderr": 0.015017550799247322,
"acc_norm": 0.8349673202614379,
"acc_norm_stderr": 0.015017550799247322
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.7090909090909091,
"acc_stderr": 0.04350271442923243,
"acc_norm": 0.7090909090909091,
"acc_norm_stderr": 0.04350271442923243
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.8326530612244898,
"acc_stderr": 0.02389714476891452,
"acc_norm": 0.8326530612244898,
"acc_norm_stderr": 0.02389714476891452
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.900497512437811,
"acc_stderr": 0.021166216304659386,
"acc_norm": 0.900497512437811,
"acc_norm_stderr": 0.021166216304659386
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.92,
"acc_stderr": 0.0272659924344291,
"acc_norm": 0.92,
"acc_norm_stderr": 0.0272659924344291
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5542168674698795,
"acc_stderr": 0.038695433234721015,
"acc_norm": 0.5542168674698795,
"acc_norm_stderr": 0.038695433234721015
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8596491228070176,
"acc_stderr": 0.026640582539133196,
"acc_norm": 0.8596491228070176,
"acc_norm_stderr": 0.026640582539133196
},
"harness|truthfulqa:mc|0": {
"mc1": 0.4259485924112607,
"mc1_stderr": 0.01731047190407654,
"mc2": 0.5763314615956924,
"mc2_stderr": 0.01543636329925335
},
"harness|winogrande|5": {
"acc": 0.8200473559589582,
"acc_stderr": 0.01079646868806868
},
"harness|gsm8k|5": {
"acc": 0.5981804397270659,
"acc_stderr": 0.013504357787494044
}
}
```
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
dot-ammar/AR-dotted-2MediumPlus | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
dataset_info:
features:
- name: clean
dtype: string
splits:
- name: train
num_bytes: 789405434
num_examples: 6060645
download_size: 450788866
dataset_size: 789405434
---
# Dataset Card for "AR-dotted-2MediumPlus"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
strombergnlp/offenseval_2020 | ---
annotations_creators:
- expert-generated
language_creators:
- found
languages:
- ar
- da
- en
- gr
- tr
licenses:
- cc-by-4.0
multilinguality:
- multilingual
pretty_name: OffensEval 2020
size_categories:
- 10K<n<100K
source_datasets:
- original
task_categories:
- text-classification
task_ids:
- hate-speech-detection
- text-classification-other-hate-speech-detection
extra_gated_prompt: "Warning: this repository contains harmful content (abusive language, hate speech)."
paperswithcode_id:
- dkhate
- ogtd
---
# Dataset Card for "offenseval_2020"
## Table of Contents
- [Dataset Description](#dataset-description)
- [Dataset Summary](#dataset-summary)
- [Supported Tasks and Leaderboards](#supported-tasks-and-leaderboards)
- [Languages](#languages)
- [Dataset Structure](#dataset-structure)
- [Data Instances](#data-instances)
- [Data Fields](#data-fields)
- [Data Splits](#data-splits)
- [Dataset Creation](#dataset-creation)
- [Curation Rationale](#curation-rationale)
- [Source Data](#source-data)
- [Annotations](#annotations)
- [Personal and Sensitive Information](#personal-and-sensitive-information)
- [Considerations for Using the Data](#considerations-for-using-the-data)
- [Social Impact of Dataset](#social-impact-of-dataset)
- [Discussion of Biases](#discussion-of-biases)
- [Other Known Limitations](#other-known-limitations)
- [Additional Information](#additional-information)
- [Dataset Curators](#dataset-curators)
- [Licensing Information](#licensing-information)
- [Citation Information](#citation-information)
- [Contributions](#contributions)
## Dataset Description
- **Homepage:** [https://sites.google.com/site/offensevalsharedtask/results-and-paper-submission](https://sites.google.com/site/offensevalsharedtask/results-and-paper-submission)
- **Repository:**
- **Paper:** [https://aclanthology.org/2020.semeval-1.188/](https://aclanthology.org/2020.semeval-1.188/), [https://arxiv.org/abs/2006.07235](https://arxiv.org/abs/2006.07235)
- **Point of Contact:** [Leon Derczynski](https://github.com/leondz)
### Dataset Summary
OffensEval 2020 features a multilingual dataset with five languages. The languages included in OffensEval 2020 are:
* Arabic
* Danish
* English
* Greek
* Turkish
The annotation follows the hierarchical tagset proposed in the Offensive Language Identification Dataset (OLID) and used in OffensEval 2019.
In this taxonomy we break down offensive content into the following three sub-tasks taking the type and target of offensive content into account.
The following sub-tasks were organized:
* Sub-task A - Offensive language identification;
* Sub-task B - Automatic categorization of offense types;
* Sub-task C - Offense target identification.
English training data is omitted, so it needs to be collected separately (see [https://zenodo.org/record/3950379#.XxZ-aFVKipp](https://zenodo.org/record/3950379#.XxZ-aFVKipp))
The source datasets come from:
* Arabic [https://arxiv.org/pdf/2004.02192.pdf](https://arxiv.org/pdf/2004.02192.pdf), [https://aclanthology.org/2021.wanlp-1.13/](https://aclanthology.org/2021.wanlp-1.13/)
* Danish [https://arxiv.org/pdf/1908.04531.pdf](https://arxiv.org/pdf/1908.04531.pdf), [https://aclanthology.org/2020.lrec-1.430/?ref=https://githubhelp.com](https://aclanthology.org/2020.lrec-1.430/)
* English [https://arxiv.org/pdf/2004.14454.pdf](https://arxiv.org/pdf/2004.14454.pdf), [https://aclanthology.org/2021.findings-acl.80.pdf](https://aclanthology.org/2021.findings-acl.80.pdf)
* Greek [https://arxiv.org/pdf/2003.07459.pdf](https://arxiv.org/pdf/2003.07459.pdf), [https://aclanthology.org/2020.lrec-1.629/](https://aclanthology.org/2020.lrec-1.629/)
* Turkish [https://aclanthology.org/2020.lrec-1.758/](https://aclanthology.org/2020.lrec-1.758/)
### Supported Tasks and Leaderboards
* [OffensEval 2020](https://sites.google.com/site/offensevalsharedtask/results-and-paper-submission)
### Languages
Five languages are covered, given here as bcp47 codes: `ar;da;en;gr;tr`
## Dataset Structure
There are five named configs, one per language:
* `ar` Arabic
* `da` Danish
* `en` English
* `gr` Greek
* `tr` Turkish
The training data for English is absent - this is 9M tweets that need to be rehydrated on their own. See [https://zenodo.org/record/3950379#.XxZ-aFVKipp](https://zenodo.org/record/3950379#.XxZ-aFVKipp)
### Data Instances
An example of 'train' looks as follows.
```
{
'id': '0',
'text': 'PLACEHOLDER TEXT',
'subtask_a': 1,
}
```
### Data Fields
- `id`: a `string` feature.
- `text`: a `string` feature.
- `subtask_a`: whether or not the instance is offensive; `0: NOT, 1: OFF`
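The `subtask_a` label scheme above can be decoded with a small helper; this is a hypothetical convenience function, not part of the dataset loader, written against the field layout shown in the Data Instances example:

```python
# Label scheme for sub-task A as documented in this card:
# 0 -> NOT (not offensive), 1 -> OFF (offensive).
SUBTASK_A_LABELS = {0: "NOT", 1: "OFF"}

def decode_instance(instance):
    """Return a copy of a train instance with subtask_a as its string label."""
    out = dict(instance)
    out["subtask_a"] = SUBTASK_A_LABELS[instance["subtask_a"]]
    return out

example = {"id": "0", "text": "PLACEHOLDER TEXT", "subtask_a": 1}
print(decode_instance(example)["subtask_a"])  # OFF
```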
### Data Splits
| name |train|test|
|---------|----:|---:|
|ar|7839|1827|
|da|2961|329|
|en|0|3887|
|gr|8743|1544|
|tr|31277|3515|
## Dataset Creation
### Curation Rationale
Collecting data for abusive language classification. The rationale differs for each source dataset.
### Source Data
#### Initial Data Collection and Normalization
Varies per language dataset
#### Who are the source language producers?
Social media users
### Annotations
#### Annotation process
Varies per language dataset
#### Who are the annotators?
Varies per language dataset; native speakers
### Personal and Sensitive Information
The data was public at the time of collection. No PII removal has been performed.
## Considerations for Using the Data
### Social Impact of Dataset
The data definitely contains abusive language. The data could be used to develop and propagate offensive language against every target group involved, i.e. ableism, racism, sexism, ageism, and so on.
### Discussion of Biases
### Other Known Limitations
## Additional Information
### Dataset Curators
The dataset is curated by each sub-part's paper authors.
### Licensing Information
This data is available and distributed under Creative Commons attribution license, CC-BY 4.0.
### Citation Information
```
@inproceedings{zampieri-etal-2020-semeval,
title = "{S}em{E}val-2020 Task 12: Multilingual Offensive Language Identification in Social Media ({O}ffens{E}val 2020)",
author = {Zampieri, Marcos and
Nakov, Preslav and
Rosenthal, Sara and
Atanasova, Pepa and
Karadzhov, Georgi and
Mubarak, Hamdy and
Derczynski, Leon and
Pitenis, Zeses and
{\c{C}}{\"o}ltekin, {\c{C}}a{\u{g}}r{\i}},
booktitle = "Proceedings of the Fourteenth Workshop on Semantic Evaluation",
month = dec,
year = "2020",
address = "Barcelona (online)",
publisher = "International Committee for Computational Linguistics",
url = "https://aclanthology.org/2020.semeval-1.188",
doi = "10.18653/v1/2020.semeval-1.188",
pages = "1425--1447",
abstract = "We present the results and the main findings of SemEval-2020 Task 12 on Multilingual Offensive Language Identification in Social Media (OffensEval-2020). The task included three subtasks corresponding to the hierarchical taxonomy of the OLID schema from OffensEval-2019, and it was offered in five languages: Arabic, Danish, English, Greek, and Turkish. OffensEval-2020 was one of the most popular tasks at SemEval-2020, attracting a large number of participants across all subtasks and languages: a total of 528 teams signed up to participate in the task, 145 teams submitted official runs on the test data, and 70 teams submitted system description papers.",
}
```
### Contributions
Author-added dataset [@leondz](https://github.com/leondz)
|
fathyshalab/reklamation24_oeffentlicher-verkehr-vermietung-intent | ---
dataset_info:
features:
- name: text
dtype: string
- name: label
dtype: int64
- name: __index_level_0__
dtype: int64
splits:
- name: train
num_bytes: 168477
num_examples: 336
- name: test
num_bytes: 41899
num_examples: 84
download_size: 125985
dataset_size: 210376
---
# Dataset Card for "reklamation24_oeffentlicher-verkehr-vermietung-intent"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
mperez28/wwco-teammates | ---
license: afl-3.0
---
|
one-sec-cv12/chunk_156 | ---
dataset_info:
features:
- name: audio
dtype:
audio:
sampling_rate: 16000
splits:
- name: train
num_bytes: 22243660272.375
num_examples: 231589
download_size: 19717789179
dataset_size: 22243660272.375
---
# Dataset Card for "chunk_156"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
KonstantyM/science_qa | ---
dataset_info:
features:
- name: question
dtype: string
- name: answer
dtype: string
- name: context
dtype: string
splits:
- name: train
num_bytes: 7497499873
num_examples: 4432703
download_size: 4282191598
dataset_size: 7497499873
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "science_qa"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
tyzhu/squad_qa_wrong_num_v5_full_first_permute | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
dataset_info:
features:
- name: id
dtype: string
- name: title
dtype: string
- name: context
dtype: string
- name: question
dtype: string
- name: answers
sequence:
- name: text
dtype: string
- name: answer_start
dtype: int32
- name: answer
dtype: string
- name: context_id
dtype: string
- name: correct_id
dtype: string
- name: inputs
dtype: string
- name: targets
dtype: string
splits:
- name: train
num_bytes: 7568134.963687525
num_examples: 4778
- name: validation
num_bytes: 346484
num_examples: 300
download_size: 1325370
dataset_size: 7914618.963687525
---
# Dataset Card for "squad_qa_wrong_num_v5_full_first_permute"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
texturedesign/td01_natural-ground-textures | ---
annotations_creators:
- expert-generated
language: []
language_creators: []
license:
- cc-by-nc-4.0
multilinguality: []
pretty_name: 'TD01: Natural Ground Texture Photos'
size_categories:
- n<1K
source_datasets:
- original
tags:
- texture-synthesis
- photography
- non-infringing
task_categories:
- unconditional-image-generation
task_ids: []
viewer: false
---
_The Dataset Teaser is now enabled instead! Isn't this better?_

# TD 01: Natural Ground Textures
This dataset contains multi-photo texture captures in outdoor nature scenes — all focusing on the ground. Each set has different photos that showcase texture variety, making them ideal for training a domain-specific image generator!
Overall information about this dataset:
* **Format** — JPEG-XL, lossless RGB
* **Resolution** — 4032 × 2268
* **Device** — mobile camera
* **Technique** — hand-held
* **Orientation** — portrait or landscape
* **Author**: Alex J. Champandard
* **Configurations**: 4K, 2K (default), 1K
To load the medium- and high-resolution images of the dataset, you'll need to install `jxlpy` from [PyPI](https://pypi.org/project/jxlpy/) with `pip install jxlpy`:
```python
# Recommended use, JXL at high-quality.
from jxlpy import JXLImagePlugin
from datasets import load_dataset
d = load_dataset('texturedesign/td01_natural-ground-textures', 'JXL@4K', num_proc=4)
print(len(d['train']), len(d['test']))
```
The lowest-resolution images are available as PNG with a regular installation of `pillow`:
```python
# Alternative use, PNG at low-quality.
from datasets import load_dataset
dataset = load_dataset('texturedesign/td01_natural-ground-textures', 'PNG@1K', num_proc=4)
# EXAMPLE: Discard all other sets except Set #1.
dataset = dataset.filter(lambda s: s['set'] == 1)
# EXAMPLE: Only keep images with index 0 and 2.
dataset = dataset.select([0, 2])
```
Use built-in dataset `filter()` and `select()` to narrow down the loaded dataset for training, or to ease with development.
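In plain-Python terms, `filter()` keeps the rows matching a predicate while `select()` picks rows by index; this toy sketch mimics that behavior on a hand-made stand-in for the texture rows (the `set`/`path` records and file names are hypothetical):

```python
# Toy stand-in for the loaded dataset rows; paths are made up for illustration.
rows = [
    {"set": 1, "path": "set1/a.jxl"},
    {"set": 1, "path": "set1/b.jxl"},
    {"set": 2, "path": "set2/a.jxl"},
]

# filter(): keep only rows whose predicate holds (Set #1 here).
only_set1 = [r for r in rows if r["set"] == 1]

# select(): pick rows by index, preserving the given order.
picked = [rows[i] for i in [0, 2]]

print([r["path"] for r in only_set1])  # ['set1/a.jxl', 'set1/b.jxl']
print([r["path"] for r in picked])     # ['set1/a.jxl', 'set2/a.jxl']
```

The real `datasets` methods return new `Dataset` objects rather than lists, but the row-selection semantics are the same.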
## Set #1: Rock and Gravel

* **Description**:
- surface rocks with gravel and coarse sand
- strong sunlight from the left, sharp shadows
* **Number of Photos**:
- 7 train
- 2 test
* **Edits**:
- rotated photos to align sunlight
- removed infrequent objects
* **Size**: 77.8 Mb
## Set #2: Dry Grass with Pine Needles

* **Description**:
- field of dry grass and pine needles
- sunlight from the top right, some shadows
* **Number of Photos**:
- 6 train
- 1 test
* **Edits**:
- removed dry leaves and large plants
- removed sticks, rocks and sporadic daisies
* **Size**: 95.2 Mb
## Set #3: Chipped Stones, Broken Leaves and Twiglets

* **Description**:
- autumn path with chipped stones and dry broken leaves
- diffuse light on a cloudy day, very soft shadows
* **Number of Photos**:
- 9 train
- 3 test
* **Edits**:
- removed anything that looks green, fresh leaves
- removed long sticks and large/odd stones
* **Size**: 126.9 Mb
## Set #4: Grass Clumps and Cracked Dirt

* **Description**:
- clumps of green grass, clover and patches of cracked dirt
- diffuse light on cloudy day, shadows under large blades of grass
* **Number of Photos**:
- 9 train
- 2 test
* **Edits**:
- removed dry leaves, sporadic dandelions, and large objects
- histogram matching for two of the photos so the colors look similar
* **Size**: 126.8 Mb
## Set #5: Dirt, Stones, Rock, Twigs...

* **Description**:
- intricate micro-scene with grey dirt, surface rock, stones, twigs and organic debris
- diffuse light on cloudy day, soft shadows around the larger objects
* **Number of Photos**:
- 9 train
- 3 test
* **Edits**:
- removed odd objects that felt out-of-distribution
* **Size**: 102.1 Mb
## Set #6: Plants with Flowers on Dry Leaves

* **Description**:
- leafy plants with white flowers on a bed of dry brown leaves
- soft diffuse light, shaded areas under the plants
* **Number of Photos**:
- 9 train
- 2 test
* **Edits**:
- none yet, inpainting doesn't work well enough
- would remove long sticks and pieces of wood
* **Size**: 105.1 Mb
## Set #7: Frozen Footpath with Snow

* **Description**:
- frozen ground on a path with footprints
- areas with snow and dark brown ground beneath
- diffuse lighting on a cloudy day
* **Number of Photos**:
- 11 train
- 3 test
* **Size**: 95.5 MB
## Set #8: Pine Needles Forest Floor

* **Description**:
- forest floor with a mix of brown soil and grass
- variety of dry white leaves, sticks, pinecones, pine needles
- diffuse lighting on a cloudy day
* **Number of Photos**:
- 15 train
- 4 test
* **Size**: 160.6 MB
## Set #9: Snow on Grass and Dried Leaves

* **Description**:
- field in a park with short green grass
- large dried brown leaves and fallen snow on top
- diffuse lighting on a cloudy day
* **Number of Photos**:
- 8 train
- 3 test
* **Size**: 99.8 MB
## Set #10: Brown Leaves on Wet Ground

* **Description**:
- fallen brown leaves on wet ground
- occasional tree roots and twiglets
- diffuse lighting on a rainy day
* **Number of Photos**:
- 17 train
- 4 test
* **Size**: 186.2 MB
## Set #11: Wet Sand Path with Debris

* **Description**:
- hard sandy path in the rain
- decomposing leaves and other organic debris
- diffuse lighting on a rainy day
* **Number of Photos**:
- 17 train
- 4 test
* **Size**: 186.2 MB
## Set #12: Wood Chips & Sawdust Sprinkled on Forest Path

* **Description**:
- wood chips, sawdust, twigs and roots on forest path
- intermittent sunlight with shadows of trees
* **Number of Photos**:
- 8 train
- 2 test
* **Size**: 110.4 MB
## Set #13: Young Grass Growing in the Dog Park

* **Description**:
- young grass growing in a dog park after overnight rain
- occasional stones, sticks and twigs, pine needles
- diffuse lighting on a cloudy day
* **Number of Photos**:
- 17 train
- 4 test
* **Size**: 193.4 MB
## Set #14: Wavy Wet Beach Sand

* **Description**:
- wavy wet sand on the beach after the tide retreated
- some dirt and large pieces of algae debris
- diffuse lighting on a cloudy day
* **Number of Photos**:
- 11 train
- 3 test
* **Size**: 86.5 MB
## Set #15: Dry Dirt Road and Debris from Trees

* **Description**:
- dirt road of dry compacted sand with debris on top
- old pine needles and dry brown leaves
- diffuse lighting on a cloudy day
* **Number of Photos**:
- 8 train
- 2 test
* **Size**: 86.9 MB
## Set #16: Sandy Beach Path with Grass Clumps

* **Description**:
- path with sand and clumps of grass heading towards the beach
- occasional bluish stones, leafy weeds, and yellow flowers
- diffuse lighting on a cloudy day
* **Number of Photos**:
- 10 train
- 3 test
* **Size**: 118.8 MB
## Set #17: Pine Needles and Brown Leaves on Park Floor

* **Description**:
- park floor with predominantly pine needles
- brown leaves from nearby trees, green grass underneath
- diffuse lighting on a cloudy day
* **Number of Photos**:
- 8 train
- 2 test
* **Size**: 99.9 MB
|
Ankita802/documentation | ---
dataset_info:
features:
- name: input
dtype: string
- name: result
dtype: string
splits:
- name: train
num_bytes: 1700650.6975259378
num_examples: 2004
- name: test
num_bytes: 426011.30247406225
num_examples: 502
download_size: 1090487
dataset_size: 2126662.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
---
|
Jumtra/jglue_jsquads_with_input | ---
dataset_info:
features:
- name: instruction
dtype: string
- name: input
dtype: string
- name: output
dtype: string
splits:
- name: train
num_bytes: 44660349
num_examples: 67301
download_size: 8923113
dataset_size: 44660349
---
# Dataset Card for "jglue_jsquads_with_input"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
EdgarsKatze/test | ---
configs:
- config_name: default
data_files:
- split: train
path: "test/data-00000-of-00001.arrow"
license: other
---
|
apf1/datafilteringnetworks_2b | ---
license: cc-by-4.0
---
### Getting Started
To use the indices, first download CommonPool: https://github.com/mlfoundations/datacomp#downloading-commonpool
Then reshard with the provided indices file: https://github.com/mlfoundations/datacomp#selecting-samples-in-the-filtering-track
### Paper Link
https://arxiv.org/abs/2309.17425 |
lmqg/qag_jaquad | ---
license: cc-by-sa-4.0
pretty_name: SQuAD for question generation
language: ja
multilinguality: monolingual
size_categories: 1k<n<10K
source_datasets: lmqg/qg_jaquad
task_categories:
- text-generation
task_ids:
- language-modeling
tags:
- question-generation
---
# Dataset Card for "lmqg/qag_jaquad"
## Dataset Description
- **Repository:** [https://github.com/asahi417/lm-question-generation](https://github.com/asahi417/lm-question-generation)
- **Paper:** [https://arxiv.org/abs/2210.03992](https://arxiv.org/abs/2210.03992)
- **Point of Contact:** [Asahi Ushio](http://asahiushio.com/)
### Dataset Summary
This is a question & answer generation dataset based on JAQuAD.
### Supported Tasks and Leaderboards
* `question-answer-generation`: The dataset is intended to be used to train a model for question & answer generation.
Success on this task is typically measured by achieving a high BLEU4/METEOR/ROUGE-L/BERTScore/MoverScore (see our paper for more detail).
### Languages
Japanese (ja)
## Dataset Structure
An example of 'train' looks as follows.
```
{
"paragraph": ""Nerdilinga"は898年にカロリング朝の王領として初めて文献に記録されている。レーゲンスブルク司教の統治下でネルトリンゲンは市場町に成長していった。1215年にネルトリンゲンは皇帝フリードリヒ2世から都市権を与えられ、帝国自由都市となった。この年に最初の市壁が築かれた。その縄張りは現在も街の地図に見て取れる。1219年、ネルトリンゲンの聖霊降臨祭についての最も古い文献上の記録が遺されている。重要な交易路が交差するこの都市は穀物、家畜、織物、毛皮、金属製品の主要な集散地に発展していった。ネルトリンゲンはフランクフルトと並ぶドイツで最も重要な遠距離交易都市の一つとなったのである。",
"questions": [ "1215年にネルトリンゲンは誰から都市権を与えられ、帝国自由都市となったか。", "\"Nerdilinga\"の最初の記録は何年のものですか。" ],
"answers": [ "皇帝フリードリヒ2世", "898年" ],
"questions_answers": "question: 1215年にネルトリンゲンは誰から都市権を与えられ、帝国自由都市となったか。, answer: 皇帝フリードリヒ2世 | question: "Nerdilinga"の最初の記録は何年のものですか。, answer: 898年"
}
```
The data fields are the same among all splits.
- `questions`: a `list` of `string` features.
- `answers`: a `list` of `string` features.
- `paragraph`: a `string` feature.
- `questions_answers`: a `string` feature.
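The `questions_answers` field packs all of a paragraph's pairs into one string, joined by ` | ` with `question: `/`, answer: ` markers (format inferred from the example above). A hedged sketch of splitting it back into pairs; note it does not handle answers that themselves contain the markers:

```python
def parse_questions_answers(text: str) -> list[tuple[str, str]]:
    """Split a packed `questions_answers` string into (question, answer) pairs."""
    pairs = []
    for chunk in text.split(" | "):
        # Each chunk looks like "question: ..., answer: ..."
        question, answer = chunk.split(", answer: ", 1)
        pairs.append((question.removeprefix("question: "), answer))
    return pairs

packed = "question: Q1?, answer: A1 | question: Q2?, answer: A2"
print(parse_questions_answers(packed))  # [('Q1?', 'A1'), ('Q2?', 'A2')]
```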
## Data Splits
|train|validation|test |
|----:|---------:|----:|
|9508| 1431 | 3050|
## Citation Information
```
@inproceedings{ushio-etal-2022-generative,
title = "{G}enerative {L}anguage {M}odels for {P}aragraph-{L}evel {Q}uestion {G}eneration",
author = "Ushio, Asahi and
Alva-Manchego, Fernando and
Camacho-Collados, Jose",
booktitle = "Proceedings of the 2022 Conference on Empirical Methods in Natural Language Processing",
month = dec,
year = "2022",
address = "Abu Dhabi, U.A.E.",
publisher = "Association for Computational Linguistics",
}
``` |
irds/codec_politics | ---
pretty_name: '`codec/politics`'
viewer: false
source_datasets: ['irds/codec']
task_categories:
- text-retrieval
---
# Dataset Card for `codec/politics`
The `codec/politics` dataset, provided by the [ir-datasets](https://ir-datasets.com/) package.
For more information about the dataset, see the [documentation](https://ir-datasets.com/codec#codec/politics).
# Data
This dataset provides:
- `queries` (i.e., topics); count=14
- `qrels`: (relevance assessments); count=2,192
- For `docs`, use [`irds/codec`](https://huggingface.co/datasets/irds/codec)
## Usage
```python
from datasets import load_dataset
queries = load_dataset('irds/codec_politics', 'queries')
for record in queries:
record # {'query_id': ..., 'query': ..., 'domain': ..., 'guidelines': ...}
qrels = load_dataset('irds/codec_politics', 'qrels')
for record in qrels:
record # {'query_id': ..., 'doc_id': ..., 'relevance': ...}
```
Note that calling `load_dataset` will download the dataset (or provide access instructions when it's not public) and make a copy of the
data in 🤗 Dataset format.
## Citation Information
```
@inproceedings{mackie2022codec,
title={CODEC: Complex Document and Entity Collection},
author={Mackie, Iain and Owoicho, Paul and Gemmell, Carlos and Fischer, Sophie and MacAvaney, Sean and Dalton, Jeffery},
booktitle={Proceedings of the 45th International ACM SIGIR Conference on Research and Development in Information Retrieval},
year={2022}
}
```
|
open-llm-leaderboard/details_haoranxu__ALMA-13B-Pretrain | ---
pretty_name: Evaluation run of haoranxu/ALMA-13B-Pretrain
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [haoranxu/ALMA-13B-Pretrain](https://huggingface.co/haoranxu/ALMA-13B-Pretrain)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
  \nThe dataset is composed of 64 configurations, each one corresponding to one of the\
  \ evaluated tasks.\n\nThe dataset has been created from 2 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the agregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_haoranxu__ALMA-13B-Pretrain\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-10-24T19:44:22.672013](https://huggingface.co/datasets/open-llm-leaderboard/details_haoranxu__ALMA-13B-Pretrain/blob/main/results_2023-10-24T19-44-22.672013.json)\
  \ (note that there might be results for other tasks in the repos if successive evals\
  \ didn't cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.0016778523489932886,\n\
\ \"em_stderr\": 0.00041913301788269345,\n \"f1\": 0.0558294882550337,\n\
\ \"f1_stderr\": 0.0013237506266727554,\n \"acc\": 0.4263565172486631,\n\
\ \"acc_stderr\": 0.00988264379366717\n },\n \"harness|drop|3\": {\n\
\ \"em\": 0.0016778523489932886,\n \"em_stderr\": 0.00041913301788269345,\n\
\ \"f1\": 0.0558294882550337,\n \"f1_stderr\": 0.0013237506266727554\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.0887035633055345,\n \
\ \"acc_stderr\": 0.007831458737058719\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7640094711917916,\n \"acc_stderr\": 0.011933828850275623\n\
\ }\n}\n```"
repo_url: https://huggingface.co/haoranxu/ALMA-13B-Pretrain
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_10_03T18_16_28.187729
path:
- '**/details_harness|arc:challenge|25_2023-10-03T18-16-28.187729.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-10-03T18-16-28.187729.parquet'
- config_name: harness_drop_3
data_files:
- split: 2023_10_24T19_44_22.672013
path:
- '**/details_harness|drop|3_2023-10-24T19-44-22.672013.parquet'
- split: latest
path:
- '**/details_harness|drop|3_2023-10-24T19-44-22.672013.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_10_24T19_44_22.672013
path:
- '**/details_harness|gsm8k|5_2023-10-24T19-44-22.672013.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-10-24T19-44-22.672013.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_10_03T18_16_28.187729
path:
- '**/details_harness|hellaswag|10_2023-10-03T18-16-28.187729.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-10-03T18-16-28.187729.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_10_03T18_16_28.187729
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-03T18-16-28.187729.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-03T18-16-28.187729.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-03T18-16-28.187729.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-03T18-16-28.187729.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-03T18-16-28.187729.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-03T18-16-28.187729.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-03T18-16-28.187729.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-03T18-16-28.187729.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-03T18-16-28.187729.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-03T18-16-28.187729.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-03T18-16-28.187729.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-03T18-16-28.187729.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-03T18-16-28.187729.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-03T18-16-28.187729.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-03T18-16-28.187729.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-03T18-16-28.187729.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-03T18-16-28.187729.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-03T18-16-28.187729.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-03T18-16-28.187729.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-03T18-16-28.187729.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-03T18-16-28.187729.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-03T18-16-28.187729.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-03T18-16-28.187729.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-03T18-16-28.187729.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-03T18-16-28.187729.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-03T18-16-28.187729.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-03T18-16-28.187729.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-03T18-16-28.187729.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-03T18-16-28.187729.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-03T18-16-28.187729.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-03T18-16-28.187729.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-03T18-16-28.187729.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-03T18-16-28.187729.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-03T18-16-28.187729.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-10-03T18-16-28.187729.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-03T18-16-28.187729.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-03T18-16-28.187729.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-03T18-16-28.187729.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-10-03T18-16-28.187729.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-10-03T18-16-28.187729.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-03T18-16-28.187729.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-03T18-16-28.187729.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-03T18-16-28.187729.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-03T18-16-28.187729.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-03T18-16-28.187729.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-03T18-16-28.187729.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-03T18-16-28.187729.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-03T18-16-28.187729.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-03T18-16-28.187729.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-03T18-16-28.187729.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-03T18-16-28.187729.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-03T18-16-28.187729.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-03T18-16-28.187729.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-10-03T18-16-28.187729.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-03T18-16-28.187729.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-10-03T18-16-28.187729.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-03T18-16-28.187729.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-03T18-16-28.187729.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-03T18-16-28.187729.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-03T18-16-28.187729.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-03T18-16-28.187729.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-03T18-16-28.187729.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-03T18-16-28.187729.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-03T18-16-28.187729.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-03T18-16-28.187729.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-03T18-16-28.187729.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-03T18-16-28.187729.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-03T18-16-28.187729.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-03T18-16-28.187729.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-03T18-16-28.187729.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-03T18-16-28.187729.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-03T18-16-28.187729.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-03T18-16-28.187729.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-03T18-16-28.187729.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-03T18-16-28.187729.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-03T18-16-28.187729.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-03T18-16-28.187729.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-03T18-16-28.187729.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-03T18-16-28.187729.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-03T18-16-28.187729.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-03T18-16-28.187729.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-03T18-16-28.187729.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-03T18-16-28.187729.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-03T18-16-28.187729.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-03T18-16-28.187729.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-03T18-16-28.187729.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-03T18-16-28.187729.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-03T18-16-28.187729.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-03T18-16-28.187729.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-03T18-16-28.187729.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-03T18-16-28.187729.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-10-03T18-16-28.187729.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-03T18-16-28.187729.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-03T18-16-28.187729.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-03T18-16-28.187729.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-10-03T18-16-28.187729.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-10-03T18-16-28.187729.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-03T18-16-28.187729.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-03T18-16-28.187729.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-03T18-16-28.187729.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-03T18-16-28.187729.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-03T18-16-28.187729.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-03T18-16-28.187729.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-03T18-16-28.187729.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-03T18-16-28.187729.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-03T18-16-28.187729.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-03T18-16-28.187729.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-03T18-16-28.187729.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-03T18-16-28.187729.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-03T18-16-28.187729.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-10-03T18-16-28.187729.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-03T18-16-28.187729.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-10-03T18-16-28.187729.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-03T18-16-28.187729.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_10_03T18_16_28.187729
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-03T18-16-28.187729.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-03T18-16-28.187729.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_10_03T18_16_28.187729
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-03T18-16-28.187729.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-03T18-16-28.187729.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_10_03T18_16_28.187729
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-03T18-16-28.187729.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-03T18-16-28.187729.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_10_03T18_16_28.187729
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-03T18-16-28.187729.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-03T18-16-28.187729.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_10_03T18_16_28.187729
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-03T18-16-28.187729.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-03T18-16-28.187729.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_10_03T18_16_28.187729
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-03T18-16-28.187729.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-03T18-16-28.187729.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_10_03T18_16_28.187729
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-03T18-16-28.187729.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-03T18-16-28.187729.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_10_03T18_16_28.187729
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-03T18-16-28.187729.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-03T18-16-28.187729.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_10_03T18_16_28.187729
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-03T18-16-28.187729.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-03T18-16-28.187729.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_10_03T18_16_28.187729
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-03T18-16-28.187729.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-03T18-16-28.187729.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_10_03T18_16_28.187729
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-03T18-16-28.187729.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-03T18-16-28.187729.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_10_03T18_16_28.187729
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-03T18-16-28.187729.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-03T18-16-28.187729.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_10_03T18_16_28.187729
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-03T18-16-28.187729.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-03T18-16-28.187729.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_10_03T18_16_28.187729
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-03T18-16-28.187729.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-03T18-16-28.187729.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_10_03T18_16_28.187729
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-03T18-16-28.187729.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-03T18-16-28.187729.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_10_03T18_16_28.187729
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-03T18-16-28.187729.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-03T18-16-28.187729.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_10_03T18_16_28.187729
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-03T18-16-28.187729.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-03T18-16-28.187729.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_10_03T18_16_28.187729
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-03T18-16-28.187729.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-03T18-16-28.187729.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_10_03T18_16_28.187729
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-03T18-16-28.187729.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-03T18-16-28.187729.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_10_03T18_16_28.187729
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-03T18-16-28.187729.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-03T18-16-28.187729.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_10_03T18_16_28.187729
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-03T18-16-28.187729.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-03T18-16-28.187729.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_10_03T18_16_28.187729
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-03T18-16-28.187729.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-03T18-16-28.187729.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_10_03T18_16_28.187729
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-03T18-16-28.187729.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-03T18-16-28.187729.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_10_03T18_16_28.187729
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-03T18-16-28.187729.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-03T18-16-28.187729.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_10_03T18_16_28.187729
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-03T18-16-28.187729.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-03T18-16-28.187729.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_10_03T18_16_28.187729
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-03T18-16-28.187729.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-03T18-16-28.187729.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_10_03T18_16_28.187729
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-03T18-16-28.187729.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-03T18-16-28.187729.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_10_03T18_16_28.187729
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-03T18-16-28.187729.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-03T18-16-28.187729.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_10_03T18_16_28.187729
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-03T18-16-28.187729.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-03T18-16-28.187729.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_10_03T18_16_28.187729
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-03T18-16-28.187729.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-03T18-16-28.187729.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_10_03T18_16_28.187729
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-03T18-16-28.187729.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-03T18-16-28.187729.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_10_03T18_16_28.187729
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-03T18-16-28.187729.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-03T18-16-28.187729.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_10_03T18_16_28.187729
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-03T18-16-28.187729.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-03T18-16-28.187729.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_10_03T18_16_28.187729
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-03T18-16-28.187729.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-03T18-16-28.187729.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_10_03T18_16_28.187729
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-10-03T18-16-28.187729.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-10-03T18-16-28.187729.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_10_03T18_16_28.187729
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-03T18-16-28.187729.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-03T18-16-28.187729.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_10_03T18_16_28.187729
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-03T18-16-28.187729.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-03T18-16-28.187729.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_10_03T18_16_28.187729
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-03T18-16-28.187729.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-03T18-16-28.187729.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_10_03T18_16_28.187729
path:
- '**/details_harness|hendrycksTest-management|5_2023-10-03T18-16-28.187729.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-10-03T18-16-28.187729.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_10_03T18_16_28.187729
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-10-03T18-16-28.187729.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-10-03T18-16-28.187729.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_10_03T18_16_28.187729
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-03T18-16-28.187729.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-03T18-16-28.187729.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_10_03T18_16_28.187729
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-03T18-16-28.187729.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-03T18-16-28.187729.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_10_03T18_16_28.187729
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-03T18-16-28.187729.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-03T18-16-28.187729.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_10_03T18_16_28.187729
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-03T18-16-28.187729.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-03T18-16-28.187729.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_10_03T18_16_28.187729
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-03T18-16-28.187729.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-03T18-16-28.187729.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_10_03T18_16_28.187729
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-03T18-16-28.187729.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-03T18-16-28.187729.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_10_03T18_16_28.187729
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-03T18-16-28.187729.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-03T18-16-28.187729.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_10_03T18_16_28.187729
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-03T18-16-28.187729.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-03T18-16-28.187729.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_10_03T18_16_28.187729
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-03T18-16-28.187729.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-03T18-16-28.187729.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_10_03T18_16_28.187729
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-03T18-16-28.187729.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-03T18-16-28.187729.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_10_03T18_16_28.187729
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-03T18-16-28.187729.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-03T18-16-28.187729.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_10_03T18_16_28.187729
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-03T18-16-28.187729.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-03T18-16-28.187729.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_10_03T18_16_28.187729
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-03T18-16-28.187729.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-03T18-16-28.187729.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_10_03T18_16_28.187729
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-10-03T18-16-28.187729.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-10-03T18-16-28.187729.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_10_03T18_16_28.187729
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-03T18-16-28.187729.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-03T18-16-28.187729.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_10_03T18_16_28.187729
path:
- '**/details_harness|hendrycksTest-virology|5_2023-10-03T18-16-28.187729.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-10-03T18-16-28.187729.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_10_03T18_16_28.187729
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-03T18-16-28.187729.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-03T18-16-28.187729.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_10_03T18_16_28.187729
path:
- '**/details_harness|truthfulqa:mc|0_2023-10-03T18-16-28.187729.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-10-03T18-16-28.187729.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_10_24T19_44_22.672013
path:
- '**/details_harness|winogrande|5_2023-10-24T19-44-22.672013.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-10-24T19-44-22.672013.parquet'
- config_name: results
data_files:
- split: 2023_10_03T18_16_28.187729
path:
- results_2023-10-03T18-16-28.187729.parquet
- split: 2023_10_24T19_44_22.672013
path:
- results_2023-10-24T19-44-22.672013.parquet
- split: latest
path:
- results_2023-10-24T19-44-22.672013.parquet
---
# Dataset Card for Evaluation run of haoranxu/ALMA-13B-Pretrain
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/haoranxu/ALMA-13B-Pretrain
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [haoranxu/ALMA-13B-Pretrain](https://huggingface.co/haoranxu/ALMA-13B-Pretrain) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_haoranxu__ALMA-13B-Pretrain",
"harness_winogrande_5",
split="train")
```
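Each configuration also exposes one split per run, named after the run timestamp, plus a `latest` alias. Because the timestamp-named splits are zero-padded, they sort lexicographically in chronological order, so the most recent run can also be resolved without the alias. A minimal sketch, using the split names from this card:

```python
def latest_split(splits):
    """Return the most recent timestamp-named split.

    Split names like '2023_10_03T18_16_28.187729' are zero-padded,
    so lexicographic order matches chronological order.
    """
    return max(s for s in splits if s != "latest")

splits = ["2023_10_03T18_16_28.187729", "2023_10_24T19_44_22.672013", "latest"]
print(latest_split(splits))  # 2023_10_24T19_44_22.672013
```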
## Latest results
These are the [latest results from run 2023-10-24T19:44:22.672013](https://huggingface.co/datasets/open-llm-leaderboard/details_haoranxu__ALMA-13B-Pretrain/blob/main/results_2023-10-24T19-44-22.672013.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"em": 0.0016778523489932886,
"em_stderr": 0.00041913301788269345,
"f1": 0.0558294882550337,
"f1_stderr": 0.0013237506266727554,
"acc": 0.4263565172486631,
"acc_stderr": 0.00988264379366717
},
"harness|drop|3": {
"em": 0.0016778523489932886,
"em_stderr": 0.00041913301788269345,
"f1": 0.0558294882550337,
"f1_stderr": 0.0013237506266727554
},
"harness|gsm8k|5": {
"acc": 0.0887035633055345,
"acc_stderr": 0.007831458737058719
},
"harness|winogrande|5": {
"acc": 0.7640094711917916,
"acc_stderr": 0.011933828850275623
}
}
```
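The top-level `all` entry appears to average each metric over the tasks that report it; for `acc` here, that is the mean of the gsm8k and winogrande accuracies (the drop task reports `em`/`f1` instead). A quick check against the numbers above:

```python
# Per-task accuracies copied from the results above.
gsm8k_acc = 0.0887035633055345
winogrande_acc = 0.7640094711917916

all_acc = (gsm8k_acc + winogrande_acc) / 2
# Matches the reported "all" acc of 0.4263565172486631 (up to float rounding).
assert abs(all_acc - 0.4263565172486631) < 1e-12
print(all_acc)
```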
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
Zack157/CBV3 | ---
license: openrail
---
|
Nolan1206/WhisperSmallTest200001 | ---
dataset_info:
features:
- name: audio
sequence: float32
- name: sentence
dtype: string
splits:
- name: train
num_bytes: 113600198
num_examples: 534
- name: test
num_bytes: 113600198
num_examples: 534
download_size: 228223568
dataset_size: 227200396
---
# Dataset Card for "WhisperSmallTest200001"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
aravind-selvam/chart_data_y | ---
dataset_info:
features:
- name: image
dtype: image
- name: text
dtype: string
splits:
- name: train
num_bytes: 52790132.0
num_examples: 4000
- name: validation
num_bytes: 13198764.0
num_examples: 1000
download_size: 63225345
dataset_size: 65988896.0
---
# Dataset Card for "chart_data_y"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
chahs/lotr-book | ---
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 2448015
num_examples: 1
download_size: 1452235
dataset_size: 2448015
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
Joe02/obui | ---
license: other
---
|
j0hngou/ccmatrix_de-en | ---
language:
- en
- de
---
A sampled version of the [CCMatrix](https://huggingface.co/datasets/yhavinga/ccmatrix) dataset for the German-English pair, containing 1M train entries. |
Hack90/ncbi_genbank_part_61 | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
dataset_info:
features:
- name: id
dtype: string
- name: sequence
dtype: string
- name: name
dtype: string
- name: description
dtype: string
- name: features
dtype: int64
- name: seq_length
dtype: int64
splits:
- name: train
num_bytes: 32323629638
num_examples: 119131
download_size: 14698797773
dataset_size: 32323629638
---
# Dataset Card for "ncbi_genbank_part_61"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
EleutherAI/quirky_subtraction_increment0_alice_hard | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
- split: test
path: data/test-*
dataset_info:
features:
- name: alice_label
dtype: bool
- name: bob_label
dtype: bool
- name: difficulty
dtype: int64
- name: statement
dtype: string
- name: choices
sequence: string
- name: character
dtype: string
- name: label
dtype: bool
splits:
- name: train
num_bytes: 3169094.7865260416
num_examples: 48047
- name: validation
num_bytes: 66834.1945
num_examples: 1013
- name: test
num_bytes: 66270.2025
num_examples: 1005
download_size: 1193353
dataset_size: 3302199.183526042
---
# Dataset Card for "quirky_subtraction_increment0_alice_hard"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
eljanmahammadli/sample_data | ---
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 20016121.541696552
num_examples: 10000
download_size: 12085040
dataset_size: 20016121.541696552
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
emozilla/openchat-data | ---
dataset_info:
features:
- name: id
dtype: int64
- name: text
dtype: string
splits:
- name: train
num_bytes: 93110051
num_examples: 6206
download_size: 40874409
dataset_size: 93110051
---
# Dataset Card for "openchat-data"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
carolfgadelha/testbench | ---
license: unknown
---
|
zolak/twitter_dataset_81_1713135148 | ---
dataset_info:
features:
- name: id
dtype: string
- name: tweet_content
dtype: string
- name: user_name
dtype: string
- name: user_id
dtype: string
- name: created_at
dtype: string
- name: url
dtype: string
- name: favourite_count
dtype: int64
- name: scraped_at
dtype: string
- name: image_urls
dtype: string
splits:
- name: train
num_bytes: 410422
num_examples: 1039
download_size: 215646
dataset_size: 410422
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
reza-alipour/landmark-m3 | ---
dataset_info:
features:
- name: id
dtype: string
- name: mask
dtype: image
- name: caption
dtype: string
- name: generated_image
dtype: image
splits:
- name: train
num_bytes: 575737385.5
num_examples: 1498
download_size: 570677758
dataset_size: 575737385.5
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
kye/all-lucidrain-code-python-tokenized-8192 | ---
dataset_info:
features:
- name: repo_name
sequence: string
- name: file_path
sequence: string
- name: input_ids
sequence: int32
- name: attention_mask
sequence: int8
splits:
- name: train
num_bytes: 2299336
num_examples: 21
download_size: 349131
dataset_size: 2299336
---
# Dataset Card for "all-lucidrain-code-python-tokenized-8192"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
dsupa/hack5-IQ-FFT | ---
dataset_info:
features:
- name: image
dtype: image
- name: label
dtype:
class_label:
names:
'0': '0'
'1': '1'
'2': '2'
'3': '3'
'4': '4'
'5': '5'
'6': '6'
splits:
- name: train
num_bytes: 4657031.0
num_examples: 647
download_size: 4635221
dataset_size: 4657031.0
---
# Dataset Card for "hack5-IQ-FFT"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
open-llm-leaderboard/details_deepseek-ai__deepseek-llm-67b-base | ---
pretty_name: Evaluation run of deepseek-ai/deepseek-llm-67b-base
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [deepseek-ai/deepseek-llm-67b-base](https://huggingface.co/deepseek-ai/deepseek-llm-67b-base)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_deepseek-ai__deepseek-llm-67b-base\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-12-08T15:08:33.397139](https://huggingface.co/datasets/open-llm-leaderboard/details_deepseek-ai__deepseek-llm-67b-base/blob/main/results_2023-12-08T15-08-33.397139.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.7152016597064887,\n\
\ \"acc_stderr\": 0.029610855644222223,\n \"acc_norm\": 0.7193168663591899,\n\
\ \"acc_norm_stderr\": 0.030182859586413653,\n \"mc1\": 0.3525091799265606,\n\
\ \"mc1_stderr\": 0.016724646380756544,\n \"mc2\": 0.5108013665291756,\n\
\ \"mc2_stderr\": 0.014538753767819627\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.6262798634812287,\n \"acc_stderr\": 0.014137708601759096,\n\
\ \"acc_norm\": 0.6544368600682594,\n \"acc_norm_stderr\": 0.01389693846114568\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6783509261103365,\n\
\ \"acc_stderr\": 0.004661544991583034,\n \"acc_norm\": 0.8710416251742681,\n\
\ \"acc_norm_stderr\": 0.003344689038650325\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.38,\n \"acc_stderr\": 0.048783173121456316,\n \
\ \"acc_norm\": 0.38,\n \"acc_norm_stderr\": 0.048783173121456316\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6592592592592592,\n\
\ \"acc_stderr\": 0.040943762699967926,\n \"acc_norm\": 0.6592592592592592,\n\
\ \"acc_norm_stderr\": 0.040943762699967926\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.8092105263157895,\n \"acc_stderr\": 0.03197565821032499,\n\
\ \"acc_norm\": 0.8092105263157895,\n \"acc_norm_stderr\": 0.03197565821032499\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.83,\n\
\ \"acc_stderr\": 0.0377525168068637,\n \"acc_norm\": 0.83,\n \
\ \"acc_norm_stderr\": 0.0377525168068637\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.7811320754716982,\n \"acc_stderr\": 0.025447863825108614,\n\
\ \"acc_norm\": 0.7811320754716982,\n \"acc_norm_stderr\": 0.025447863825108614\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.8472222222222222,\n\
\ \"acc_stderr\": 0.030085743248565677,\n \"acc_norm\": 0.8472222222222222,\n\
\ \"acc_norm_stderr\": 0.030085743248565677\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.5,\n \"acc_stderr\": 0.050251890762960605,\n \
\ \"acc_norm\": 0.5,\n \"acc_norm_stderr\": 0.050251890762960605\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.6,\n \"acc_stderr\": 0.049236596391733084,\n \"acc_norm\": 0.6,\n\
\ \"acc_norm_stderr\": 0.049236596391733084\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.42,\n \"acc_stderr\": 0.049604496374885836,\n \
\ \"acc_norm\": 0.42,\n \"acc_norm_stderr\": 0.049604496374885836\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.7341040462427746,\n\
\ \"acc_stderr\": 0.03368762932259431,\n \"acc_norm\": 0.7341040462427746,\n\
\ \"acc_norm_stderr\": 0.03368762932259431\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.43137254901960786,\n \"acc_stderr\": 0.04928099597287534,\n\
\ \"acc_norm\": 0.43137254901960786,\n \"acc_norm_stderr\": 0.04928099597287534\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.75,\n \"acc_stderr\": 0.04351941398892446,\n \"acc_norm\": 0.75,\n\
\ \"acc_norm_stderr\": 0.04351941398892446\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.7319148936170212,\n \"acc_stderr\": 0.028957342788342343,\n\
\ \"acc_norm\": 0.7319148936170212,\n \"acc_norm_stderr\": 0.028957342788342343\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.5175438596491229,\n\
\ \"acc_stderr\": 0.04700708033551038,\n \"acc_norm\": 0.5175438596491229,\n\
\ \"acc_norm_stderr\": 0.04700708033551038\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.6827586206896552,\n \"acc_stderr\": 0.03878352372138622,\n\
\ \"acc_norm\": 0.6827586206896552,\n \"acc_norm_stderr\": 0.03878352372138622\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.5026455026455027,\n \"acc_stderr\": 0.025750949678130387,\n \"\
acc_norm\": 0.5026455026455027,\n \"acc_norm_stderr\": 0.025750949678130387\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.5317460317460317,\n\
\ \"acc_stderr\": 0.04463112720677173,\n \"acc_norm\": 0.5317460317460317,\n\
\ \"acc_norm_stderr\": 0.04463112720677173\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.44,\n \"acc_stderr\": 0.04988876515698589,\n \
\ \"acc_norm\": 0.44,\n \"acc_norm_stderr\": 0.04988876515698589\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.8258064516129032,\n\
\ \"acc_stderr\": 0.021576248184514587,\n \"acc_norm\": 0.8258064516129032,\n\
\ \"acc_norm_stderr\": 0.021576248184514587\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.5714285714285714,\n \"acc_stderr\": 0.034819048444388045,\n\
\ \"acc_norm\": 0.5714285714285714,\n \"acc_norm_stderr\": 0.034819048444388045\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.74,\n \"acc_stderr\": 0.044084400227680794,\n \"acc_norm\"\
: 0.74,\n \"acc_norm_stderr\": 0.044084400227680794\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.8181818181818182,\n \"acc_stderr\": 0.030117688929503582,\n\
\ \"acc_norm\": 0.8181818181818182,\n \"acc_norm_stderr\": 0.030117688929503582\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.898989898989899,\n \"acc_stderr\": 0.02146973557605533,\n \"acc_norm\"\
: 0.898989898989899,\n \"acc_norm_stderr\": 0.02146973557605533\n },\n\
\ \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \
\ \"acc\": 0.9844559585492227,\n \"acc_stderr\": 0.008927492715084313,\n\
\ \"acc_norm\": 0.9844559585492227,\n \"acc_norm_stderr\": 0.008927492715084313\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.7307692307692307,\n \"acc_stderr\": 0.02248938979365483,\n \
\ \"acc_norm\": 0.7307692307692307,\n \"acc_norm_stderr\": 0.02248938979365483\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.37777777777777777,\n \"acc_stderr\": 0.029560707392465715,\n \
\ \"acc_norm\": 0.37777777777777777,\n \"acc_norm_stderr\": 0.029560707392465715\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.8235294117647058,\n \"acc_stderr\": 0.024762902678057922,\n\
\ \"acc_norm\": 0.8235294117647058,\n \"acc_norm_stderr\": 0.024762902678057922\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.4370860927152318,\n \"acc_stderr\": 0.04050035722230636,\n \"\
acc_norm\": 0.4370860927152318,\n \"acc_norm_stderr\": 0.04050035722230636\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.9045871559633027,\n \"acc_stderr\": 0.012595899282335812,\n \"\
acc_norm\": 0.9045871559633027,\n \"acc_norm_stderr\": 0.012595899282335812\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.6435185185185185,\n \"acc_stderr\": 0.032664783315272714,\n \"\
acc_norm\": 0.6435185185185185,\n \"acc_norm_stderr\": 0.032664783315272714\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.9166666666666666,\n \"acc_stderr\": 0.019398452135813905,\n \"\
acc_norm\": 0.9166666666666666,\n \"acc_norm_stderr\": 0.019398452135813905\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.9156118143459916,\n \"acc_stderr\": 0.01809424711647333,\n \
\ \"acc_norm\": 0.9156118143459916,\n \"acc_norm_stderr\": 0.01809424711647333\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.7937219730941704,\n\
\ \"acc_stderr\": 0.02715715047956382,\n \"acc_norm\": 0.7937219730941704,\n\
\ \"acc_norm_stderr\": 0.02715715047956382\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.8244274809160306,\n \"acc_stderr\": 0.03336820338476074,\n\
\ \"acc_norm\": 0.8244274809160306,\n \"acc_norm_stderr\": 0.03336820338476074\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.8512396694214877,\n \"acc_stderr\": 0.03248470083807193,\n \"\
acc_norm\": 0.8512396694214877,\n \"acc_norm_stderr\": 0.03248470083807193\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.8333333333333334,\n\
\ \"acc_stderr\": 0.03602814176392645,\n \"acc_norm\": 0.8333333333333334,\n\
\ \"acc_norm_stderr\": 0.03602814176392645\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.8343558282208589,\n \"acc_stderr\": 0.029208296231259104,\n\
\ \"acc_norm\": 0.8343558282208589,\n \"acc_norm_stderr\": 0.029208296231259104\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.5,\n\
\ \"acc_stderr\": 0.04745789978762494,\n \"acc_norm\": 0.5,\n \
\ \"acc_norm_stderr\": 0.04745789978762494\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.9029126213592233,\n \"acc_stderr\": 0.02931596291881348,\n\
\ \"acc_norm\": 0.9029126213592233,\n \"acc_norm_stderr\": 0.02931596291881348\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.9316239316239316,\n\
\ \"acc_stderr\": 0.016534627684311368,\n \"acc_norm\": 0.9316239316239316,\n\
\ \"acc_norm_stderr\": 0.016534627684311368\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.73,\n \"acc_stderr\": 0.044619604333847394,\n \
\ \"acc_norm\": 0.73,\n \"acc_norm_stderr\": 0.044619604333847394\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.9003831417624522,\n\
\ \"acc_stderr\": 0.010709685591251671,\n \"acc_norm\": 0.9003831417624522,\n\
\ \"acc_norm_stderr\": 0.010709685591251671\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.7630057803468208,\n \"acc_stderr\": 0.02289408248992599,\n\
\ \"acc_norm\": 0.7630057803468208,\n \"acc_norm_stderr\": 0.02289408248992599\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.4491620111731844,\n\
\ \"acc_stderr\": 0.01663583834163192,\n \"acc_norm\": 0.4491620111731844,\n\
\ \"acc_norm_stderr\": 0.01663583834163192\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.7875816993464052,\n \"acc_stderr\": 0.02342037547829613,\n\
\ \"acc_norm\": 0.7875816993464052,\n \"acc_norm_stderr\": 0.02342037547829613\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.8135048231511254,\n\
\ \"acc_stderr\": 0.02212243977248077,\n \"acc_norm\": 0.8135048231511254,\n\
\ \"acc_norm_stderr\": 0.02212243977248077\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.8549382716049383,\n \"acc_stderr\": 0.01959487701972795,\n\
\ \"acc_norm\": 0.8549382716049383,\n \"acc_norm_stderr\": 0.01959487701972795\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.5567375886524822,\n \"acc_stderr\": 0.029634838473766006,\n \
\ \"acc_norm\": 0.5567375886524822,\n \"acc_norm_stderr\": 0.029634838473766006\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.560625814863103,\n\
\ \"acc_stderr\": 0.012676014778580219,\n \"acc_norm\": 0.560625814863103,\n\
\ \"acc_norm_stderr\": 0.012676014778580219\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.7757352941176471,\n \"acc_stderr\": 0.025336848563332372,\n\
\ \"acc_norm\": 0.7757352941176471,\n \"acc_norm_stderr\": 0.025336848563332372\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.8022875816993464,\n \"acc_stderr\": 0.016112443369726736,\n \
\ \"acc_norm\": 0.8022875816993464,\n \"acc_norm_stderr\": 0.016112443369726736\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.7181818181818181,\n\
\ \"acc_stderr\": 0.043091187099464585,\n \"acc_norm\": 0.7181818181818181,\n\
\ \"acc_norm_stderr\": 0.043091187099464585\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.7387755102040816,\n \"acc_stderr\": 0.02812342933514279,\n\
\ \"acc_norm\": 0.7387755102040816,\n \"acc_norm_stderr\": 0.02812342933514279\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.900497512437811,\n\
\ \"acc_stderr\": 0.0211662163046594,\n \"acc_norm\": 0.900497512437811,\n\
\ \"acc_norm_stderr\": 0.0211662163046594\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.94,\n \"acc_stderr\": 0.02386832565759419,\n \
\ \"acc_norm\": 0.94,\n \"acc_norm_stderr\": 0.02386832565759419\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5542168674698795,\n\
\ \"acc_stderr\": 0.038695433234721015,\n \"acc_norm\": 0.5542168674698795,\n\
\ \"acc_norm_stderr\": 0.038695433234721015\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8596491228070176,\n \"acc_stderr\": 0.0266405825391332,\n\
\ \"acc_norm\": 0.8596491228070176,\n \"acc_norm_stderr\": 0.0266405825391332\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.3525091799265606,\n\
\ \"mc1_stderr\": 0.016724646380756544,\n \"mc2\": 0.5108013665291756,\n\
\ \"mc2_stderr\": 0.014538753767819627\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.8413575374901342,\n \"acc_stderr\": 0.01026793624302822\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.5670962850644428,\n \
\ \"acc_stderr\": 0.013647916362576052\n }\n}\n```"
repo_url: https://huggingface.co/deepseek-ai/deepseek-llm-67b-base
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_12_08T15_08_33.397139
path:
- '**/details_harness|arc:challenge|25_2023-12-08T15-08-33.397139.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-12-08T15-08-33.397139.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_12_08T15_08_33.397139
path:
- '**/details_harness|gsm8k|5_2023-12-08T15-08-33.397139.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-12-08T15-08-33.397139.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_12_08T15_08_33.397139
path:
- '**/details_harness|hellaswag|10_2023-12-08T15-08-33.397139.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-12-08T15-08-33.397139.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_12_08T15_08_33.397139
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-08T15-08-33.397139.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-08T15-08-33.397139.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-08T15-08-33.397139.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-08T15-08-33.397139.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-08T15-08-33.397139.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-08T15-08-33.397139.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-08T15-08-33.397139.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-08T15-08-33.397139.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-08T15-08-33.397139.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-08T15-08-33.397139.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-08T15-08-33.397139.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-08T15-08-33.397139.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-08T15-08-33.397139.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-08T15-08-33.397139.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-08T15-08-33.397139.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-08T15-08-33.397139.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-08T15-08-33.397139.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-08T15-08-33.397139.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-08T15-08-33.397139.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-08T15-08-33.397139.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-08T15-08-33.397139.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-08T15-08-33.397139.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-08T15-08-33.397139.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-08T15-08-33.397139.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-08T15-08-33.397139.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-08T15-08-33.397139.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-08T15-08-33.397139.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-08T15-08-33.397139.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-08T15-08-33.397139.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-08T15-08-33.397139.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-08T15-08-33.397139.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-08T15-08-33.397139.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-08T15-08-33.397139.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-08T15-08-33.397139.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-12-08T15-08-33.397139.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-08T15-08-33.397139.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-08T15-08-33.397139.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-08T15-08-33.397139.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-12-08T15-08-33.397139.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-12-08T15-08-33.397139.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-08T15-08-33.397139.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-08T15-08-33.397139.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-08T15-08-33.397139.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-08T15-08-33.397139.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-08T15-08-33.397139.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-08T15-08-33.397139.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-08T15-08-33.397139.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-08T15-08-33.397139.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-08T15-08-33.397139.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-08T15-08-33.397139.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-08T15-08-33.397139.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-08T15-08-33.397139.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-08T15-08-33.397139.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-12-08T15-08-33.397139.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-08T15-08-33.397139.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-12-08T15-08-33.397139.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-08T15-08-33.397139.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-08T15-08-33.397139.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-08T15-08-33.397139.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-08T15-08-33.397139.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-08T15-08-33.397139.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-08T15-08-33.397139.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-08T15-08-33.397139.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-08T15-08-33.397139.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-08T15-08-33.397139.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-08T15-08-33.397139.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-08T15-08-33.397139.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-08T15-08-33.397139.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-08T15-08-33.397139.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-08T15-08-33.397139.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-08T15-08-33.397139.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-08T15-08-33.397139.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-08T15-08-33.397139.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-08T15-08-33.397139.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-08T15-08-33.397139.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-08T15-08-33.397139.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-08T15-08-33.397139.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-08T15-08-33.397139.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-08T15-08-33.397139.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-08T15-08-33.397139.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-08T15-08-33.397139.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-08T15-08-33.397139.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-08T15-08-33.397139.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-08T15-08-33.397139.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-08T15-08-33.397139.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-08T15-08-33.397139.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-08T15-08-33.397139.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-08T15-08-33.397139.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-08T15-08-33.397139.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-08T15-08-33.397139.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-08T15-08-33.397139.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-12-08T15-08-33.397139.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-08T15-08-33.397139.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-08T15-08-33.397139.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-08T15-08-33.397139.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-12-08T15-08-33.397139.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-12-08T15-08-33.397139.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-08T15-08-33.397139.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-08T15-08-33.397139.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-08T15-08-33.397139.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-08T15-08-33.397139.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-08T15-08-33.397139.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-08T15-08-33.397139.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-08T15-08-33.397139.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-08T15-08-33.397139.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-08T15-08-33.397139.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-08T15-08-33.397139.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-08T15-08-33.397139.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-08T15-08-33.397139.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-08T15-08-33.397139.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-12-08T15-08-33.397139.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-08T15-08-33.397139.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-12-08T15-08-33.397139.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-08T15-08-33.397139.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_12_08T15_08_33.397139
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-08T15-08-33.397139.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-08T15-08-33.397139.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_12_08T15_08_33.397139
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-08T15-08-33.397139.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-08T15-08-33.397139.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_12_08T15_08_33.397139
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-08T15-08-33.397139.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-08T15-08-33.397139.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_12_08T15_08_33.397139
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-08T15-08-33.397139.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-08T15-08-33.397139.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_12_08T15_08_33.397139
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-08T15-08-33.397139.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-08T15-08-33.397139.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_12_08T15_08_33.397139
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-08T15-08-33.397139.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-08T15-08-33.397139.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_12_08T15_08_33.397139
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-08T15-08-33.397139.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-08T15-08-33.397139.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_12_08T15_08_33.397139
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-08T15-08-33.397139.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-08T15-08-33.397139.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_12_08T15_08_33.397139
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-08T15-08-33.397139.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-08T15-08-33.397139.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_12_08T15_08_33.397139
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-08T15-08-33.397139.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-08T15-08-33.397139.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_12_08T15_08_33.397139
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-08T15-08-33.397139.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-08T15-08-33.397139.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_12_08T15_08_33.397139
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-08T15-08-33.397139.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-08T15-08-33.397139.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_12_08T15_08_33.397139
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-08T15-08-33.397139.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-08T15-08-33.397139.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_12_08T15_08_33.397139
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-08T15-08-33.397139.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-08T15-08-33.397139.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_12_08T15_08_33.397139
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-08T15-08-33.397139.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-08T15-08-33.397139.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_12_08T15_08_33.397139
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-08T15-08-33.397139.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-08T15-08-33.397139.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_12_08T15_08_33.397139
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-08T15-08-33.397139.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-08T15-08-33.397139.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_12_08T15_08_33.397139
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-08T15-08-33.397139.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-08T15-08-33.397139.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_12_08T15_08_33.397139
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-08T15-08-33.397139.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-08T15-08-33.397139.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_12_08T15_08_33.397139
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-08T15-08-33.397139.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-08T15-08-33.397139.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_12_08T15_08_33.397139
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-08T15-08-33.397139.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-08T15-08-33.397139.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_12_08T15_08_33.397139
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-08T15-08-33.397139.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-08T15-08-33.397139.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_12_08T15_08_33.397139
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-08T15-08-33.397139.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-08T15-08-33.397139.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_12_08T15_08_33.397139
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-08T15-08-33.397139.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-08T15-08-33.397139.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_12_08T15_08_33.397139
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-08T15-08-33.397139.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-08T15-08-33.397139.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_12_08T15_08_33.397139
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-08T15-08-33.397139.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-08T15-08-33.397139.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_12_08T15_08_33.397139
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-08T15-08-33.397139.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-08T15-08-33.397139.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_12_08T15_08_33.397139
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-08T15-08-33.397139.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-08T15-08-33.397139.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_12_08T15_08_33.397139
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-08T15-08-33.397139.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-08T15-08-33.397139.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_12_08T15_08_33.397139
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-08T15-08-33.397139.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-08T15-08-33.397139.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_12_08T15_08_33.397139
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-08T15-08-33.397139.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-08T15-08-33.397139.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_12_08T15_08_33.397139
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-08T15-08-33.397139.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-08T15-08-33.397139.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_12_08T15_08_33.397139
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-08T15-08-33.397139.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-08T15-08-33.397139.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_12_08T15_08_33.397139
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-08T15-08-33.397139.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-08T15-08-33.397139.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_12_08T15_08_33.397139
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-12-08T15-08-33.397139.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-12-08T15-08-33.397139.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_12_08T15_08_33.397139
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-08T15-08-33.397139.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-08T15-08-33.397139.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_12_08T15_08_33.397139
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-08T15-08-33.397139.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-08T15-08-33.397139.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_12_08T15_08_33.397139
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-08T15-08-33.397139.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-08T15-08-33.397139.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_12_08T15_08_33.397139
path:
- '**/details_harness|hendrycksTest-management|5_2023-12-08T15-08-33.397139.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-12-08T15-08-33.397139.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_12_08T15_08_33.397139
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-12-08T15-08-33.397139.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-12-08T15-08-33.397139.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_12_08T15_08_33.397139
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-08T15-08-33.397139.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-08T15-08-33.397139.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_12_08T15_08_33.397139
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-08T15-08-33.397139.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-08T15-08-33.397139.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_12_08T15_08_33.397139
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-08T15-08-33.397139.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-08T15-08-33.397139.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_12_08T15_08_33.397139
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-08T15-08-33.397139.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-08T15-08-33.397139.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_12_08T15_08_33.397139
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-08T15-08-33.397139.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-08T15-08-33.397139.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_12_08T15_08_33.397139
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-08T15-08-33.397139.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-08T15-08-33.397139.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_12_08T15_08_33.397139
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-08T15-08-33.397139.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-08T15-08-33.397139.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_12_08T15_08_33.397139
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-08T15-08-33.397139.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-08T15-08-33.397139.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_12_08T15_08_33.397139
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-08T15-08-33.397139.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-08T15-08-33.397139.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_12_08T15_08_33.397139
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-08T15-08-33.397139.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-08T15-08-33.397139.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_12_08T15_08_33.397139
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-08T15-08-33.397139.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-08T15-08-33.397139.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_12_08T15_08_33.397139
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-08T15-08-33.397139.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-08T15-08-33.397139.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_12_08T15_08_33.397139
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-08T15-08-33.397139.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-08T15-08-33.397139.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_12_08T15_08_33.397139
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-12-08T15-08-33.397139.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-12-08T15-08-33.397139.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_12_08T15_08_33.397139
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-08T15-08-33.397139.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-08T15-08-33.397139.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_12_08T15_08_33.397139
path:
- '**/details_harness|hendrycksTest-virology|5_2023-12-08T15-08-33.397139.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-12-08T15-08-33.397139.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_12_08T15_08_33.397139
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-08T15-08-33.397139.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-08T15-08-33.397139.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_12_08T15_08_33.397139
path:
- '**/details_harness|truthfulqa:mc|0_2023-12-08T15-08-33.397139.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-12-08T15-08-33.397139.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_12_08T15_08_33.397139
path:
- '**/details_harness|winogrande|5_2023-12-08T15-08-33.397139.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-12-08T15-08-33.397139.parquet'
- config_name: results
data_files:
- split: 2023_12_08T15_08_33.397139
path:
- results_2023-12-08T15-08-33.397139.parquet
- split: latest
path:
- results_2023-12-08T15-08-33.397139.parquet
---
# Dataset Card for Evaluation run of deepseek-ai/deepseek-llm-67b-base
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/deepseek-ai/deepseek-llm-67b-base
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [deepseek-ai/deepseek-llm-67b-base](https://huggingface.co/deepseek-ai/deepseek-llm-67b-base) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the most recent results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_deepseek-ai__deepseek-llm-67b-base",
"harness_winogrande_5",
split="train")
```
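Because the run splits are named with zero-padded timestamps, lexicographic order matches chronological order, so the most recent run can also be selected programmatically from the split names. A small sketch (the second split name below is hypothetical, for illustration):

```python
from datetime import datetime

# Timestamped run splits (excluding the literal "latest" alias);
# the first name is from this run, the second is a hypothetical earlier run.
splits = ["2023_12_08T15_08_33.397139", "2023_11_01T09_00_00.000000"]

# Zero-padded fields make lexicographic order equal chronological order.
latest = max(splits)

# The split name parses back into a datetime when needed.
run_time = datetime.strptime(latest, "%Y_%m_%dT%H_%M_%S.%f")
print(latest, "->", run_time.isoformat())
```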
## Latest results
These are the [latest results from run 2023-12-08T15:08:33.397139](https://huggingface.co/datasets/open-llm-leaderboard/details_deepseek-ai__deepseek-llm-67b-base/blob/main/results_2023-12-08T15-08-33.397139.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.7152016597064887,
"acc_stderr": 0.029610855644222223,
"acc_norm": 0.7193168663591899,
"acc_norm_stderr": 0.030182859586413653,
"mc1": 0.3525091799265606,
"mc1_stderr": 0.016724646380756544,
"mc2": 0.5108013665291756,
"mc2_stderr": 0.014538753767819627
},
"harness|arc:challenge|25": {
"acc": 0.6262798634812287,
"acc_stderr": 0.014137708601759096,
"acc_norm": 0.6544368600682594,
"acc_norm_stderr": 0.01389693846114568
},
"harness|hellaswag|10": {
"acc": 0.6783509261103365,
"acc_stderr": 0.004661544991583034,
"acc_norm": 0.8710416251742681,
"acc_norm_stderr": 0.003344689038650325
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.38,
"acc_stderr": 0.048783173121456316,
"acc_norm": 0.38,
"acc_norm_stderr": 0.048783173121456316
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6592592592592592,
"acc_stderr": 0.040943762699967926,
"acc_norm": 0.6592592592592592,
"acc_norm_stderr": 0.040943762699967926
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.8092105263157895,
"acc_stderr": 0.03197565821032499,
"acc_norm": 0.8092105263157895,
"acc_norm_stderr": 0.03197565821032499
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.83,
"acc_stderr": 0.0377525168068637,
"acc_norm": 0.83,
"acc_norm_stderr": 0.0377525168068637
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.7811320754716982,
"acc_stderr": 0.025447863825108614,
"acc_norm": 0.7811320754716982,
"acc_norm_stderr": 0.025447863825108614
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.8472222222222222,
"acc_stderr": 0.030085743248565677,
"acc_norm": 0.8472222222222222,
"acc_norm_stderr": 0.030085743248565677
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.5,
"acc_stderr": 0.050251890762960605,
"acc_norm": 0.5,
"acc_norm_stderr": 0.050251890762960605
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.6,
"acc_stderr": 0.049236596391733084,
"acc_norm": 0.6,
"acc_norm_stderr": 0.049236596391733084
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.42,
"acc_stderr": 0.049604496374885836,
"acc_norm": 0.42,
"acc_norm_stderr": 0.049604496374885836
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.7341040462427746,
"acc_stderr": 0.03368762932259431,
"acc_norm": 0.7341040462427746,
"acc_norm_stderr": 0.03368762932259431
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.43137254901960786,
"acc_stderr": 0.04928099597287534,
"acc_norm": 0.43137254901960786,
"acc_norm_stderr": 0.04928099597287534
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.75,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.75,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.7319148936170212,
"acc_stderr": 0.028957342788342343,
"acc_norm": 0.7319148936170212,
"acc_norm_stderr": 0.028957342788342343
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.5175438596491229,
"acc_stderr": 0.04700708033551038,
"acc_norm": 0.5175438596491229,
"acc_norm_stderr": 0.04700708033551038
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.6827586206896552,
"acc_stderr": 0.03878352372138622,
"acc_norm": 0.6827586206896552,
"acc_norm_stderr": 0.03878352372138622
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.5026455026455027,
"acc_stderr": 0.025750949678130387,
"acc_norm": 0.5026455026455027,
"acc_norm_stderr": 0.025750949678130387
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.5317460317460317,
"acc_stderr": 0.04463112720677173,
"acc_norm": 0.5317460317460317,
"acc_norm_stderr": 0.04463112720677173
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.44,
"acc_stderr": 0.04988876515698589,
"acc_norm": 0.44,
"acc_norm_stderr": 0.04988876515698589
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.8258064516129032,
"acc_stderr": 0.021576248184514587,
"acc_norm": 0.8258064516129032,
"acc_norm_stderr": 0.021576248184514587
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5714285714285714,
"acc_stderr": 0.034819048444388045,
"acc_norm": 0.5714285714285714,
"acc_norm_stderr": 0.034819048444388045
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.74,
"acc_stderr": 0.044084400227680794,
"acc_norm": 0.74,
"acc_norm_stderr": 0.044084400227680794
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.8181818181818182,
"acc_stderr": 0.030117688929503582,
"acc_norm": 0.8181818181818182,
"acc_norm_stderr": 0.030117688929503582
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.898989898989899,
"acc_stderr": 0.02146973557605533,
"acc_norm": 0.898989898989899,
"acc_norm_stderr": 0.02146973557605533
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.9844559585492227,
"acc_stderr": 0.008927492715084313,
"acc_norm": 0.9844559585492227,
"acc_norm_stderr": 0.008927492715084313
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.7307692307692307,
"acc_stderr": 0.02248938979365483,
"acc_norm": 0.7307692307692307,
"acc_norm_stderr": 0.02248938979365483
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.37777777777777777,
"acc_stderr": 0.029560707392465715,
"acc_norm": 0.37777777777777777,
"acc_norm_stderr": 0.029560707392465715
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.8235294117647058,
"acc_stderr": 0.024762902678057922,
"acc_norm": 0.8235294117647058,
"acc_norm_stderr": 0.024762902678057922
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.4370860927152318,
"acc_stderr": 0.04050035722230636,
"acc_norm": 0.4370860927152318,
"acc_norm_stderr": 0.04050035722230636
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.9045871559633027,
"acc_stderr": 0.012595899282335812,
"acc_norm": 0.9045871559633027,
"acc_norm_stderr": 0.012595899282335812
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.6435185185185185,
"acc_stderr": 0.032664783315272714,
"acc_norm": 0.6435185185185185,
"acc_norm_stderr": 0.032664783315272714
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.9166666666666666,
"acc_stderr": 0.019398452135813905,
"acc_norm": 0.9166666666666666,
"acc_norm_stderr": 0.019398452135813905
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.9156118143459916,
"acc_stderr": 0.01809424711647333,
"acc_norm": 0.9156118143459916,
"acc_norm_stderr": 0.01809424711647333
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.7937219730941704,
"acc_stderr": 0.02715715047956382,
"acc_norm": 0.7937219730941704,
"acc_norm_stderr": 0.02715715047956382
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.8244274809160306,
"acc_stderr": 0.03336820338476074,
"acc_norm": 0.8244274809160306,
"acc_norm_stderr": 0.03336820338476074
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.8512396694214877,
"acc_stderr": 0.03248470083807193,
"acc_norm": 0.8512396694214877,
"acc_norm_stderr": 0.03248470083807193
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.8333333333333334,
"acc_stderr": 0.03602814176392645,
"acc_norm": 0.8333333333333334,
"acc_norm_stderr": 0.03602814176392645
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.8343558282208589,
"acc_stderr": 0.029208296231259104,
"acc_norm": 0.8343558282208589,
"acc_norm_stderr": 0.029208296231259104
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.5,
"acc_stderr": 0.04745789978762494,
"acc_norm": 0.5,
"acc_norm_stderr": 0.04745789978762494
},
"harness|hendrycksTest-management|5": {
"acc": 0.9029126213592233,
"acc_stderr": 0.02931596291881348,
"acc_norm": 0.9029126213592233,
"acc_norm_stderr": 0.02931596291881348
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.9316239316239316,
"acc_stderr": 0.016534627684311368,
"acc_norm": 0.9316239316239316,
"acc_norm_stderr": 0.016534627684311368
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.73,
"acc_stderr": 0.044619604333847394,
"acc_norm": 0.73,
"acc_norm_stderr": 0.044619604333847394
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.9003831417624522,
"acc_stderr": 0.010709685591251671,
"acc_norm": 0.9003831417624522,
"acc_norm_stderr": 0.010709685591251671
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7630057803468208,
"acc_stderr": 0.02289408248992599,
"acc_norm": 0.7630057803468208,
"acc_norm_stderr": 0.02289408248992599
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.4491620111731844,
"acc_stderr": 0.01663583834163192,
"acc_norm": 0.4491620111731844,
"acc_norm_stderr": 0.01663583834163192
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7875816993464052,
"acc_stderr": 0.02342037547829613,
"acc_norm": 0.7875816993464052,
"acc_norm_stderr": 0.02342037547829613
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.8135048231511254,
"acc_stderr": 0.02212243977248077,
"acc_norm": 0.8135048231511254,
"acc_norm_stderr": 0.02212243977248077
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.8549382716049383,
"acc_stderr": 0.01959487701972795,
"acc_norm": 0.8549382716049383,
"acc_norm_stderr": 0.01959487701972795
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.5567375886524822,
"acc_stderr": 0.029634838473766006,
"acc_norm": 0.5567375886524822,
"acc_norm_stderr": 0.029634838473766006
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.560625814863103,
"acc_stderr": 0.012676014778580219,
"acc_norm": 0.560625814863103,
"acc_norm_stderr": 0.012676014778580219
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.7757352941176471,
"acc_stderr": 0.025336848563332372,
"acc_norm": 0.7757352941176471,
"acc_norm_stderr": 0.025336848563332372
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.8022875816993464,
"acc_stderr": 0.016112443369726736,
"acc_norm": 0.8022875816993464,
"acc_norm_stderr": 0.016112443369726736
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.7181818181818181,
"acc_stderr": 0.043091187099464585,
"acc_norm": 0.7181818181818181,
"acc_norm_stderr": 0.043091187099464585
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7387755102040816,
"acc_stderr": 0.02812342933514279,
"acc_norm": 0.7387755102040816,
"acc_norm_stderr": 0.02812342933514279
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.900497512437811,
"acc_stderr": 0.0211662163046594,
"acc_norm": 0.900497512437811,
"acc_norm_stderr": 0.0211662163046594
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.94,
"acc_stderr": 0.02386832565759419,
"acc_norm": 0.94,
"acc_norm_stderr": 0.02386832565759419
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5542168674698795,
"acc_stderr": 0.038695433234721015,
"acc_norm": 0.5542168674698795,
"acc_norm_stderr": 0.038695433234721015
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8596491228070176,
"acc_stderr": 0.0266405825391332,
"acc_norm": 0.8596491228070176,
"acc_norm_stderr": 0.0266405825391332
},
"harness|truthfulqa:mc|0": {
"mc1": 0.3525091799265606,
"mc1_stderr": 0.016724646380756544,
"mc2": 0.5108013665291756,
"mc2_stderr": 0.014538753767819627
},
"harness|winogrande|5": {
"acc": 0.8413575374901342,
"acc_stderr": 0.01026793624302822
},
"harness|gsm8k|5": {
"acc": 0.5670962850644428,
"acc_stderr": 0.013647916362576052
}
}
```
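As a quick sanity check on these numbers, the reported standard errors are consistent with the binomial sample standard error sqrt(p * (1 - p) / (n - 1)). For example, using the Winogrande accuracy above and assuming its usual evaluation set of 1267 examples (the n - 1 denominator is inferred here from matching the reported value, not documented in the card):

```python
import math

# Winogrande numbers as reported in the JSON above.
acc = 0.8413575374901342
reported_stderr = 0.01026793624302822

# Assumed eval set size; the n - 1 denominator is an inference, not documented.
n = 1267
stderr = math.sqrt(acc * (1 - acc) / (n - 1))

print(stderr)  # close to reported_stderr
```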
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed]
Indic-Benchmark/malayalam-arc-c-2.5k
---
dataset_info:
features:
- name: id
dtype: string
- name: question
struct:
- name: choices
list:
- name: label
dtype: string
- name: text
dtype: string
- name: stem
dtype: string
- name: answerKey
dtype: string
splits:
- name: train
num_bytes: 2289488
num_examples: 2536
download_size: 827971
dataset_size: 2289488
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
open-llm-leaderboard/details_wang7776__Llama-2-7b-chat-hf-30-attention-sparsity
---
pretty_name: Evaluation run of wang7776/Llama-2-7b-chat-hf-30-attention-sparsity
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [wang7776/Llama-2-7b-chat-hf-30-attention-sparsity](https://huggingface.co/wang7776/Llama-2-7b-chat-hf-30-attention-sparsity)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configuration, each one coresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_wang7776__Llama-2-7b-chat-hf-30-attention-sparsity\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-01-26T21:28:33.090458](https://huggingface.co/datasets/open-llm-leaderboard/details_wang7776__Llama-2-7b-chat-hf-30-attention-sparsity/blob/main/results_2024-01-26T21-28-33.090458.json)(note\
\ that their might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.47171176555273875,\n\
\ \"acc_stderr\": 0.03427847553885065,\n \"acc_norm\": 0.4765170024786929,\n\
\ \"acc_norm_stderr\": 0.03503452138702699,\n \"mc1\": 0.2913096695226438,\n\
\ \"mc1_stderr\": 0.01590598704818483,\n \"mc2\": 0.450180283055029,\n\
\ \"mc2_stderr\": 0.015612058311126043\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.4974402730375427,\n \"acc_stderr\": 0.014611199329843784,\n\
\ \"acc_norm\": 0.5341296928327645,\n \"acc_norm_stderr\": 0.014577311315231102\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.5805616411073491,\n\
\ \"acc_stderr\": 0.004924586362301656,\n \"acc_norm\": 0.76867157936666,\n\
\ \"acc_norm_stderr\": 0.004208200511232451\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.3,\n \"acc_stderr\": 0.04605661864718381,\n \
\ \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.04605661864718381\n },\n\
\ \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.4148148148148148,\n\
\ \"acc_stderr\": 0.042561937679014075,\n \"acc_norm\": 0.4148148148148148,\n\
\ \"acc_norm_stderr\": 0.042561937679014075\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.48026315789473684,\n \"acc_stderr\": 0.040657710025626036,\n\
\ \"acc_norm\": 0.48026315789473684,\n \"acc_norm_stderr\": 0.040657710025626036\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.47,\n\
\ \"acc_stderr\": 0.05016135580465919,\n \"acc_norm\": 0.47,\n \
\ \"acc_norm_stderr\": 0.05016135580465919\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.4867924528301887,\n \"acc_stderr\": 0.030762134874500476,\n\
\ \"acc_norm\": 0.4867924528301887,\n \"acc_norm_stderr\": 0.030762134874500476\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.4722222222222222,\n\
\ \"acc_stderr\": 0.04174752578923185,\n \"acc_norm\": 0.4722222222222222,\n\
\ \"acc_norm_stderr\": 0.04174752578923185\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \
\ \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.4,\n \"acc_stderr\": 0.049236596391733084,\n \"acc_norm\": 0.4,\n\
\ \"acc_norm_stderr\": 0.049236596391733084\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.33,\n \"acc_stderr\": 0.047258156262526045,\n \
\ \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.047258156262526045\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.34104046242774566,\n\
\ \"acc_stderr\": 0.036146654241808254,\n \"acc_norm\": 0.34104046242774566,\n\
\ \"acc_norm_stderr\": 0.036146654241808254\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.20588235294117646,\n \"acc_stderr\": 0.04023382273617746,\n\
\ \"acc_norm\": 0.20588235294117646,\n \"acc_norm_stderr\": 0.04023382273617746\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.57,\n \"acc_stderr\": 0.049756985195624284,\n \"acc_norm\": 0.57,\n\
\ \"acc_norm_stderr\": 0.049756985195624284\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.39574468085106385,\n \"acc_stderr\": 0.03196758697835362,\n\
\ \"acc_norm\": 0.39574468085106385,\n \"acc_norm_stderr\": 0.03196758697835362\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.35964912280701755,\n\
\ \"acc_stderr\": 0.045144961328736334,\n \"acc_norm\": 0.35964912280701755,\n\
\ \"acc_norm_stderr\": 0.045144961328736334\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.47586206896551725,\n \"acc_stderr\": 0.0416180850350153,\n\
\ \"acc_norm\": 0.47586206896551725,\n \"acc_norm_stderr\": 0.0416180850350153\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.291005291005291,\n \"acc_stderr\": 0.02339382650048487,\n \"acc_norm\"\
: 0.291005291005291,\n \"acc_norm_stderr\": 0.02339382650048487\n },\n\
\ \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.24603174603174602,\n\
\ \"acc_stderr\": 0.03852273364924316,\n \"acc_norm\": 0.24603174603174602,\n\
\ \"acc_norm_stderr\": 0.03852273364924316\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695235,\n \
\ \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695235\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.5258064516129032,\n\
\ \"acc_stderr\": 0.028406095057653326,\n \"acc_norm\": 0.5258064516129032,\n\
\ \"acc_norm_stderr\": 0.028406095057653326\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.3448275862068966,\n \"acc_stderr\": 0.03344283744280458,\n\
\ \"acc_norm\": 0.3448275862068966,\n \"acc_norm_stderr\": 0.03344283744280458\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.39,\n \"acc_stderr\": 0.04902071300001974,\n \"acc_norm\"\
: 0.39,\n \"acc_norm_stderr\": 0.04902071300001974\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.6060606060606061,\n \"acc_stderr\": 0.0381549430868893,\n\
\ \"acc_norm\": 0.6060606060606061,\n \"acc_norm_stderr\": 0.0381549430868893\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.5808080808080808,\n \"acc_stderr\": 0.03515520728670417,\n \"\
acc_norm\": 0.5808080808080808,\n \"acc_norm_stderr\": 0.03515520728670417\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.6839378238341969,\n \"acc_stderr\": 0.033553973696861736,\n\
\ \"acc_norm\": 0.6839378238341969,\n \"acc_norm_stderr\": 0.033553973696861736\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.4025641025641026,\n \"acc_stderr\": 0.024864995159767752,\n\
\ \"acc_norm\": 0.4025641025641026,\n \"acc_norm_stderr\": 0.024864995159767752\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.27037037037037037,\n \"acc_stderr\": 0.02708037281514566,\n \
\ \"acc_norm\": 0.27037037037037037,\n \"acc_norm_stderr\": 0.02708037281514566\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.40336134453781514,\n \"acc_stderr\": 0.031866081214088314,\n\
\ \"acc_norm\": 0.40336134453781514,\n \"acc_norm_stderr\": 0.031866081214088314\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.2913907284768212,\n \"acc_stderr\": 0.037101857261199946,\n \"\
acc_norm\": 0.2913907284768212,\n \"acc_norm_stderr\": 0.037101857261199946\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.6605504587155964,\n \"acc_stderr\": 0.02030210934266235,\n \"\
acc_norm\": 0.6605504587155964,\n \"acc_norm_stderr\": 0.02030210934266235\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.28703703703703703,\n \"acc_stderr\": 0.030851992993257013,\n \"\
acc_norm\": 0.28703703703703703,\n \"acc_norm_stderr\": 0.030851992993257013\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.6225490196078431,\n \"acc_stderr\": 0.03402272044340705,\n \"\
acc_norm\": 0.6225490196078431,\n \"acc_norm_stderr\": 0.03402272044340705\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.620253164556962,\n \"acc_stderr\": 0.03159188752965851,\n \
\ \"acc_norm\": 0.620253164556962,\n \"acc_norm_stderr\": 0.03159188752965851\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.5874439461883408,\n\
\ \"acc_stderr\": 0.03304062175449297,\n \"acc_norm\": 0.5874439461883408,\n\
\ \"acc_norm_stderr\": 0.03304062175449297\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.5267175572519084,\n \"acc_stderr\": 0.04379024936553894,\n\
\ \"acc_norm\": 0.5267175572519084,\n \"acc_norm_stderr\": 0.04379024936553894\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.6198347107438017,\n \"acc_stderr\": 0.04431324501968431,\n \"\
acc_norm\": 0.6198347107438017,\n \"acc_norm_stderr\": 0.04431324501968431\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.5277777777777778,\n\
\ \"acc_stderr\": 0.048262172941398944,\n \"acc_norm\": 0.5277777777777778,\n\
\ \"acc_norm_stderr\": 0.048262172941398944\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.5276073619631901,\n \"acc_stderr\": 0.03922378290610991,\n\
\ \"acc_norm\": 0.5276073619631901,\n \"acc_norm_stderr\": 0.03922378290610991\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.375,\n\
\ \"acc_stderr\": 0.04595091388086298,\n \"acc_norm\": 0.375,\n \
\ \"acc_norm_stderr\": 0.04595091388086298\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7087378640776699,\n \"acc_stderr\": 0.044986763205729224,\n\
\ \"acc_norm\": 0.7087378640776699,\n \"acc_norm_stderr\": 0.044986763205729224\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.7094017094017094,\n\
\ \"acc_stderr\": 0.029745048572674074,\n \"acc_norm\": 0.7094017094017094,\n\
\ \"acc_norm_stderr\": 0.029745048572674074\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.49,\n \"acc_stderr\": 0.05024183937956911,\n \
\ \"acc_norm\": 0.49,\n \"acc_norm_stderr\": 0.05024183937956911\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.6666666666666666,\n\
\ \"acc_stderr\": 0.01685739124747255,\n \"acc_norm\": 0.6666666666666666,\n\
\ \"acc_norm_stderr\": 0.01685739124747255\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.5086705202312138,\n \"acc_stderr\": 0.02691504735536981,\n\
\ \"acc_norm\": 0.5086705202312138,\n \"acc_norm_stderr\": 0.02691504735536981\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.2424581005586592,\n\
\ \"acc_stderr\": 0.014333522059217889,\n \"acc_norm\": 0.2424581005586592,\n\
\ \"acc_norm_stderr\": 0.014333522059217889\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.5065359477124183,\n \"acc_stderr\": 0.028627470550556054,\n\
\ \"acc_norm\": 0.5065359477124183,\n \"acc_norm_stderr\": 0.028627470550556054\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.5498392282958199,\n\
\ \"acc_stderr\": 0.02825666072336018,\n \"acc_norm\": 0.5498392282958199,\n\
\ \"acc_norm_stderr\": 0.02825666072336018\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.5679012345679012,\n \"acc_stderr\": 0.027563010971606672,\n\
\ \"acc_norm\": 0.5679012345679012,\n \"acc_norm_stderr\": 0.027563010971606672\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.36879432624113473,\n \"acc_stderr\": 0.028782227561347243,\n \
\ \"acc_norm\": 0.36879432624113473,\n \"acc_norm_stderr\": 0.028782227561347243\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.35267275097783574,\n\
\ \"acc_stderr\": 0.012203286846053886,\n \"acc_norm\": 0.35267275097783574,\n\
\ \"acc_norm_stderr\": 0.012203286846053886\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.40441176470588236,\n \"acc_stderr\": 0.029812630701569736,\n\
\ \"acc_norm\": 0.40441176470588236,\n \"acc_norm_stderr\": 0.029812630701569736\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.46895424836601307,\n \"acc_stderr\": 0.020188804456361883,\n \
\ \"acc_norm\": 0.46895424836601307,\n \"acc_norm_stderr\": 0.020188804456361883\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.5,\n\
\ \"acc_stderr\": 0.04789131426105757,\n \"acc_norm\": 0.5,\n \
\ \"acc_norm_stderr\": 0.04789131426105757\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.49387755102040815,\n \"acc_stderr\": 0.03200682020163908,\n\
\ \"acc_norm\": 0.49387755102040815,\n \"acc_norm_stderr\": 0.03200682020163908\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.6567164179104478,\n\
\ \"acc_stderr\": 0.03357379665433431,\n \"acc_norm\": 0.6567164179104478,\n\
\ \"acc_norm_stderr\": 0.03357379665433431\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.73,\n \"acc_stderr\": 0.044619604333847394,\n \
\ \"acc_norm\": 0.73,\n \"acc_norm_stderr\": 0.044619604333847394\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.42771084337349397,\n\
\ \"acc_stderr\": 0.03851597683718534,\n \"acc_norm\": 0.42771084337349397,\n\
\ \"acc_norm_stderr\": 0.03851597683718534\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.7251461988304093,\n \"acc_stderr\": 0.03424042924691583,\n\
\ \"acc_norm\": 0.7251461988304093,\n \"acc_norm_stderr\": 0.03424042924691583\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.2913096695226438,\n\
\ \"mc1_stderr\": 0.01590598704818483,\n \"mc2\": 0.450180283055029,\n\
\ \"mc2_stderr\": 0.015612058311126043\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7103393843725335,\n \"acc_stderr\": 0.012748550807638261\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.17437452615617893,\n \
\ \"acc_stderr\": 0.010451421361976233\n }\n}\n```"
repo_url: https://huggingface.co/wang7776/Llama-2-7b-chat-hf-30-attention-sparsity
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_01_26T21_28_33.090458
path:
- '**/details_harness|arc:challenge|25_2024-01-26T21-28-33.090458.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-01-26T21-28-33.090458.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_01_26T21_28_33.090458
path:
- '**/details_harness|gsm8k|5_2024-01-26T21-28-33.090458.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-01-26T21-28-33.090458.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_01_26T21_28_33.090458
path:
- '**/details_harness|hellaswag|10_2024-01-26T21-28-33.090458.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-01-26T21-28-33.090458.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_01_26T21_28_33.090458
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-26T21-28-33.090458.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-26T21-28-33.090458.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-26T21-28-33.090458.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-26T21-28-33.090458.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-26T21-28-33.090458.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-26T21-28-33.090458.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-26T21-28-33.090458.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-26T21-28-33.090458.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-26T21-28-33.090458.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-26T21-28-33.090458.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-26T21-28-33.090458.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-26T21-28-33.090458.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-26T21-28-33.090458.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-26T21-28-33.090458.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-26T21-28-33.090458.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-26T21-28-33.090458.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-26T21-28-33.090458.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-26T21-28-33.090458.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-26T21-28-33.090458.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-26T21-28-33.090458.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-26T21-28-33.090458.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-26T21-28-33.090458.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-26T21-28-33.090458.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-26T21-28-33.090458.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-26T21-28-33.090458.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-26T21-28-33.090458.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-26T21-28-33.090458.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-26T21-28-33.090458.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-26T21-28-33.090458.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-26T21-28-33.090458.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-26T21-28-33.090458.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-26T21-28-33.090458.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-26T21-28-33.090458.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-26T21-28-33.090458.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-01-26T21-28-33.090458.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-26T21-28-33.090458.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-26T21-28-33.090458.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-26T21-28-33.090458.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-01-26T21-28-33.090458.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-01-26T21-28-33.090458.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-26T21-28-33.090458.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-26T21-28-33.090458.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-26T21-28-33.090458.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-26T21-28-33.090458.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-26T21-28-33.090458.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-26T21-28-33.090458.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-26T21-28-33.090458.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-26T21-28-33.090458.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-26T21-28-33.090458.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-26T21-28-33.090458.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-26T21-28-33.090458.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-26T21-28-33.090458.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-26T21-28-33.090458.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-01-26T21-28-33.090458.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-26T21-28-33.090458.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-01-26T21-28-33.090458.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-26T21-28-33.090458.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-26T21-28-33.090458.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-26T21-28-33.090458.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-26T21-28-33.090458.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-26T21-28-33.090458.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-26T21-28-33.090458.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-26T21-28-33.090458.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-26T21-28-33.090458.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-26T21-28-33.090458.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-26T21-28-33.090458.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-26T21-28-33.090458.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-26T21-28-33.090458.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-26T21-28-33.090458.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-26T21-28-33.090458.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-26T21-28-33.090458.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-26T21-28-33.090458.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-26T21-28-33.090458.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-26T21-28-33.090458.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-26T21-28-33.090458.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-26T21-28-33.090458.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-26T21-28-33.090458.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-26T21-28-33.090458.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-26T21-28-33.090458.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-26T21-28-33.090458.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-26T21-28-33.090458.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-26T21-28-33.090458.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-26T21-28-33.090458.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-26T21-28-33.090458.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-26T21-28-33.090458.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-26T21-28-33.090458.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-26T21-28-33.090458.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-26T21-28-33.090458.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-26T21-28-33.090458.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-26T21-28-33.090458.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-26T21-28-33.090458.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-01-26T21-28-33.090458.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-26T21-28-33.090458.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-26T21-28-33.090458.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-26T21-28-33.090458.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-01-26T21-28-33.090458.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-01-26T21-28-33.090458.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-26T21-28-33.090458.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-26T21-28-33.090458.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-26T21-28-33.090458.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-26T21-28-33.090458.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-26T21-28-33.090458.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-26T21-28-33.090458.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-26T21-28-33.090458.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-26T21-28-33.090458.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-26T21-28-33.090458.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-26T21-28-33.090458.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-26T21-28-33.090458.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-26T21-28-33.090458.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-26T21-28-33.090458.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-01-26T21-28-33.090458.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-26T21-28-33.090458.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-01-26T21-28-33.090458.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-26T21-28-33.090458.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_01_26T21_28_33.090458
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-26T21-28-33.090458.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-26T21-28-33.090458.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_01_26T21_28_33.090458
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-26T21-28-33.090458.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-26T21-28-33.090458.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_01_26T21_28_33.090458
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-26T21-28-33.090458.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-26T21-28-33.090458.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_01_26T21_28_33.090458
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-26T21-28-33.090458.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-26T21-28-33.090458.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_01_26T21_28_33.090458
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-26T21-28-33.090458.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-26T21-28-33.090458.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_01_26T21_28_33.090458
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-26T21-28-33.090458.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-26T21-28-33.090458.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_01_26T21_28_33.090458
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-26T21-28-33.090458.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-26T21-28-33.090458.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_01_26T21_28_33.090458
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-26T21-28-33.090458.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-26T21-28-33.090458.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_01_26T21_28_33.090458
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-26T21-28-33.090458.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-26T21-28-33.090458.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_01_26T21_28_33.090458
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-26T21-28-33.090458.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-26T21-28-33.090458.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_01_26T21_28_33.090458
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-26T21-28-33.090458.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-26T21-28-33.090458.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_01_26T21_28_33.090458
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-26T21-28-33.090458.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-26T21-28-33.090458.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_01_26T21_28_33.090458
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-26T21-28-33.090458.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-26T21-28-33.090458.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_01_26T21_28_33.090458
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-26T21-28-33.090458.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-26T21-28-33.090458.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_01_26T21_28_33.090458
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-26T21-28-33.090458.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-26T21-28-33.090458.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_01_26T21_28_33.090458
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-26T21-28-33.090458.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-26T21-28-33.090458.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_01_26T21_28_33.090458
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-26T21-28-33.090458.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-26T21-28-33.090458.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_01_26T21_28_33.090458
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-26T21-28-33.090458.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-26T21-28-33.090458.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_01_26T21_28_33.090458
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-26T21-28-33.090458.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-26T21-28-33.090458.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_01_26T21_28_33.090458
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-26T21-28-33.090458.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-26T21-28-33.090458.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_01_26T21_28_33.090458
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-26T21-28-33.090458.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-26T21-28-33.090458.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_01_26T21_28_33.090458
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-26T21-28-33.090458.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-26T21-28-33.090458.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_01_26T21_28_33.090458
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-26T21-28-33.090458.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-26T21-28-33.090458.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_01_26T21_28_33.090458
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-26T21-28-33.090458.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-26T21-28-33.090458.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_01_26T21_28_33.090458
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-26T21-28-33.090458.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-26T21-28-33.090458.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_01_26T21_28_33.090458
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-26T21-28-33.090458.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-26T21-28-33.090458.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_01_26T21_28_33.090458
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-26T21-28-33.090458.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-26T21-28-33.090458.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_01_26T21_28_33.090458
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-26T21-28-33.090458.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-26T21-28-33.090458.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_01_26T21_28_33.090458
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-26T21-28-33.090458.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-26T21-28-33.090458.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_01_26T21_28_33.090458
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-26T21-28-33.090458.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-26T21-28-33.090458.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_01_26T21_28_33.090458
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-26T21-28-33.090458.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-26T21-28-33.090458.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_01_26T21_28_33.090458
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-26T21-28-33.090458.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-26T21-28-33.090458.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_01_26T21_28_33.090458
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-26T21-28-33.090458.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-26T21-28-33.090458.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_01_26T21_28_33.090458
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-26T21-28-33.090458.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-26T21-28-33.090458.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_01_26T21_28_33.090458
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-01-26T21-28-33.090458.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-01-26T21-28-33.090458.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_01_26T21_28_33.090458
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-26T21-28-33.090458.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-26T21-28-33.090458.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_01_26T21_28_33.090458
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-26T21-28-33.090458.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-26T21-28-33.090458.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_01_26T21_28_33.090458
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-26T21-28-33.090458.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-26T21-28-33.090458.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_01_26T21_28_33.090458
path:
- '**/details_harness|hendrycksTest-management|5_2024-01-26T21-28-33.090458.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-01-26T21-28-33.090458.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_01_26T21_28_33.090458
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-01-26T21-28-33.090458.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-01-26T21-28-33.090458.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_01_26T21_28_33.090458
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-26T21-28-33.090458.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-26T21-28-33.090458.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_01_26T21_28_33.090458
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-26T21-28-33.090458.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-26T21-28-33.090458.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_01_26T21_28_33.090458
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-26T21-28-33.090458.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-26T21-28-33.090458.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_01_26T21_28_33.090458
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-26T21-28-33.090458.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-26T21-28-33.090458.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_01_26T21_28_33.090458
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-26T21-28-33.090458.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-26T21-28-33.090458.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_01_26T21_28_33.090458
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-26T21-28-33.090458.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-26T21-28-33.090458.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_01_26T21_28_33.090458
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-26T21-28-33.090458.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-26T21-28-33.090458.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_01_26T21_28_33.090458
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-26T21-28-33.090458.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-26T21-28-33.090458.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_01_26T21_28_33.090458
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-26T21-28-33.090458.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-26T21-28-33.090458.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_01_26T21_28_33.090458
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-26T21-28-33.090458.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-26T21-28-33.090458.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_01_26T21_28_33.090458
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-26T21-28-33.090458.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-26T21-28-33.090458.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_01_26T21_28_33.090458
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-26T21-28-33.090458.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-26T21-28-33.090458.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_01_26T21_28_33.090458
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-26T21-28-33.090458.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-26T21-28-33.090458.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_01_26T21_28_33.090458
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-01-26T21-28-33.090458.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-01-26T21-28-33.090458.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_01_26T21_28_33.090458
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-26T21-28-33.090458.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-26T21-28-33.090458.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_01_26T21_28_33.090458
path:
- '**/details_harness|hendrycksTest-virology|5_2024-01-26T21-28-33.090458.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-01-26T21-28-33.090458.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_01_26T21_28_33.090458
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-26T21-28-33.090458.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-26T21-28-33.090458.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_01_26T21_28_33.090458
path:
- '**/details_harness|truthfulqa:mc|0_2024-01-26T21-28-33.090458.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-01-26T21-28-33.090458.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_01_26T21_28_33.090458
path:
- '**/details_harness|winogrande|5_2024-01-26T21-28-33.090458.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-01-26T21-28-33.090458.parquet'
- config_name: results
data_files:
- split: 2024_01_26T21_28_33.090458
path:
- results_2024-01-26T21-28-33.090458.parquet
- split: latest
path:
- results_2024-01-26T21-28-33.090458.parquet
---
# Dataset Card for Evaluation run of wang7776/Llama-2-7b-chat-hf-30-attention-sparsity
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [wang7776/Llama-2-7b-chat-hf-30-attention-sparsity](https://huggingface.co/wang7776/Llama-2-7b-chat-hf-30-attention-sparsity) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_wang7776__Llama-2-7b-chat-hf-30-attention-sparsity",
"harness_winogrande_5",
split="train")
```
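Each `config_name` in the YAML header above is derived from the raw task identifier (e.g. `harness|hendrycksTest-anatomy|5`) by replacing the `|`, `:`, and `-` separators with underscores. A small helper (hypothetical, not part of this card) can sketch that mapping if you want to look up the config for a task name taken from the results JSON:

```python
def task_to_config(task: str) -> str:
    """Map a raw task identifier to its config_name.

    Assumes the convention visible in this card's YAML header:
    '|', ':' and '-' are all replaced with underscores.
    """
    for sep in "|:-":
        task = task.replace(sep, "_")
    return task


# Example: task_to_config("harness|truthfulqa:mc|0")
# yields "harness_truthfulqa_mc_0", matching the header above.
```

This is only a convenience for navigating the configurations; the authoritative list is the `config_name` entries in the YAML header.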
## Latest results
These are the [latest results from run 2024-01-26T21:28:33.090458](https://huggingface.co/datasets/open-llm-leaderboard/details_wang7776__Llama-2-7b-chat-hf-30-attention-sparsity/blob/main/results_2024-01-26T21-28-33.090458.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each one in the results and in the "latest" split for each eval):
```json
{
"all": {
"acc": 0.47171176555273875,
"acc_stderr": 0.03427847553885065,
"acc_norm": 0.4765170024786929,
"acc_norm_stderr": 0.03503452138702699,
"mc1": 0.2913096695226438,
"mc1_stderr": 0.01590598704818483,
"mc2": 0.450180283055029,
"mc2_stderr": 0.015612058311126043
},
"harness|arc:challenge|25": {
"acc": 0.4974402730375427,
"acc_stderr": 0.014611199329843784,
"acc_norm": 0.5341296928327645,
"acc_norm_stderr": 0.014577311315231102
},
"harness|hellaswag|10": {
"acc": 0.5805616411073491,
"acc_stderr": 0.004924586362301656,
"acc_norm": 0.76867157936666,
"acc_norm_stderr": 0.004208200511232451
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.3,
"acc_stderr": 0.04605661864718381,
"acc_norm": 0.3,
"acc_norm_stderr": 0.04605661864718381
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.4148148148148148,
"acc_stderr": 0.042561937679014075,
"acc_norm": 0.4148148148148148,
"acc_norm_stderr": 0.042561937679014075
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.48026315789473684,
"acc_stderr": 0.040657710025626036,
"acc_norm": 0.48026315789473684,
"acc_norm_stderr": 0.040657710025626036
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.47,
"acc_stderr": 0.05016135580465919,
"acc_norm": 0.47,
"acc_norm_stderr": 0.05016135580465919
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.4867924528301887,
"acc_stderr": 0.030762134874500476,
"acc_norm": 0.4867924528301887,
"acc_norm_stderr": 0.030762134874500476
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.4722222222222222,
"acc_stderr": 0.04174752578923185,
"acc_norm": 0.4722222222222222,
"acc_norm_stderr": 0.04174752578923185
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.4,
"acc_stderr": 0.049236596391733084,
"acc_norm": 0.4,
"acc_norm_stderr": 0.049236596391733084
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.33,
"acc_stderr": 0.047258156262526045,
"acc_norm": 0.33,
"acc_norm_stderr": 0.047258156262526045
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.34104046242774566,
"acc_stderr": 0.036146654241808254,
"acc_norm": 0.34104046242774566,
"acc_norm_stderr": 0.036146654241808254
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.20588235294117646,
"acc_stderr": 0.04023382273617746,
"acc_norm": 0.20588235294117646,
"acc_norm_stderr": 0.04023382273617746
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.57,
"acc_stderr": 0.049756985195624284,
"acc_norm": 0.57,
"acc_norm_stderr": 0.049756985195624284
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.39574468085106385,
"acc_stderr": 0.03196758697835362,
"acc_norm": 0.39574468085106385,
"acc_norm_stderr": 0.03196758697835362
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.35964912280701755,
"acc_stderr": 0.045144961328736334,
"acc_norm": 0.35964912280701755,
"acc_norm_stderr": 0.045144961328736334
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.47586206896551725,
"acc_stderr": 0.0416180850350153,
"acc_norm": 0.47586206896551725,
"acc_norm_stderr": 0.0416180850350153
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.291005291005291,
"acc_stderr": 0.02339382650048487,
"acc_norm": 0.291005291005291,
"acc_norm_stderr": 0.02339382650048487
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.24603174603174602,
"acc_stderr": 0.03852273364924316,
"acc_norm": 0.24603174603174602,
"acc_norm_stderr": 0.03852273364924316
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695235,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695235
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.5258064516129032,
"acc_stderr": 0.028406095057653326,
"acc_norm": 0.5258064516129032,
"acc_norm_stderr": 0.028406095057653326
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.3448275862068966,
"acc_stderr": 0.03344283744280458,
"acc_norm": 0.3448275862068966,
"acc_norm_stderr": 0.03344283744280458
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.39,
"acc_stderr": 0.04902071300001974,
"acc_norm": 0.39,
"acc_norm_stderr": 0.04902071300001974
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.6060606060606061,
"acc_stderr": 0.0381549430868893,
"acc_norm": 0.6060606060606061,
"acc_norm_stderr": 0.0381549430868893
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.5808080808080808,
"acc_stderr": 0.03515520728670417,
"acc_norm": 0.5808080808080808,
"acc_norm_stderr": 0.03515520728670417
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.6839378238341969,
"acc_stderr": 0.033553973696861736,
"acc_norm": 0.6839378238341969,
"acc_norm_stderr": 0.033553973696861736
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.4025641025641026,
"acc_stderr": 0.024864995159767752,
"acc_norm": 0.4025641025641026,
"acc_norm_stderr": 0.024864995159767752
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.27037037037037037,
"acc_stderr": 0.02708037281514566,
"acc_norm": 0.27037037037037037,
"acc_norm_stderr": 0.02708037281514566
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.40336134453781514,
"acc_stderr": 0.031866081214088314,
"acc_norm": 0.40336134453781514,
"acc_norm_stderr": 0.031866081214088314
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.2913907284768212,
"acc_stderr": 0.037101857261199946,
"acc_norm": 0.2913907284768212,
"acc_norm_stderr": 0.037101857261199946
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.6605504587155964,
"acc_stderr": 0.02030210934266235,
"acc_norm": 0.6605504587155964,
"acc_norm_stderr": 0.02030210934266235
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.28703703703703703,
"acc_stderr": 0.030851992993257013,
"acc_norm": 0.28703703703703703,
"acc_norm_stderr": 0.030851992993257013
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.6225490196078431,
"acc_stderr": 0.03402272044340705,
"acc_norm": 0.6225490196078431,
"acc_norm_stderr": 0.03402272044340705
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.620253164556962,
"acc_stderr": 0.03159188752965851,
"acc_norm": 0.620253164556962,
"acc_norm_stderr": 0.03159188752965851
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.5874439461883408,
"acc_stderr": 0.03304062175449297,
"acc_norm": 0.5874439461883408,
"acc_norm_stderr": 0.03304062175449297
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.5267175572519084,
"acc_stderr": 0.04379024936553894,
"acc_norm": 0.5267175572519084,
"acc_norm_stderr": 0.04379024936553894
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.6198347107438017,
"acc_stderr": 0.04431324501968431,
"acc_norm": 0.6198347107438017,
"acc_norm_stderr": 0.04431324501968431
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.5277777777777778,
"acc_stderr": 0.048262172941398944,
"acc_norm": 0.5277777777777778,
"acc_norm_stderr": 0.048262172941398944
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.5276073619631901,
"acc_stderr": 0.03922378290610991,
"acc_norm": 0.5276073619631901,
"acc_norm_stderr": 0.03922378290610991
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.375,
"acc_stderr": 0.04595091388086298,
"acc_norm": 0.375,
"acc_norm_stderr": 0.04595091388086298
},
"harness|hendrycksTest-management|5": {
"acc": 0.7087378640776699,
"acc_stderr": 0.044986763205729224,
"acc_norm": 0.7087378640776699,
"acc_norm_stderr": 0.044986763205729224
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.7094017094017094,
"acc_stderr": 0.029745048572674074,
"acc_norm": 0.7094017094017094,
"acc_norm_stderr": 0.029745048572674074
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.49,
"acc_stderr": 0.05024183937956911,
"acc_norm": 0.49,
"acc_norm_stderr": 0.05024183937956911
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.6666666666666666,
"acc_stderr": 0.01685739124747255,
"acc_norm": 0.6666666666666666,
"acc_norm_stderr": 0.01685739124747255
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.5086705202312138,
"acc_stderr": 0.02691504735536981,
"acc_norm": 0.5086705202312138,
"acc_norm_stderr": 0.02691504735536981
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.2424581005586592,
"acc_stderr": 0.014333522059217889,
"acc_norm": 0.2424581005586592,
"acc_norm_stderr": 0.014333522059217889
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.5065359477124183,
"acc_stderr": 0.028627470550556054,
"acc_norm": 0.5065359477124183,
"acc_norm_stderr": 0.028627470550556054
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.5498392282958199,
"acc_stderr": 0.02825666072336018,
"acc_norm": 0.5498392282958199,
"acc_norm_stderr": 0.02825666072336018
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.5679012345679012,
"acc_stderr": 0.027563010971606672,
"acc_norm": 0.5679012345679012,
"acc_norm_stderr": 0.027563010971606672
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.36879432624113473,
"acc_stderr": 0.028782227561347243,
"acc_norm": 0.36879432624113473,
"acc_norm_stderr": 0.028782227561347243
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.35267275097783574,
"acc_stderr": 0.012203286846053886,
"acc_norm": 0.35267275097783574,
"acc_norm_stderr": 0.012203286846053886
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.40441176470588236,
"acc_stderr": 0.029812630701569736,
"acc_norm": 0.40441176470588236,
"acc_norm_stderr": 0.029812630701569736
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.46895424836601307,
"acc_stderr": 0.020188804456361883,
"acc_norm": 0.46895424836601307,
"acc_norm_stderr": 0.020188804456361883
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.5,
"acc_stderr": 0.04789131426105757,
"acc_norm": 0.5,
"acc_norm_stderr": 0.04789131426105757
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.49387755102040815,
"acc_stderr": 0.03200682020163908,
"acc_norm": 0.49387755102040815,
"acc_norm_stderr": 0.03200682020163908
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.6567164179104478,
"acc_stderr": 0.03357379665433431,
"acc_norm": 0.6567164179104478,
"acc_norm_stderr": 0.03357379665433431
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.73,
"acc_stderr": 0.044619604333847394,
"acc_norm": 0.73,
"acc_norm_stderr": 0.044619604333847394
},
"harness|hendrycksTest-virology|5": {
"acc": 0.42771084337349397,
"acc_stderr": 0.03851597683718534,
"acc_norm": 0.42771084337349397,
"acc_norm_stderr": 0.03851597683718534
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.7251461988304093,
"acc_stderr": 0.03424042924691583,
"acc_norm": 0.7251461988304093,
"acc_norm_stderr": 0.03424042924691583
},
"harness|truthfulqa:mc|0": {
"mc1": 0.2913096695226438,
"mc1_stderr": 0.01590598704818483,
"mc2": 0.450180283055029,
"mc2_stderr": 0.015612058311126043
},
"harness|winogrande|5": {
"acc": 0.7103393843725335,
"acc_stderr": 0.012748550807638261
},
"harness|gsm8k|5": {
"acc": 0.17437452615617893,
"acc_stderr": 0.010451421361976233
}
}
```
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
zelros/pj-groupama | ---
tags:
- insurance
---
This dataset contains question/answer pairs about a French legal protection insurance contract (https://www.service-public.fr/particuliers/vosdroits/F3049?lang=en).
The objective of this dataset is to support open-source research projects that aim, for instance, to:
* fine-tune LLMs on high-quality datasets, specializing them in the insurance domain
* develop new question/answer applications for insurance contracts using Retrieval-Augmented Generation (RAG)
* assess the knowledge of language models in the insurance field
* more generally, apply LLMs to the insurance domain for a better understanding of, and increased transparency in, this industry.
Other datasets of the same kind are available, or will be soon, as part of this research effort. See: https://huggingface.co/collections/zelros/legal-protection-insurance-6536e8f389dd48faca78447e
Here is an example usage of this dataset: https://huggingface.co/spaces/zelros/The-legal-protection-insurance-comparator
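As a minimal sketch of the RAG use case mentioned above, retrieved question/answer pairs can be assembled into a grounded prompt for an LLM. Note that the record layout (`question`/`answer` keys) is an assumption for illustration and not guaranteed by this card; check the actual dataset schema before use.

```python
# Minimal RAG-prompt sketch for insurance Q/A pairs.
# The {"question": ..., "answer": ...} record layout is an assumption;
# inspect the real dataset schema before relying on these field names.

def build_rag_prompt(user_question: str, retrieved: list[dict]) -> str:
    """Assemble a grounded prompt from retrieved insurance Q/A pairs."""
    context = "\n\n".join(
        f"Q: {pair['question']}\nA: {pair['answer']}" for pair in retrieved
    )
    return (
        "Answer using only the excerpts below from a legal protection "
        "insurance contract.\n\n"
        f"{context}\n\nUser question: {user_question}\nAnswer:"
    )

# Illustrative pair standing in for a retrieved dataset record.
sample = [{"question": "What does legal protection cover?",
           "answer": "It covers legal fees in covered disputes."}]
prompt = build_rag_prompt("Am I covered for a neighbour dispute?", sample)
```

In a real pipeline the `retrieved` list would come from a vector-store similarity search over the dataset, and the resulting prompt would be sent to the LLM of your choice.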
miguelrengi1/Shaco_LoL_Latam | ---
license: apache-2.0
---
|
hbilgen/sap-community | ---
license: unknown
---
|
results-sd-v1-5-sd-v2-1-if-v1-0-karlo/a5a6e439 | ---
dataset_info:
features:
- name: result
dtype: string
- name: id
dtype: int64
splits:
- name: train
num_bytes: 186
num_examples: 10
download_size: 1331
dataset_size: 186
---
# Dataset Card for "a5a6e439"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
open-llm-leaderboard/details_robinsmits__Mistral-Instruct-7B-v0.2-ChatAlpacaV2-4bit | ---
pretty_name: Evaluation run of robinsmits/Mistral-Instruct-7B-v0.2-ChatAlpacaV2-4bit
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [robinsmits/Mistral-Instruct-7B-v0.2-ChatAlpacaV2-4bit](https://huggingface.co/robinsmits/Mistral-Instruct-7B-v0.2-ChatAlpacaV2-4bit)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configuration, each one coresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_robinsmits__Mistral-Instruct-7B-v0.2-ChatAlpacaV2-4bit\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-02-12T16:17:26.640039](https://huggingface.co/datasets/open-llm-leaderboard/details_robinsmits__Mistral-Instruct-7B-v0.2-ChatAlpacaV2-4bit/blob/main/results_2024-02-12T16-17-26.640039.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6063275626505344,\n\
\ \"acc_stderr\": 0.0332967902436461,\n \"acc_norm\": 0.6109376151305722,\n\
\ \"acc_norm_stderr\": 0.03397264723447794,\n \"mc1\": 0.5189718482252142,\n\
\ \"mc1_stderr\": 0.017490896405762357,\n \"mc2\": 0.6728616152193471,\n\
\ \"mc2_stderr\": 0.015267659398484597\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.5750853242320819,\n \"acc_stderr\": 0.014445698968520763,\n\
\ \"acc_norm\": 0.621160409556314,\n \"acc_norm_stderr\": 0.014175915490000326\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.657837084246166,\n\
\ \"acc_stderr\": 0.004734642167493353,\n \"acc_norm\": 0.8455486954789883,\n\
\ \"acc_norm_stderr\": 0.003606422623639926\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.35,\n \"acc_stderr\": 0.04793724854411021,\n \
\ \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.04793724854411021\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.5925925925925926,\n\
\ \"acc_stderr\": 0.04244633238353228,\n \"acc_norm\": 0.5925925925925926,\n\
\ \"acc_norm_stderr\": 0.04244633238353228\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.6052631578947368,\n \"acc_stderr\": 0.039777499346220734,\n\
\ \"acc_norm\": 0.6052631578947368,\n \"acc_norm_stderr\": 0.039777499346220734\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.59,\n\
\ \"acc_stderr\": 0.04943110704237102,\n \"acc_norm\": 0.59,\n \
\ \"acc_norm_stderr\": 0.04943110704237102\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.6830188679245283,\n \"acc_stderr\": 0.028637235639800893,\n\
\ \"acc_norm\": 0.6830188679245283,\n \"acc_norm_stderr\": 0.028637235639800893\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.6805555555555556,\n\
\ \"acc_stderr\": 0.038990736873573344,\n \"acc_norm\": 0.6805555555555556,\n\
\ \"acc_norm_stderr\": 0.038990736873573344\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.39,\n \"acc_stderr\": 0.04902071300001975,\n \
\ \"acc_norm\": 0.39,\n \"acc_norm_stderr\": 0.04902071300001975\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.46,\n \"acc_stderr\": 0.05009082659620332,\n \"acc_norm\": 0.46,\n\
\ \"acc_norm_stderr\": 0.05009082659620332\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.4,\n \"acc_stderr\": 0.049236596391733084,\n \
\ \"acc_norm\": 0.4,\n \"acc_norm_stderr\": 0.049236596391733084\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.5780346820809249,\n\
\ \"acc_stderr\": 0.0376574669386515,\n \"acc_norm\": 0.5780346820809249,\n\
\ \"acc_norm_stderr\": 0.0376574669386515\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.4411764705882353,\n \"acc_stderr\": 0.049406356306056595,\n\
\ \"acc_norm\": 0.4411764705882353,\n \"acc_norm_stderr\": 0.049406356306056595\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.72,\n \"acc_stderr\": 0.04512608598542128,\n \"acc_norm\": 0.72,\n\
\ \"acc_norm_stderr\": 0.04512608598542128\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.5319148936170213,\n \"acc_stderr\": 0.03261936918467382,\n\
\ \"acc_norm\": 0.5319148936170213,\n \"acc_norm_stderr\": 0.03261936918467382\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.4298245614035088,\n\
\ \"acc_stderr\": 0.04657047260594963,\n \"acc_norm\": 0.4298245614035088,\n\
\ \"acc_norm_stderr\": 0.04657047260594963\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5793103448275863,\n \"acc_stderr\": 0.0411391498118926,\n\
\ \"acc_norm\": 0.5793103448275863,\n \"acc_norm_stderr\": 0.0411391498118926\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.3835978835978836,\n \"acc_stderr\": 0.025043757318520193,\n \"\
acc_norm\": 0.3835978835978836,\n \"acc_norm_stderr\": 0.025043757318520193\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.3888888888888889,\n\
\ \"acc_stderr\": 0.04360314860077459,\n \"acc_norm\": 0.3888888888888889,\n\
\ \"acc_norm_stderr\": 0.04360314860077459\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.41,\n \"acc_stderr\": 0.049431107042371025,\n \
\ \"acc_norm\": 0.41,\n \"acc_norm_stderr\": 0.049431107042371025\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\"\
: 0.667741935483871,\n \"acc_stderr\": 0.0267955608481228,\n \"acc_norm\"\
: 0.667741935483871,\n \"acc_norm_stderr\": 0.0267955608481228\n },\n\
\ \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.5123152709359606,\n\
\ \"acc_stderr\": 0.035169204442208966,\n \"acc_norm\": 0.5123152709359606,\n\
\ \"acc_norm_stderr\": 0.035169204442208966\n },\n \"harness|hendrycksTest-high_school_computer_science|5\"\
: {\n \"acc\": 0.63,\n \"acc_stderr\": 0.04852365870939099,\n \
\ \"acc_norm\": 0.63,\n \"acc_norm_stderr\": 0.04852365870939099\n \
\ },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"\
acc\": 0.7393939393939394,\n \"acc_stderr\": 0.034277431758165236,\n \
\ \"acc_norm\": 0.7393939393939394,\n \"acc_norm_stderr\": 0.034277431758165236\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.7373737373737373,\n \"acc_stderr\": 0.031353050095330855,\n \"\
acc_norm\": 0.7373737373737373,\n \"acc_norm_stderr\": 0.031353050095330855\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.844559585492228,\n \"acc_stderr\": 0.026148483469153314,\n\
\ \"acc_norm\": 0.844559585492228,\n \"acc_norm_stderr\": 0.026148483469153314\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.558974358974359,\n \"acc_stderr\": 0.025174048384000745,\n \
\ \"acc_norm\": 0.558974358974359,\n \"acc_norm_stderr\": 0.025174048384000745\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.3333333333333333,\n \"acc_stderr\": 0.028742040903948485,\n \
\ \"acc_norm\": 0.3333333333333333,\n \"acc_norm_stderr\": 0.028742040903948485\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.6302521008403361,\n \"acc_stderr\": 0.03135709599613591,\n \
\ \"acc_norm\": 0.6302521008403361,\n \"acc_norm_stderr\": 0.03135709599613591\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.3443708609271523,\n \"acc_stderr\": 0.038796870240733264,\n \"\
acc_norm\": 0.3443708609271523,\n \"acc_norm_stderr\": 0.038796870240733264\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.7981651376146789,\n \"acc_stderr\": 0.01720857935778758,\n \"\
acc_norm\": 0.7981651376146789,\n \"acc_norm_stderr\": 0.01720857935778758\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.4537037037037037,\n \"acc_stderr\": 0.03395322726375797,\n \"\
acc_norm\": 0.4537037037037037,\n \"acc_norm_stderr\": 0.03395322726375797\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.7647058823529411,\n \"acc_stderr\": 0.029771775228145628,\n \"\
acc_norm\": 0.7647058823529411,\n \"acc_norm_stderr\": 0.029771775228145628\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.7510548523206751,\n \"acc_stderr\": 0.028146970599422644,\n \
\ \"acc_norm\": 0.7510548523206751,\n \"acc_norm_stderr\": 0.028146970599422644\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6188340807174888,\n\
\ \"acc_stderr\": 0.03259625118416827,\n \"acc_norm\": 0.6188340807174888,\n\
\ \"acc_norm_stderr\": 0.03259625118416827\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.7022900763358778,\n \"acc_stderr\": 0.040103589424622034,\n\
\ \"acc_norm\": 0.7022900763358778,\n \"acc_norm_stderr\": 0.040103589424622034\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.7933884297520661,\n \"acc_stderr\": 0.03695980128098824,\n \"\
acc_norm\": 0.7933884297520661,\n \"acc_norm_stderr\": 0.03695980128098824\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7222222222222222,\n\
\ \"acc_stderr\": 0.043300437496507437,\n \"acc_norm\": 0.7222222222222222,\n\
\ \"acc_norm_stderr\": 0.043300437496507437\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7423312883435583,\n \"acc_stderr\": 0.03436150827846917,\n\
\ \"acc_norm\": 0.7423312883435583,\n \"acc_norm_stderr\": 0.03436150827846917\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.45535714285714285,\n\
\ \"acc_stderr\": 0.04726835553719099,\n \"acc_norm\": 0.45535714285714285,\n\
\ \"acc_norm_stderr\": 0.04726835553719099\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7378640776699029,\n \"acc_stderr\": 0.04354631077260595,\n\
\ \"acc_norm\": 0.7378640776699029,\n \"acc_norm_stderr\": 0.04354631077260595\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8675213675213675,\n\
\ \"acc_stderr\": 0.02220930907316561,\n \"acc_norm\": 0.8675213675213675,\n\
\ \"acc_norm_stderr\": 0.02220930907316561\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.68,\n \"acc_stderr\": 0.046882617226215034,\n \
\ \"acc_norm\": 0.68,\n \"acc_norm_stderr\": 0.046882617226215034\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.776500638569604,\n\
\ \"acc_stderr\": 0.01489723522945071,\n \"acc_norm\": 0.776500638569604,\n\
\ \"acc_norm_stderr\": 0.01489723522945071\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.6676300578034682,\n \"acc_stderr\": 0.025361168749688225,\n\
\ \"acc_norm\": 0.6676300578034682,\n \"acc_norm_stderr\": 0.025361168749688225\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.37318435754189944,\n\
\ \"acc_stderr\": 0.016175692013381954,\n \"acc_norm\": 0.37318435754189944,\n\
\ \"acc_norm_stderr\": 0.016175692013381954\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.6895424836601307,\n \"acc_stderr\": 0.026493033225145898,\n\
\ \"acc_norm\": 0.6895424836601307,\n \"acc_norm_stderr\": 0.026493033225145898\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6816720257234726,\n\
\ \"acc_stderr\": 0.026457225067811025,\n \"acc_norm\": 0.6816720257234726,\n\
\ \"acc_norm_stderr\": 0.026457225067811025\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.6851851851851852,\n \"acc_stderr\": 0.02584224870090217,\n\
\ \"acc_norm\": 0.6851851851851852,\n \"acc_norm_stderr\": 0.02584224870090217\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.450354609929078,\n \"acc_stderr\": 0.029680105565029036,\n \
\ \"acc_norm\": 0.450354609929078,\n \"acc_norm_stderr\": 0.029680105565029036\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4276401564537158,\n\
\ \"acc_stderr\": 0.012635799922765843,\n \"acc_norm\": 0.4276401564537158,\n\
\ \"acc_norm_stderr\": 0.012635799922765843\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.5992647058823529,\n \"acc_stderr\": 0.029768263528933105,\n\
\ \"acc_norm\": 0.5992647058823529,\n \"acc_norm_stderr\": 0.029768263528933105\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6111111111111112,\n \"acc_stderr\": 0.01972205893961806,\n \
\ \"acc_norm\": 0.6111111111111112,\n \"acc_norm_stderr\": 0.01972205893961806\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.7,\n\
\ \"acc_stderr\": 0.04389311454644287,\n \"acc_norm\": 0.7,\n \
\ \"acc_norm_stderr\": 0.04389311454644287\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.7306122448979592,\n \"acc_stderr\": 0.02840125202902294,\n\
\ \"acc_norm\": 0.7306122448979592,\n \"acc_norm_stderr\": 0.02840125202902294\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.7512437810945274,\n\
\ \"acc_stderr\": 0.030567675938916714,\n \"acc_norm\": 0.7512437810945274,\n\
\ \"acc_norm_stderr\": 0.030567675938916714\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.81,\n \"acc_stderr\": 0.039427724440366255,\n \
\ \"acc_norm\": 0.81,\n \"acc_norm_stderr\": 0.039427724440366255\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5060240963855421,\n\
\ \"acc_stderr\": 0.03892212195333045,\n \"acc_norm\": 0.5060240963855421,\n\
\ \"acc_norm_stderr\": 0.03892212195333045\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8187134502923976,\n \"acc_stderr\": 0.029547741687640038,\n\
\ \"acc_norm\": 0.8187134502923976,\n \"acc_norm_stderr\": 0.029547741687640038\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.5189718482252142,\n\
\ \"mc1_stderr\": 0.017490896405762357,\n \"mc2\": 0.6728616152193471,\n\
\ \"mc2_stderr\": 0.015267659398484597\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.771112865035517,\n \"acc_stderr\": 0.011807360224025393\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.40333586050037906,\n \
\ \"acc_stderr\": 0.013512654781814697\n }\n}\n```"
repo_url: https://huggingface.co/robinsmits/Mistral-Instruct-7B-v0.2-ChatAlpacaV2-4bit
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_02_11T23_55_28.657040
path:
- '**/details_harness|arc:challenge|25_2024-02-11T23-55-28.657040.parquet'
- split: 2024_02_12T16_17_26.640039
path:
- '**/details_harness|arc:challenge|25_2024-02-12T16-17-26.640039.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-02-12T16-17-26.640039.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_02_11T23_55_28.657040
path:
- '**/details_harness|gsm8k|5_2024-02-11T23-55-28.657040.parquet'
- split: 2024_02_12T16_17_26.640039
path:
- '**/details_harness|gsm8k|5_2024-02-12T16-17-26.640039.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-02-12T16-17-26.640039.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_02_11T23_55_28.657040
path:
- '**/details_harness|hellaswag|10_2024-02-11T23-55-28.657040.parquet'
- split: 2024_02_12T16_17_26.640039
path:
- '**/details_harness|hellaswag|10_2024-02-12T16-17-26.640039.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-02-12T16-17-26.640039.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_02_11T23_55_28.657040
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-11T23-55-28.657040.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-11T23-55-28.657040.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-11T23-55-28.657040.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-11T23-55-28.657040.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-11T23-55-28.657040.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-11T23-55-28.657040.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-11T23-55-28.657040.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-11T23-55-28.657040.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-11T23-55-28.657040.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-11T23-55-28.657040.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-11T23-55-28.657040.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-11T23-55-28.657040.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-11T23-55-28.657040.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-11T23-55-28.657040.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-11T23-55-28.657040.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-11T23-55-28.657040.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-11T23-55-28.657040.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-11T23-55-28.657040.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-11T23-55-28.657040.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-11T23-55-28.657040.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-11T23-55-28.657040.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-11T23-55-28.657040.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-11T23-55-28.657040.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-11T23-55-28.657040.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-11T23-55-28.657040.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-11T23-55-28.657040.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-11T23-55-28.657040.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-11T23-55-28.657040.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-11T23-55-28.657040.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-11T23-55-28.657040.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-11T23-55-28.657040.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-11T23-55-28.657040.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-11T23-55-28.657040.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-11T23-55-28.657040.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-02-11T23-55-28.657040.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-11T23-55-28.657040.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-11T23-55-28.657040.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-11T23-55-28.657040.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-02-11T23-55-28.657040.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-02-11T23-55-28.657040.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-11T23-55-28.657040.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-11T23-55-28.657040.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-11T23-55-28.657040.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-11T23-55-28.657040.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-11T23-55-28.657040.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-11T23-55-28.657040.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-11T23-55-28.657040.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-11T23-55-28.657040.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-11T23-55-28.657040.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-11T23-55-28.657040.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-11T23-55-28.657040.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-11T23-55-28.657040.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-11T23-55-28.657040.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-02-11T23-55-28.657040.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-11T23-55-28.657040.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-02-11T23-55-28.657040.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-11T23-55-28.657040.parquet'
- split: 2024_02_12T16_17_26.640039
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-12T16-17-26.640039.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-12T16-17-26.640039.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-12T16-17-26.640039.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-12T16-17-26.640039.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-12T16-17-26.640039.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-12T16-17-26.640039.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-12T16-17-26.640039.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-12T16-17-26.640039.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-12T16-17-26.640039.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-12T16-17-26.640039.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-12T16-17-26.640039.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-12T16-17-26.640039.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-12T16-17-26.640039.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-12T16-17-26.640039.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-12T16-17-26.640039.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-12T16-17-26.640039.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-12T16-17-26.640039.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-12T16-17-26.640039.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-12T16-17-26.640039.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-12T16-17-26.640039.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-12T16-17-26.640039.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-12T16-17-26.640039.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-12T16-17-26.640039.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-12T16-17-26.640039.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-12T16-17-26.640039.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-12T16-17-26.640039.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-12T16-17-26.640039.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-12T16-17-26.640039.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-12T16-17-26.640039.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-12T16-17-26.640039.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-12T16-17-26.640039.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-12T16-17-26.640039.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-12T16-17-26.640039.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-12T16-17-26.640039.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-02-12T16-17-26.640039.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-12T16-17-26.640039.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-12T16-17-26.640039.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-12T16-17-26.640039.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-02-12T16-17-26.640039.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-02-12T16-17-26.640039.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-12T16-17-26.640039.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-12T16-17-26.640039.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-12T16-17-26.640039.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-12T16-17-26.640039.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-12T16-17-26.640039.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-12T16-17-26.640039.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-12T16-17-26.640039.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-12T16-17-26.640039.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-12T16-17-26.640039.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-12T16-17-26.640039.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-12T16-17-26.640039.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-12T16-17-26.640039.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-12T16-17-26.640039.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-02-12T16-17-26.640039.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-12T16-17-26.640039.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-02-12T16-17-26.640039.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-12T16-17-26.640039.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-12T16-17-26.640039.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-12T16-17-26.640039.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-12T16-17-26.640039.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-12T16-17-26.640039.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-12T16-17-26.640039.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-12T16-17-26.640039.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-12T16-17-26.640039.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-12T16-17-26.640039.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-12T16-17-26.640039.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-12T16-17-26.640039.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-12T16-17-26.640039.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-12T16-17-26.640039.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-12T16-17-26.640039.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-12T16-17-26.640039.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-12T16-17-26.640039.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-12T16-17-26.640039.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-12T16-17-26.640039.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-12T16-17-26.640039.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-12T16-17-26.640039.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-12T16-17-26.640039.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-12T16-17-26.640039.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-12T16-17-26.640039.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-12T16-17-26.640039.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-12T16-17-26.640039.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-12T16-17-26.640039.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-12T16-17-26.640039.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-12T16-17-26.640039.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-12T16-17-26.640039.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-12T16-17-26.640039.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-12T16-17-26.640039.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-12T16-17-26.640039.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-12T16-17-26.640039.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-12T16-17-26.640039.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-12T16-17-26.640039.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-02-12T16-17-26.640039.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-12T16-17-26.640039.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-12T16-17-26.640039.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-12T16-17-26.640039.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-02-12T16-17-26.640039.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-02-12T16-17-26.640039.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-12T16-17-26.640039.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-12T16-17-26.640039.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-12T16-17-26.640039.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-12T16-17-26.640039.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-12T16-17-26.640039.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-12T16-17-26.640039.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-12T16-17-26.640039.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-12T16-17-26.640039.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-12T16-17-26.640039.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-12T16-17-26.640039.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-12T16-17-26.640039.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-12T16-17-26.640039.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-12T16-17-26.640039.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-02-12T16-17-26.640039.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-12T16-17-26.640039.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-02-12T16-17-26.640039.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-12T16-17-26.640039.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_02_11T23_55_28.657040
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-11T23-55-28.657040.parquet'
- split: 2024_02_12T16_17_26.640039
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-12T16-17-26.640039.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-12T16-17-26.640039.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_02_11T23_55_28.657040
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-11T23-55-28.657040.parquet'
- split: 2024_02_12T16_17_26.640039
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-12T16-17-26.640039.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-12T16-17-26.640039.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_02_11T23_55_28.657040
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-11T23-55-28.657040.parquet'
- split: 2024_02_12T16_17_26.640039
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-12T16-17-26.640039.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-12T16-17-26.640039.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_02_11T23_55_28.657040
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-11T23-55-28.657040.parquet'
- split: 2024_02_12T16_17_26.640039
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-12T16-17-26.640039.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-12T16-17-26.640039.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_02_11T23_55_28.657040
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-11T23-55-28.657040.parquet'
- split: 2024_02_12T16_17_26.640039
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-12T16-17-26.640039.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-12T16-17-26.640039.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_02_11T23_55_28.657040
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-11T23-55-28.657040.parquet'
- split: 2024_02_12T16_17_26.640039
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-12T16-17-26.640039.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-12T16-17-26.640039.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_02_11T23_55_28.657040
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-11T23-55-28.657040.parquet'
- split: 2024_02_12T16_17_26.640039
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-12T16-17-26.640039.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-12T16-17-26.640039.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_02_11T23_55_28.657040
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-11T23-55-28.657040.parquet'
- split: 2024_02_12T16_17_26.640039
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-12T16-17-26.640039.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-12T16-17-26.640039.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_02_11T23_55_28.657040
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-11T23-55-28.657040.parquet'
- split: 2024_02_12T16_17_26.640039
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-12T16-17-26.640039.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-12T16-17-26.640039.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_02_11T23_55_28.657040
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-11T23-55-28.657040.parquet'
- split: 2024_02_12T16_17_26.640039
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-12T16-17-26.640039.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-12T16-17-26.640039.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_02_11T23_55_28.657040
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-11T23-55-28.657040.parquet'
- split: 2024_02_12T16_17_26.640039
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-12T16-17-26.640039.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-12T16-17-26.640039.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_02_11T23_55_28.657040
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-11T23-55-28.657040.parquet'
- split: 2024_02_12T16_17_26.640039
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-12T16-17-26.640039.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-12T16-17-26.640039.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_02_11T23_55_28.657040
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-11T23-55-28.657040.parquet'
- split: 2024_02_12T16_17_26.640039
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-12T16-17-26.640039.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-12T16-17-26.640039.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_02_11T23_55_28.657040
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-11T23-55-28.657040.parquet'
- split: 2024_02_12T16_17_26.640039
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-12T16-17-26.640039.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-12T16-17-26.640039.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_02_11T23_55_28.657040
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-11T23-55-28.657040.parquet'
- split: 2024_02_12T16_17_26.640039
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-12T16-17-26.640039.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-12T16-17-26.640039.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_02_11T23_55_28.657040
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-11T23-55-28.657040.parquet'
- split: 2024_02_12T16_17_26.640039
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-12T16-17-26.640039.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-12T16-17-26.640039.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_02_11T23_55_28.657040
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-11T23-55-28.657040.parquet'
- split: 2024_02_12T16_17_26.640039
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-12T16-17-26.640039.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-12T16-17-26.640039.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_02_11T23_55_28.657040
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-11T23-55-28.657040.parquet'
- split: 2024_02_12T16_17_26.640039
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-12T16-17-26.640039.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-12T16-17-26.640039.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_02_11T23_55_28.657040
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-11T23-55-28.657040.parquet'
- split: 2024_02_12T16_17_26.640039
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-12T16-17-26.640039.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-12T16-17-26.640039.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_02_11T23_55_28.657040
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-11T23-55-28.657040.parquet'
- split: 2024_02_12T16_17_26.640039
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-12T16-17-26.640039.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-12T16-17-26.640039.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_02_11T23_55_28.657040
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-11T23-55-28.657040.parquet'
- split: 2024_02_12T16_17_26.640039
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-12T16-17-26.640039.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-12T16-17-26.640039.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_02_11T23_55_28.657040
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-11T23-55-28.657040.parquet'
- split: 2024_02_12T16_17_26.640039
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-12T16-17-26.640039.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-12T16-17-26.640039.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_02_11T23_55_28.657040
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-11T23-55-28.657040.parquet'
- split: 2024_02_12T16_17_26.640039
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-12T16-17-26.640039.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-12T16-17-26.640039.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_02_11T23_55_28.657040
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-11T23-55-28.657040.parquet'
- split: 2024_02_12T16_17_26.640039
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-12T16-17-26.640039.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-12T16-17-26.640039.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_02_11T23_55_28.657040
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-11T23-55-28.657040.parquet'
- split: 2024_02_12T16_17_26.640039
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-12T16-17-26.640039.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-12T16-17-26.640039.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_02_11T23_55_28.657040
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-11T23-55-28.657040.parquet'
- split: 2024_02_12T16_17_26.640039
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-12T16-17-26.640039.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-12T16-17-26.640039.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_02_11T23_55_28.657040
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-11T23-55-28.657040.parquet'
- split: 2024_02_12T16_17_26.640039
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-12T16-17-26.640039.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-12T16-17-26.640039.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_02_11T23_55_28.657040
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-11T23-55-28.657040.parquet'
- split: 2024_02_12T16_17_26.640039
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-12T16-17-26.640039.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-12T16-17-26.640039.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_02_11T23_55_28.657040
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-11T23-55-28.657040.parquet'
- split: 2024_02_12T16_17_26.640039
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-12T16-17-26.640039.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-12T16-17-26.640039.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_02_11T23_55_28.657040
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-11T23-55-28.657040.parquet'
- split: 2024_02_12T16_17_26.640039
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-12T16-17-26.640039.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-12T16-17-26.640039.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_02_11T23_55_28.657040
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-11T23-55-28.657040.parquet'
- split: 2024_02_12T16_17_26.640039
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-12T16-17-26.640039.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-12T16-17-26.640039.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_02_11T23_55_28.657040
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-11T23-55-28.657040.parquet'
- split: 2024_02_12T16_17_26.640039
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-12T16-17-26.640039.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-12T16-17-26.640039.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_02_11T23_55_28.657040
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-11T23-55-28.657040.parquet'
- split: 2024_02_12T16_17_26.640039
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-12T16-17-26.640039.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-12T16-17-26.640039.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_02_11T23_55_28.657040
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-11T23-55-28.657040.parquet'
- split: 2024_02_12T16_17_26.640039
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-12T16-17-26.640039.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-12T16-17-26.640039.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_02_11T23_55_28.657040
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-02-11T23-55-28.657040.parquet'
- split: 2024_02_12T16_17_26.640039
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-02-12T16-17-26.640039.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-02-12T16-17-26.640039.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_02_11T23_55_28.657040
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-11T23-55-28.657040.parquet'
- split: 2024_02_12T16_17_26.640039
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-12T16-17-26.640039.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-12T16-17-26.640039.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_02_11T23_55_28.657040
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-11T23-55-28.657040.parquet'
- split: 2024_02_12T16_17_26.640039
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-12T16-17-26.640039.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-12T16-17-26.640039.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_02_11T23_55_28.657040
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-11T23-55-28.657040.parquet'
- split: 2024_02_12T16_17_26.640039
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-12T16-17-26.640039.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-12T16-17-26.640039.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_02_11T23_55_28.657040
path:
- '**/details_harness|hendrycksTest-management|5_2024-02-11T23-55-28.657040.parquet'
- split: 2024_02_12T16_17_26.640039
path:
- '**/details_harness|hendrycksTest-management|5_2024-02-12T16-17-26.640039.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-02-12T16-17-26.640039.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_02_11T23_55_28.657040
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-02-11T23-55-28.657040.parquet'
- split: 2024_02_12T16_17_26.640039
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-02-12T16-17-26.640039.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-02-12T16-17-26.640039.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_02_11T23_55_28.657040
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-11T23-55-28.657040.parquet'
- split: 2024_02_12T16_17_26.640039
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-12T16-17-26.640039.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-12T16-17-26.640039.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_02_11T23_55_28.657040
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-11T23-55-28.657040.parquet'
- split: 2024_02_12T16_17_26.640039
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-12T16-17-26.640039.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-12T16-17-26.640039.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_02_11T23_55_28.657040
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-11T23-55-28.657040.parquet'
- split: 2024_02_12T16_17_26.640039
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-12T16-17-26.640039.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-12T16-17-26.640039.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_02_11T23_55_28.657040
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-11T23-55-28.657040.parquet'
- split: 2024_02_12T16_17_26.640039
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-12T16-17-26.640039.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-12T16-17-26.640039.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_02_11T23_55_28.657040
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-11T23-55-28.657040.parquet'
- split: 2024_02_12T16_17_26.640039
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-12T16-17-26.640039.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-12T16-17-26.640039.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_02_11T23_55_28.657040
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-11T23-55-28.657040.parquet'
- split: 2024_02_12T16_17_26.640039
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-12T16-17-26.640039.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-12T16-17-26.640039.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_02_11T23_55_28.657040
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-11T23-55-28.657040.parquet'
- split: 2024_02_12T16_17_26.640039
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-12T16-17-26.640039.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-12T16-17-26.640039.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_02_11T23_55_28.657040
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-11T23-55-28.657040.parquet'
- split: 2024_02_12T16_17_26.640039
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-12T16-17-26.640039.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-12T16-17-26.640039.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_02_11T23_55_28.657040
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-11T23-55-28.657040.parquet'
- split: 2024_02_12T16_17_26.640039
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-12T16-17-26.640039.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-12T16-17-26.640039.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_02_11T23_55_28.657040
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-11T23-55-28.657040.parquet'
- split: 2024_02_12T16_17_26.640039
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-12T16-17-26.640039.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-12T16-17-26.640039.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_02_11T23_55_28.657040
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-11T23-55-28.657040.parquet'
- split: 2024_02_12T16_17_26.640039
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-12T16-17-26.640039.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-12T16-17-26.640039.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_02_11T23_55_28.657040
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-11T23-55-28.657040.parquet'
- split: 2024_02_12T16_17_26.640039
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-12T16-17-26.640039.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-12T16-17-26.640039.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_02_11T23_55_28.657040
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-11T23-55-28.657040.parquet'
- split: 2024_02_12T16_17_26.640039
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-12T16-17-26.640039.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-12T16-17-26.640039.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_02_11T23_55_28.657040
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-02-11T23-55-28.657040.parquet'
- split: 2024_02_12T16_17_26.640039
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-02-12T16-17-26.640039.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-02-12T16-17-26.640039.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_02_11T23_55_28.657040
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-11T23-55-28.657040.parquet'
- split: 2024_02_12T16_17_26.640039
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-12T16-17-26.640039.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-12T16-17-26.640039.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_02_11T23_55_28.657040
path:
- '**/details_harness|hendrycksTest-virology|5_2024-02-11T23-55-28.657040.parquet'
- split: 2024_02_12T16_17_26.640039
path:
- '**/details_harness|hendrycksTest-virology|5_2024-02-12T16-17-26.640039.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-02-12T16-17-26.640039.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_02_11T23_55_28.657040
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-11T23-55-28.657040.parquet'
- split: 2024_02_12T16_17_26.640039
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-12T16-17-26.640039.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-12T16-17-26.640039.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_02_11T23_55_28.657040
path:
- '**/details_harness|truthfulqa:mc|0_2024-02-11T23-55-28.657040.parquet'
- split: 2024_02_12T16_17_26.640039
path:
- '**/details_harness|truthfulqa:mc|0_2024-02-12T16-17-26.640039.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-02-12T16-17-26.640039.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_02_11T23_55_28.657040
path:
- '**/details_harness|winogrande|5_2024-02-11T23-55-28.657040.parquet'
- split: 2024_02_12T16_17_26.640039
path:
- '**/details_harness|winogrande|5_2024-02-12T16-17-26.640039.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-02-12T16-17-26.640039.parquet'
- config_name: results
data_files:
- split: 2024_02_11T23_55_28.657040
path:
- results_2024-02-11T23-55-28.657040.parquet
- split: 2024_02_12T16_17_26.640039
path:
- results_2024-02-12T16-17-26.640039.parquet
- split: latest
path:
- results_2024-02-12T16-17-26.640039.parquet
---
# Dataset Card for Evaluation run of robinsmits/Mistral-Instruct-7B-v0.2-ChatAlpacaV2-4bit
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [robinsmits/Mistral-Instruct-7B-v0.2-ChatAlpacaV2-4bit](https://huggingface.co/robinsmits/Mistral-Instruct-7B-v0.2-ChatAlpacaV2-4bit) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_robinsmits__Mistral-Instruct-7B-v0.2-ChatAlpacaV2-4bit",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2024-02-12T16:17:26.640039](https://huggingface.co/datasets/open-llm-leaderboard/details_robinsmits__Mistral-Instruct-7B-v0.2-ChatAlpacaV2-4bit/blob/main/results_2024-02-12T16-17-26.640039.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6063275626505344,
"acc_stderr": 0.0332967902436461,
"acc_norm": 0.6109376151305722,
"acc_norm_stderr": 0.03397264723447794,
"mc1": 0.5189718482252142,
"mc1_stderr": 0.017490896405762357,
"mc2": 0.6728616152193471,
"mc2_stderr": 0.015267659398484597
},
"harness|arc:challenge|25": {
"acc": 0.5750853242320819,
"acc_stderr": 0.014445698968520763,
"acc_norm": 0.621160409556314,
"acc_norm_stderr": 0.014175915490000326
},
"harness|hellaswag|10": {
"acc": 0.657837084246166,
"acc_stderr": 0.004734642167493353,
"acc_norm": 0.8455486954789883,
"acc_norm_stderr": 0.003606422623639926
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.35,
"acc_stderr": 0.04793724854411021,
"acc_norm": 0.35,
"acc_norm_stderr": 0.04793724854411021
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.5925925925925926,
"acc_stderr": 0.04244633238353228,
"acc_norm": 0.5925925925925926,
"acc_norm_stderr": 0.04244633238353228
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6052631578947368,
"acc_stderr": 0.039777499346220734,
"acc_norm": 0.6052631578947368,
"acc_norm_stderr": 0.039777499346220734
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.59,
"acc_stderr": 0.04943110704237102,
"acc_norm": 0.59,
"acc_norm_stderr": 0.04943110704237102
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6830188679245283,
"acc_stderr": 0.028637235639800893,
"acc_norm": 0.6830188679245283,
"acc_norm_stderr": 0.028637235639800893
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.6805555555555556,
"acc_stderr": 0.038990736873573344,
"acc_norm": 0.6805555555555556,
"acc_norm_stderr": 0.038990736873573344
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.39,
"acc_stderr": 0.04902071300001975,
"acc_norm": 0.39,
"acc_norm_stderr": 0.04902071300001975
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.46,
"acc_stderr": 0.05009082659620332,
"acc_norm": 0.46,
"acc_norm_stderr": 0.05009082659620332
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.4,
"acc_stderr": 0.049236596391733084,
"acc_norm": 0.4,
"acc_norm_stderr": 0.049236596391733084
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.5780346820809249,
"acc_stderr": 0.0376574669386515,
"acc_norm": 0.5780346820809249,
"acc_norm_stderr": 0.0376574669386515
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.4411764705882353,
"acc_stderr": 0.049406356306056595,
"acc_norm": 0.4411764705882353,
"acc_norm_stderr": 0.049406356306056595
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.72,
"acc_stderr": 0.04512608598542128,
"acc_norm": 0.72,
"acc_norm_stderr": 0.04512608598542128
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5319148936170213,
"acc_stderr": 0.03261936918467382,
"acc_norm": 0.5319148936170213,
"acc_norm_stderr": 0.03261936918467382
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.4298245614035088,
"acc_stderr": 0.04657047260594963,
"acc_norm": 0.4298245614035088,
"acc_norm_stderr": 0.04657047260594963
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5793103448275863,
"acc_stderr": 0.0411391498118926,
"acc_norm": 0.5793103448275863,
"acc_norm_stderr": 0.0411391498118926
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.3835978835978836,
"acc_stderr": 0.025043757318520193,
"acc_norm": 0.3835978835978836,
"acc_norm_stderr": 0.025043757318520193
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.3888888888888889,
"acc_stderr": 0.04360314860077459,
"acc_norm": 0.3888888888888889,
"acc_norm_stderr": 0.04360314860077459
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.41,
"acc_stderr": 0.049431107042371025,
"acc_norm": 0.41,
"acc_norm_stderr": 0.049431107042371025
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.667741935483871,
"acc_stderr": 0.0267955608481228,
"acc_norm": 0.667741935483871,
"acc_norm_stderr": 0.0267955608481228
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5123152709359606,
"acc_stderr": 0.035169204442208966,
"acc_norm": 0.5123152709359606,
"acc_norm_stderr": 0.035169204442208966
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.63,
"acc_stderr": 0.04852365870939099,
"acc_norm": 0.63,
"acc_norm_stderr": 0.04852365870939099
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7393939393939394,
"acc_stderr": 0.034277431758165236,
"acc_norm": 0.7393939393939394,
"acc_norm_stderr": 0.034277431758165236
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7373737373737373,
"acc_stderr": 0.031353050095330855,
"acc_norm": 0.7373737373737373,
"acc_norm_stderr": 0.031353050095330855
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.844559585492228,
"acc_stderr": 0.026148483469153314,
"acc_norm": 0.844559585492228,
"acc_norm_stderr": 0.026148483469153314
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.558974358974359,
"acc_stderr": 0.025174048384000745,
"acc_norm": 0.558974358974359,
"acc_norm_stderr": 0.025174048384000745
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.3333333333333333,
"acc_stderr": 0.028742040903948485,
"acc_norm": 0.3333333333333333,
"acc_norm_stderr": 0.028742040903948485
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6302521008403361,
"acc_stderr": 0.03135709599613591,
"acc_norm": 0.6302521008403361,
"acc_norm_stderr": 0.03135709599613591
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.3443708609271523,
"acc_stderr": 0.038796870240733264,
"acc_norm": 0.3443708609271523,
"acc_norm_stderr": 0.038796870240733264
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.7981651376146789,
"acc_stderr": 0.01720857935778758,
"acc_norm": 0.7981651376146789,
"acc_norm_stderr": 0.01720857935778758
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.4537037037037037,
"acc_stderr": 0.03395322726375797,
"acc_norm": 0.4537037037037037,
"acc_norm_stderr": 0.03395322726375797
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7647058823529411,
"acc_stderr": 0.029771775228145628,
"acc_norm": 0.7647058823529411,
"acc_norm_stderr": 0.029771775228145628
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7510548523206751,
"acc_stderr": 0.028146970599422644,
"acc_norm": 0.7510548523206751,
"acc_norm_stderr": 0.028146970599422644
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6188340807174888,
"acc_stderr": 0.03259625118416827,
"acc_norm": 0.6188340807174888,
"acc_norm_stderr": 0.03259625118416827
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7022900763358778,
"acc_stderr": 0.040103589424622034,
"acc_norm": 0.7022900763358778,
"acc_norm_stderr": 0.040103589424622034
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7933884297520661,
"acc_stderr": 0.03695980128098824,
"acc_norm": 0.7933884297520661,
"acc_norm_stderr": 0.03695980128098824
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7222222222222222,
"acc_stderr": 0.043300437496507437,
"acc_norm": 0.7222222222222222,
"acc_norm_stderr": 0.043300437496507437
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7423312883435583,
"acc_stderr": 0.03436150827846917,
"acc_norm": 0.7423312883435583,
"acc_norm_stderr": 0.03436150827846917
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.45535714285714285,
"acc_stderr": 0.04726835553719099,
"acc_norm": 0.45535714285714285,
"acc_norm_stderr": 0.04726835553719099
},
"harness|hendrycksTest-management|5": {
"acc": 0.7378640776699029,
"acc_stderr": 0.04354631077260595,
"acc_norm": 0.7378640776699029,
"acc_norm_stderr": 0.04354631077260595
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8675213675213675,
"acc_stderr": 0.02220930907316561,
"acc_norm": 0.8675213675213675,
"acc_norm_stderr": 0.02220930907316561
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.68,
"acc_stderr": 0.046882617226215034,
"acc_norm": 0.68,
"acc_norm_stderr": 0.046882617226215034
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.776500638569604,
"acc_stderr": 0.01489723522945071,
"acc_norm": 0.776500638569604,
"acc_norm_stderr": 0.01489723522945071
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.6676300578034682,
"acc_stderr": 0.025361168749688225,
"acc_norm": 0.6676300578034682,
"acc_norm_stderr": 0.025361168749688225
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.37318435754189944,
"acc_stderr": 0.016175692013381954,
"acc_norm": 0.37318435754189944,
"acc_norm_stderr": 0.016175692013381954
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.6895424836601307,
"acc_stderr": 0.026493033225145898,
"acc_norm": 0.6895424836601307,
"acc_norm_stderr": 0.026493033225145898
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6816720257234726,
"acc_stderr": 0.026457225067811025,
"acc_norm": 0.6816720257234726,
"acc_norm_stderr": 0.026457225067811025
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.6851851851851852,
"acc_stderr": 0.02584224870090217,
"acc_norm": 0.6851851851851852,
"acc_norm_stderr": 0.02584224870090217
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.450354609929078,
"acc_stderr": 0.029680105565029036,
"acc_norm": 0.450354609929078,
"acc_norm_stderr": 0.029680105565029036
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4276401564537158,
"acc_stderr": 0.012635799922765843,
"acc_norm": 0.4276401564537158,
"acc_norm_stderr": 0.012635799922765843
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.5992647058823529,
"acc_stderr": 0.029768263528933105,
"acc_norm": 0.5992647058823529,
"acc_norm_stderr": 0.029768263528933105
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6111111111111112,
"acc_stderr": 0.01972205893961806,
"acc_norm": 0.6111111111111112,
"acc_norm_stderr": 0.01972205893961806
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.7,
"acc_stderr": 0.04389311454644287,
"acc_norm": 0.7,
"acc_norm_stderr": 0.04389311454644287
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7306122448979592,
"acc_stderr": 0.02840125202902294,
"acc_norm": 0.7306122448979592,
"acc_norm_stderr": 0.02840125202902294
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.7512437810945274,
"acc_stderr": 0.030567675938916714,
"acc_norm": 0.7512437810945274,
"acc_norm_stderr": 0.030567675938916714
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.81,
"acc_stderr": 0.039427724440366255,
"acc_norm": 0.81,
"acc_norm_stderr": 0.039427724440366255
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5060240963855421,
"acc_stderr": 0.03892212195333045,
"acc_norm": 0.5060240963855421,
"acc_norm_stderr": 0.03892212195333045
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8187134502923976,
"acc_stderr": 0.029547741687640038,
"acc_norm": 0.8187134502923976,
"acc_norm_stderr": 0.029547741687640038
},
"harness|truthfulqa:mc|0": {
"mc1": 0.5189718482252142,
"mc1_stderr": 0.017490896405762357,
"mc2": 0.6728616152193471,
"mc2_stderr": 0.015267659398484597
},
"harness|winogrande|5": {
"acc": 0.771112865035517,
"acc_stderr": 0.011807360224025393
},
"harness|gsm8k|5": {
"acc": 0.40333586050037906,
"acc_stderr": 0.013512654781814697
}
}
```
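The per-task entries above can be aggregated by hand once loaded. A minimal, self-contained sketch of macro-averaging the MMLU (`hendrycksTest`) accuracies — using only two accuracy values copied from the report above, with the remaining tasks omitted for brevity:

```python
# Macro-average the per-task MMLU accuracies from a results dictionary
# shaped like the JSON report above. Only two task entries are included
# here for brevity; a full report contains all 57 hendrycksTest tasks.
results = {
    "harness|hendrycksTest-abstract_algebra|5": {"acc": 0.35},
    "harness|hendrycksTest-anatomy|5": {"acc": 0.5925925925925926},
}

mmlu_accs = [
    metrics["acc"]
    for task, metrics in results.items()
    if task.startswith("harness|hendrycksTest-")
]
macro_avg = sum(mmlu_accs) / len(mmlu_accs)
print(f"MMLU macro-average over {len(mmlu_accs)} tasks: {macro_avg:.4f}")
```

The same key-prefix filter works on the full dictionary loaded from the `results` configuration.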
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
open-llm-leaderboard/details_Test157t__Kunocchini-7b | ---
pretty_name: Evaluation run of Test157t/Kunocchini-7b
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [Test157t/Kunocchini-7b](https://huggingface.co/Test157t/Kunocchini-7b) on the\
\ [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
  \nThe dataset is composed of 63 configurations, each one corresponding to one of the\
  \ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
  \ timestamp of the run. The \"train\" split always points to the latest results.\n\
  \nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Test157t__Kunocchini-7b\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
  These are the [latest results from run 2024-02-14T05:12:00.698748](https://huggingface.co/datasets/open-llm-leaderboard/details_Test157t__Kunocchini-7b/blob/main/results_2024-02-14T05-12-00.698748.json) (note\
  \ that there might be results for other tasks in the repos if successive evals didn't\
  \ cover the same tasks; you can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6398558023445645,\n\
\ \"acc_stderr\": 0.03250448541726538,\n \"acc_norm\": 0.6434251517105837,\n\
\ \"acc_norm_stderr\": 0.03315134190382154,\n \"mc1\": 0.5140758873929009,\n\
\ \"mc1_stderr\": 0.01749656371704278,\n \"mc2\": 0.6862179059406133,\n\
\ \"mc2_stderr\": 0.015230575859702986\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.6484641638225256,\n \"acc_stderr\": 0.013952413699600935,\n\
\ \"acc_norm\": 0.6749146757679181,\n \"acc_norm_stderr\": 0.013688147309729124\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.7058354909380602,\n\
\ \"acc_stderr\": 0.00454735017928625,\n \"acc_norm\": 0.8684524995020912,\n\
\ \"acc_norm_stderr\": 0.003373073863582292\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.34,\n \"acc_stderr\": 0.047609522856952365,\n \
\ \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.047609522856952365\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6296296296296297,\n\
\ \"acc_stderr\": 0.041716541613545426,\n \"acc_norm\": 0.6296296296296297,\n\
\ \"acc_norm_stderr\": 0.041716541613545426\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.6710526315789473,\n \"acc_stderr\": 0.03823428969926604,\n\
\ \"acc_norm\": 0.6710526315789473,\n \"acc_norm_stderr\": 0.03823428969926604\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.58,\n\
\ \"acc_stderr\": 0.04960449637488583,\n \"acc_norm\": 0.58,\n \
\ \"acc_norm_stderr\": 0.04960449637488583\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.7245283018867924,\n \"acc_stderr\": 0.027495663683724053,\n\
\ \"acc_norm\": 0.7245283018867924,\n \"acc_norm_stderr\": 0.027495663683724053\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7430555555555556,\n\
\ \"acc_stderr\": 0.03653946969442099,\n \"acc_norm\": 0.7430555555555556,\n\
\ \"acc_norm_stderr\": 0.03653946969442099\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.48,\n \"acc_stderr\": 0.050211673156867795,\n \
\ \"acc_norm\": 0.48,\n \"acc_norm_stderr\": 0.050211673156867795\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"\
acc\": 0.55,\n \"acc_stderr\": 0.05,\n \"acc_norm\": 0.55,\n \
\ \"acc_norm_stderr\": 0.05\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.33,\n \"acc_stderr\": 0.047258156262526045,\n \
\ \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.047258156262526045\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6473988439306358,\n\
\ \"acc_stderr\": 0.03643037168958548,\n \"acc_norm\": 0.6473988439306358,\n\
\ \"acc_norm_stderr\": 0.03643037168958548\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.35294117647058826,\n \"acc_stderr\": 0.047551296160629475,\n\
\ \"acc_norm\": 0.35294117647058826,\n \"acc_norm_stderr\": 0.047551296160629475\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.73,\n \"acc_stderr\": 0.0446196043338474,\n \"acc_norm\": 0.73,\n\
\ \"acc_norm_stderr\": 0.0446196043338474\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.5872340425531914,\n \"acc_stderr\": 0.03218471141400351,\n\
\ \"acc_norm\": 0.5872340425531914,\n \"acc_norm_stderr\": 0.03218471141400351\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.4649122807017544,\n\
\ \"acc_stderr\": 0.046920083813689104,\n \"acc_norm\": 0.4649122807017544,\n\
\ \"acc_norm_stderr\": 0.046920083813689104\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5724137931034483,\n \"acc_stderr\": 0.04122737111370333,\n\
\ \"acc_norm\": 0.5724137931034483,\n \"acc_norm_stderr\": 0.04122737111370333\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.4021164021164021,\n \"acc_stderr\": 0.025253032554997695,\n \"\
acc_norm\": 0.4021164021164021,\n \"acc_norm_stderr\": 0.025253032554997695\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.47619047619047616,\n\
\ \"acc_stderr\": 0.04467062628403273,\n \"acc_norm\": 0.47619047619047616,\n\
\ \"acc_norm_stderr\": 0.04467062628403273\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.4,\n \"acc_stderr\": 0.049236596391733084,\n \
\ \"acc_norm\": 0.4,\n \"acc_norm_stderr\": 0.049236596391733084\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7741935483870968,\n\
\ \"acc_stderr\": 0.023785577884181015,\n \"acc_norm\": 0.7741935483870968,\n\
\ \"acc_norm_stderr\": 0.023785577884181015\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.5320197044334976,\n \"acc_stderr\": 0.03510766597959217,\n\
\ \"acc_norm\": 0.5320197044334976,\n \"acc_norm_stderr\": 0.03510766597959217\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.71,\n \"acc_stderr\": 0.045604802157206845,\n \"acc_norm\"\
: 0.71,\n \"acc_norm_stderr\": 0.045604802157206845\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7636363636363637,\n \"acc_stderr\": 0.03317505930009181,\n\
\ \"acc_norm\": 0.7636363636363637,\n \"acc_norm_stderr\": 0.03317505930009181\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.7777777777777778,\n \"acc_stderr\": 0.029620227874790486,\n \"\
acc_norm\": 0.7777777777777778,\n \"acc_norm_stderr\": 0.029620227874790486\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.8756476683937824,\n \"acc_stderr\": 0.023814477086593556,\n\
\ \"acc_norm\": 0.8756476683937824,\n \"acc_norm_stderr\": 0.023814477086593556\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.6538461538461539,\n \"acc_stderr\": 0.024121125416941197,\n\
\ \"acc_norm\": 0.6538461538461539,\n \"acc_norm_stderr\": 0.024121125416941197\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.3592592592592593,\n \"acc_stderr\": 0.029252905927251972,\n \
\ \"acc_norm\": 0.3592592592592593,\n \"acc_norm_stderr\": 0.029252905927251972\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.7184873949579832,\n \"acc_stderr\": 0.029213549414372174,\n\
\ \"acc_norm\": 0.7184873949579832,\n \"acc_norm_stderr\": 0.029213549414372174\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.3708609271523179,\n \"acc_stderr\": 0.03943966699183629,\n \"\
acc_norm\": 0.3708609271523179,\n \"acc_norm_stderr\": 0.03943966699183629\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8330275229357799,\n \"acc_stderr\": 0.015990154885073354,\n \"\
acc_norm\": 0.8330275229357799,\n \"acc_norm_stderr\": 0.015990154885073354\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.5138888888888888,\n \"acc_stderr\": 0.03408655867977749,\n \"\
acc_norm\": 0.5138888888888888,\n \"acc_norm_stderr\": 0.03408655867977749\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.8137254901960784,\n \"acc_stderr\": 0.027325470966716312,\n \"\
acc_norm\": 0.8137254901960784,\n \"acc_norm_stderr\": 0.027325470966716312\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.7721518987341772,\n \"acc_stderr\": 0.027303484599069425,\n \
\ \"acc_norm\": 0.7721518987341772,\n \"acc_norm_stderr\": 0.027303484599069425\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6816143497757847,\n\
\ \"acc_stderr\": 0.03126580522513713,\n \"acc_norm\": 0.6816143497757847,\n\
\ \"acc_norm_stderr\": 0.03126580522513713\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.732824427480916,\n \"acc_stderr\": 0.038808483010823944,\n\
\ \"acc_norm\": 0.732824427480916,\n \"acc_norm_stderr\": 0.038808483010823944\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.7851239669421488,\n \"acc_stderr\": 0.03749492448709695,\n \"\
acc_norm\": 0.7851239669421488,\n \"acc_norm_stderr\": 0.03749492448709695\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7407407407407407,\n\
\ \"acc_stderr\": 0.04236511258094633,\n \"acc_norm\": 0.7407407407407407,\n\
\ \"acc_norm_stderr\": 0.04236511258094633\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7852760736196319,\n \"acc_stderr\": 0.032262193772867744,\n\
\ \"acc_norm\": 0.7852760736196319,\n \"acc_norm_stderr\": 0.032262193772867744\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.45535714285714285,\n\
\ \"acc_stderr\": 0.047268355537191,\n \"acc_norm\": 0.45535714285714285,\n\
\ \"acc_norm_stderr\": 0.047268355537191\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7572815533980582,\n \"acc_stderr\": 0.04245022486384495,\n\
\ \"acc_norm\": 0.7572815533980582,\n \"acc_norm_stderr\": 0.04245022486384495\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8632478632478633,\n\
\ \"acc_stderr\": 0.022509033937077805,\n \"acc_norm\": 0.8632478632478633,\n\
\ \"acc_norm_stderr\": 0.022509033937077805\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.68,\n \"acc_stderr\": 0.046882617226215034,\n \
\ \"acc_norm\": 0.68,\n \"acc_norm_stderr\": 0.046882617226215034\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8058748403575989,\n\
\ \"acc_stderr\": 0.01414397027665757,\n \"acc_norm\": 0.8058748403575989,\n\
\ \"acc_norm_stderr\": 0.01414397027665757\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.708092485549133,\n \"acc_stderr\": 0.024476994076247333,\n\
\ \"acc_norm\": 0.708092485549133,\n \"acc_norm_stderr\": 0.024476994076247333\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.45251396648044695,\n\
\ \"acc_stderr\": 0.016646914804438778,\n \"acc_norm\": 0.45251396648044695,\n\
\ \"acc_norm_stderr\": 0.016646914804438778\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.7058823529411765,\n \"acc_stderr\": 0.026090162504279053,\n\
\ \"acc_norm\": 0.7058823529411765,\n \"acc_norm_stderr\": 0.026090162504279053\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7041800643086816,\n\
\ \"acc_stderr\": 0.025922371788818774,\n \"acc_norm\": 0.7041800643086816,\n\
\ \"acc_norm_stderr\": 0.025922371788818774\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.7160493827160493,\n \"acc_stderr\": 0.025089478523765137,\n\
\ \"acc_norm\": 0.7160493827160493,\n \"acc_norm_stderr\": 0.025089478523765137\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.48226950354609927,\n \"acc_stderr\": 0.02980873964223777,\n \
\ \"acc_norm\": 0.48226950354609927,\n \"acc_norm_stderr\": 0.02980873964223777\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.46479791395045633,\n\
\ \"acc_stderr\": 0.012738547371303956,\n \"acc_norm\": 0.46479791395045633,\n\
\ \"acc_norm_stderr\": 0.012738547371303956\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.6507352941176471,\n \"acc_stderr\": 0.028959755196824873,\n\
\ \"acc_norm\": 0.6507352941176471,\n \"acc_norm_stderr\": 0.028959755196824873\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6290849673202614,\n \"acc_stderr\": 0.019542101564854125,\n \
\ \"acc_norm\": 0.6290849673202614,\n \"acc_norm_stderr\": 0.019542101564854125\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6818181818181818,\n\
\ \"acc_stderr\": 0.04461272175910509,\n \"acc_norm\": 0.6818181818181818,\n\
\ \"acc_norm_stderr\": 0.04461272175910509\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.7142857142857143,\n \"acc_stderr\": 0.028920583220675592,\n\
\ \"acc_norm\": 0.7142857142857143,\n \"acc_norm_stderr\": 0.028920583220675592\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.835820895522388,\n\
\ \"acc_stderr\": 0.026193923544454125,\n \"acc_norm\": 0.835820895522388,\n\
\ \"acc_norm_stderr\": 0.026193923544454125\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.87,\n \"acc_stderr\": 0.03379976689896309,\n \
\ \"acc_norm\": 0.87,\n \"acc_norm_stderr\": 0.03379976689896309\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5120481927710844,\n\
\ \"acc_stderr\": 0.03891364495835817,\n \"acc_norm\": 0.5120481927710844,\n\
\ \"acc_norm_stderr\": 0.03891364495835817\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8538011695906432,\n \"acc_stderr\": 0.027097290118070806,\n\
\ \"acc_norm\": 0.8538011695906432,\n \"acc_norm_stderr\": 0.027097290118070806\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.5140758873929009,\n\
\ \"mc1_stderr\": 0.01749656371704278,\n \"mc2\": 0.6862179059406133,\n\
\ \"mc2_stderr\": 0.015230575859702986\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7797947908445146,\n \"acc_stderr\": 0.011646276755089696\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.4783927217589083,\n \
\ \"acc_stderr\": 0.013759618667051771\n }\n}\n```"
repo_url: https://huggingface.co/Test157t/Kunocchini-7b
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_02_14T05_12_00.698748
path:
- '**/details_harness|arc:challenge|25_2024-02-14T05-12-00.698748.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-02-14T05-12-00.698748.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_02_14T05_12_00.698748
path:
- '**/details_harness|gsm8k|5_2024-02-14T05-12-00.698748.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-02-14T05-12-00.698748.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_02_14T05_12_00.698748
path:
- '**/details_harness|hellaswag|10_2024-02-14T05-12-00.698748.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-02-14T05-12-00.698748.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_02_14T05_12_00.698748
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-14T05-12-00.698748.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-14T05-12-00.698748.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-14T05-12-00.698748.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-14T05-12-00.698748.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-14T05-12-00.698748.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-14T05-12-00.698748.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-14T05-12-00.698748.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-14T05-12-00.698748.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-14T05-12-00.698748.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-14T05-12-00.698748.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-14T05-12-00.698748.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-14T05-12-00.698748.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-14T05-12-00.698748.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-14T05-12-00.698748.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-14T05-12-00.698748.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-14T05-12-00.698748.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-14T05-12-00.698748.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-14T05-12-00.698748.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-14T05-12-00.698748.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-14T05-12-00.698748.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-14T05-12-00.698748.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-14T05-12-00.698748.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-14T05-12-00.698748.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-14T05-12-00.698748.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-14T05-12-00.698748.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-14T05-12-00.698748.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-14T05-12-00.698748.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-14T05-12-00.698748.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-14T05-12-00.698748.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-14T05-12-00.698748.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-14T05-12-00.698748.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-14T05-12-00.698748.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-14T05-12-00.698748.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-14T05-12-00.698748.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-02-14T05-12-00.698748.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-14T05-12-00.698748.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-14T05-12-00.698748.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-14T05-12-00.698748.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-02-14T05-12-00.698748.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-02-14T05-12-00.698748.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-14T05-12-00.698748.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-14T05-12-00.698748.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-14T05-12-00.698748.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-14T05-12-00.698748.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-14T05-12-00.698748.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-14T05-12-00.698748.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-14T05-12-00.698748.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-14T05-12-00.698748.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-14T05-12-00.698748.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-14T05-12-00.698748.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-14T05-12-00.698748.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-14T05-12-00.698748.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-14T05-12-00.698748.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-02-14T05-12-00.698748.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-14T05-12-00.698748.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-02-14T05-12-00.698748.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-14T05-12-00.698748.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-14T05-12-00.698748.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-14T05-12-00.698748.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-14T05-12-00.698748.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-14T05-12-00.698748.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-14T05-12-00.698748.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-14T05-12-00.698748.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-14T05-12-00.698748.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-14T05-12-00.698748.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-14T05-12-00.698748.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-14T05-12-00.698748.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-14T05-12-00.698748.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-14T05-12-00.698748.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-14T05-12-00.698748.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-14T05-12-00.698748.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-14T05-12-00.698748.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-14T05-12-00.698748.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-14T05-12-00.698748.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-14T05-12-00.698748.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-14T05-12-00.698748.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-14T05-12-00.698748.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-14T05-12-00.698748.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-14T05-12-00.698748.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-14T05-12-00.698748.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-14T05-12-00.698748.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-14T05-12-00.698748.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-14T05-12-00.698748.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-14T05-12-00.698748.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-14T05-12-00.698748.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-14T05-12-00.698748.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-14T05-12-00.698748.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-14T05-12-00.698748.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-14T05-12-00.698748.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-14T05-12-00.698748.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-14T05-12-00.698748.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-02-14T05-12-00.698748.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-14T05-12-00.698748.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-14T05-12-00.698748.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-14T05-12-00.698748.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-02-14T05-12-00.698748.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-02-14T05-12-00.698748.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-14T05-12-00.698748.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-14T05-12-00.698748.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-14T05-12-00.698748.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-14T05-12-00.698748.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-14T05-12-00.698748.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-14T05-12-00.698748.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-14T05-12-00.698748.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-14T05-12-00.698748.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-14T05-12-00.698748.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-14T05-12-00.698748.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-14T05-12-00.698748.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-14T05-12-00.698748.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-14T05-12-00.698748.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-02-14T05-12-00.698748.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-14T05-12-00.698748.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-02-14T05-12-00.698748.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-14T05-12-00.698748.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_02_14T05_12_00.698748
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-14T05-12-00.698748.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-14T05-12-00.698748.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_02_14T05_12_00.698748
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-14T05-12-00.698748.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-14T05-12-00.698748.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_02_14T05_12_00.698748
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-14T05-12-00.698748.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-14T05-12-00.698748.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_02_14T05_12_00.698748
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-14T05-12-00.698748.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-14T05-12-00.698748.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_02_14T05_12_00.698748
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-14T05-12-00.698748.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-14T05-12-00.698748.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_02_14T05_12_00.698748
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-14T05-12-00.698748.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-14T05-12-00.698748.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_02_14T05_12_00.698748
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-14T05-12-00.698748.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-14T05-12-00.698748.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_02_14T05_12_00.698748
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-14T05-12-00.698748.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-14T05-12-00.698748.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_02_14T05_12_00.698748
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-14T05-12-00.698748.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-14T05-12-00.698748.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_02_14T05_12_00.698748
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-14T05-12-00.698748.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-14T05-12-00.698748.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_02_14T05_12_00.698748
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-14T05-12-00.698748.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-14T05-12-00.698748.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_02_14T05_12_00.698748
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-14T05-12-00.698748.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-14T05-12-00.698748.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_02_14T05_12_00.698748
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-14T05-12-00.698748.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-14T05-12-00.698748.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_02_14T05_12_00.698748
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-14T05-12-00.698748.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-14T05-12-00.698748.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_02_14T05_12_00.698748
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-14T05-12-00.698748.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-14T05-12-00.698748.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_02_14T05_12_00.698748
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-14T05-12-00.698748.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-14T05-12-00.698748.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_02_14T05_12_00.698748
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-14T05-12-00.698748.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-14T05-12-00.698748.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_02_14T05_12_00.698748
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-14T05-12-00.698748.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-14T05-12-00.698748.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_02_14T05_12_00.698748
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-14T05-12-00.698748.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-14T05-12-00.698748.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_02_14T05_12_00.698748
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-14T05-12-00.698748.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-14T05-12-00.698748.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_02_14T05_12_00.698748
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-14T05-12-00.698748.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-14T05-12-00.698748.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_02_14T05_12_00.698748
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-14T05-12-00.698748.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-14T05-12-00.698748.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_02_14T05_12_00.698748
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-14T05-12-00.698748.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-14T05-12-00.698748.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_02_14T05_12_00.698748
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-14T05-12-00.698748.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-14T05-12-00.698748.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_02_14T05_12_00.698748
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-14T05-12-00.698748.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-14T05-12-00.698748.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_02_14T05_12_00.698748
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-14T05-12-00.698748.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-14T05-12-00.698748.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_02_14T05_12_00.698748
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-14T05-12-00.698748.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-14T05-12-00.698748.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_02_14T05_12_00.698748
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-14T05-12-00.698748.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-14T05-12-00.698748.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_02_14T05_12_00.698748
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-14T05-12-00.698748.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-14T05-12-00.698748.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_02_14T05_12_00.698748
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-14T05-12-00.698748.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-14T05-12-00.698748.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_02_14T05_12_00.698748
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-14T05-12-00.698748.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-14T05-12-00.698748.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_02_14T05_12_00.698748
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-14T05-12-00.698748.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-14T05-12-00.698748.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_02_14T05_12_00.698748
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-14T05-12-00.698748.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-14T05-12-00.698748.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_02_14T05_12_00.698748
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-14T05-12-00.698748.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-14T05-12-00.698748.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_02_14T05_12_00.698748
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-02-14T05-12-00.698748.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-02-14T05-12-00.698748.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_02_14T05_12_00.698748
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-14T05-12-00.698748.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-14T05-12-00.698748.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_02_14T05_12_00.698748
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-14T05-12-00.698748.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-14T05-12-00.698748.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_02_14T05_12_00.698748
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-14T05-12-00.698748.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-14T05-12-00.698748.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_02_14T05_12_00.698748
path:
- '**/details_harness|hendrycksTest-management|5_2024-02-14T05-12-00.698748.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-02-14T05-12-00.698748.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_02_14T05_12_00.698748
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-02-14T05-12-00.698748.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-02-14T05-12-00.698748.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_02_14T05_12_00.698748
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-14T05-12-00.698748.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-14T05-12-00.698748.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_02_14T05_12_00.698748
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-14T05-12-00.698748.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-14T05-12-00.698748.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_02_14T05_12_00.698748
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-14T05-12-00.698748.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-14T05-12-00.698748.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_02_14T05_12_00.698748
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-14T05-12-00.698748.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-14T05-12-00.698748.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_02_14T05_12_00.698748
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-14T05-12-00.698748.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-14T05-12-00.698748.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_02_14T05_12_00.698748
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-14T05-12-00.698748.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-14T05-12-00.698748.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_02_14T05_12_00.698748
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-14T05-12-00.698748.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-14T05-12-00.698748.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_02_14T05_12_00.698748
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-14T05-12-00.698748.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-14T05-12-00.698748.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_02_14T05_12_00.698748
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-14T05-12-00.698748.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-14T05-12-00.698748.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_02_14T05_12_00.698748
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-14T05-12-00.698748.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-14T05-12-00.698748.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_02_14T05_12_00.698748
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-14T05-12-00.698748.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-14T05-12-00.698748.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_02_14T05_12_00.698748
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-14T05-12-00.698748.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-14T05-12-00.698748.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_02_14T05_12_00.698748
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-14T05-12-00.698748.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-14T05-12-00.698748.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_02_14T05_12_00.698748
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-02-14T05-12-00.698748.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-02-14T05-12-00.698748.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_02_14T05_12_00.698748
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-14T05-12-00.698748.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-14T05-12-00.698748.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_02_14T05_12_00.698748
path:
- '**/details_harness|hendrycksTest-virology|5_2024-02-14T05-12-00.698748.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-02-14T05-12-00.698748.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_02_14T05_12_00.698748
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-14T05-12-00.698748.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-14T05-12-00.698748.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_02_14T05_12_00.698748
path:
- '**/details_harness|truthfulqa:mc|0_2024-02-14T05-12-00.698748.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-02-14T05-12-00.698748.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_02_14T05_12_00.698748
path:
- '**/details_harness|winogrande|5_2024-02-14T05-12-00.698748.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-02-14T05-12-00.698748.parquet'
- config_name: results
data_files:
- split: 2024_02_14T05_12_00.698748
path:
- results_2024-02-14T05-12-00.698748.parquet
- split: latest
path:
- results_2024-02-14T05-12-00.698748.parquet
---
# Dataset Card for Evaluation run of Test157t/Kunocchini-7b
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [Test157t/Kunocchini-7b](https://huggingface.co/Test157t/Kunocchini-7b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each corresponding to one of the evaluated tasks.
The dataset has been created from 1 run. Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the most recent results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_Test157t__Kunocchini-7b",
"harness_winogrande_5",
	split="latest")
```
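Each run also appears as a timestamp-named split alongside the "latest" alias. If you want to pick the most recent timestamped split programmatically, a small standard-library helper is enough (the split name below is taken from this repo; the parsing format is an assumption based on its shape):

```python
from datetime import datetime

def latest_timestamped_split(split_names):
    """Return the most recent timestamp-named split, ignoring the 'latest' alias."""
    fmt = "%Y_%m_%dT%H_%M_%S.%f"  # e.g. 2024_02_14T05_12_00.698748
    stamped = [s for s in split_names if s != "latest"]
    return max(stamped, key=lambda s: datetime.strptime(s, fmt))

print(latest_timestamped_split(["latest", "2024_02_14T05_12_00.698748"]))
```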
## Latest results
These are the [latest results from run 2024-02-14T05:12:00.698748](https://huggingface.co/datasets/open-llm-leaderboard/details_Test157t__Kunocchini-7b/blob/main/results_2024-02-14T05-12-00.698748.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each one in the results and in the "latest" split of each eval):
```python
{
"all": {
"acc": 0.6398558023445645,
"acc_stderr": 0.03250448541726538,
"acc_norm": 0.6434251517105837,
"acc_norm_stderr": 0.03315134190382154,
"mc1": 0.5140758873929009,
"mc1_stderr": 0.01749656371704278,
"mc2": 0.6862179059406133,
"mc2_stderr": 0.015230575859702986
},
"harness|arc:challenge|25": {
"acc": 0.6484641638225256,
"acc_stderr": 0.013952413699600935,
"acc_norm": 0.6749146757679181,
"acc_norm_stderr": 0.013688147309729124
},
"harness|hellaswag|10": {
"acc": 0.7058354909380602,
"acc_stderr": 0.00454735017928625,
"acc_norm": 0.8684524995020912,
"acc_norm_stderr": 0.003373073863582292
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.34,
"acc_stderr": 0.047609522856952365,
"acc_norm": 0.34,
"acc_norm_stderr": 0.047609522856952365
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6296296296296297,
"acc_stderr": 0.041716541613545426,
"acc_norm": 0.6296296296296297,
"acc_norm_stderr": 0.041716541613545426
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6710526315789473,
"acc_stderr": 0.03823428969926604,
"acc_norm": 0.6710526315789473,
"acc_norm_stderr": 0.03823428969926604
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.58,
"acc_stderr": 0.04960449637488583,
"acc_norm": 0.58,
"acc_norm_stderr": 0.04960449637488583
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.7245283018867924,
"acc_stderr": 0.027495663683724053,
"acc_norm": 0.7245283018867924,
"acc_norm_stderr": 0.027495663683724053
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7430555555555556,
"acc_stderr": 0.03653946969442099,
"acc_norm": 0.7430555555555556,
"acc_norm_stderr": 0.03653946969442099
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.48,
"acc_stderr": 0.050211673156867795,
"acc_norm": 0.48,
"acc_norm_stderr": 0.050211673156867795
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.55,
"acc_stderr": 0.05,
"acc_norm": 0.55,
"acc_norm_stderr": 0.05
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.33,
"acc_stderr": 0.047258156262526045,
"acc_norm": 0.33,
"acc_norm_stderr": 0.047258156262526045
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6473988439306358,
"acc_stderr": 0.03643037168958548,
"acc_norm": 0.6473988439306358,
"acc_norm_stderr": 0.03643037168958548
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.35294117647058826,
"acc_stderr": 0.047551296160629475,
"acc_norm": 0.35294117647058826,
"acc_norm_stderr": 0.047551296160629475
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.73,
"acc_stderr": 0.0446196043338474,
"acc_norm": 0.73,
"acc_norm_stderr": 0.0446196043338474
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5872340425531914,
"acc_stderr": 0.03218471141400351,
"acc_norm": 0.5872340425531914,
"acc_norm_stderr": 0.03218471141400351
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.4649122807017544,
"acc_stderr": 0.046920083813689104,
"acc_norm": 0.4649122807017544,
"acc_norm_stderr": 0.046920083813689104
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5724137931034483,
"acc_stderr": 0.04122737111370333,
"acc_norm": 0.5724137931034483,
"acc_norm_stderr": 0.04122737111370333
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.4021164021164021,
"acc_stderr": 0.025253032554997695,
"acc_norm": 0.4021164021164021,
"acc_norm_stderr": 0.025253032554997695
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.47619047619047616,
"acc_stderr": 0.04467062628403273,
"acc_norm": 0.47619047619047616,
"acc_norm_stderr": 0.04467062628403273
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.4,
"acc_stderr": 0.049236596391733084,
"acc_norm": 0.4,
"acc_norm_stderr": 0.049236596391733084
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7741935483870968,
"acc_stderr": 0.023785577884181015,
"acc_norm": 0.7741935483870968,
"acc_norm_stderr": 0.023785577884181015
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5320197044334976,
"acc_stderr": 0.03510766597959217,
"acc_norm": 0.5320197044334976,
"acc_norm_stderr": 0.03510766597959217
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.71,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.71,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7636363636363637,
"acc_stderr": 0.03317505930009181,
"acc_norm": 0.7636363636363637,
"acc_norm_stderr": 0.03317505930009181
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7777777777777778,
"acc_stderr": 0.029620227874790486,
"acc_norm": 0.7777777777777778,
"acc_norm_stderr": 0.029620227874790486
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8756476683937824,
"acc_stderr": 0.023814477086593556,
"acc_norm": 0.8756476683937824,
"acc_norm_stderr": 0.023814477086593556
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6538461538461539,
"acc_stderr": 0.024121125416941197,
"acc_norm": 0.6538461538461539,
"acc_norm_stderr": 0.024121125416941197
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.3592592592592593,
"acc_stderr": 0.029252905927251972,
"acc_norm": 0.3592592592592593,
"acc_norm_stderr": 0.029252905927251972
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.7184873949579832,
"acc_stderr": 0.029213549414372174,
"acc_norm": 0.7184873949579832,
"acc_norm_stderr": 0.029213549414372174
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.3708609271523179,
"acc_stderr": 0.03943966699183629,
"acc_norm": 0.3708609271523179,
"acc_norm_stderr": 0.03943966699183629
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8330275229357799,
"acc_stderr": 0.015990154885073354,
"acc_norm": 0.8330275229357799,
"acc_norm_stderr": 0.015990154885073354
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5138888888888888,
"acc_stderr": 0.03408655867977749,
"acc_norm": 0.5138888888888888,
"acc_norm_stderr": 0.03408655867977749
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8137254901960784,
"acc_stderr": 0.027325470966716312,
"acc_norm": 0.8137254901960784,
"acc_norm_stderr": 0.027325470966716312
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7721518987341772,
"acc_stderr": 0.027303484599069425,
"acc_norm": 0.7721518987341772,
"acc_norm_stderr": 0.027303484599069425
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6816143497757847,
"acc_stderr": 0.03126580522513713,
"acc_norm": 0.6816143497757847,
"acc_norm_stderr": 0.03126580522513713
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.732824427480916,
"acc_stderr": 0.038808483010823944,
"acc_norm": 0.732824427480916,
"acc_norm_stderr": 0.038808483010823944
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7851239669421488,
"acc_stderr": 0.03749492448709695,
"acc_norm": 0.7851239669421488,
"acc_norm_stderr": 0.03749492448709695
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7407407407407407,
"acc_stderr": 0.04236511258094633,
"acc_norm": 0.7407407407407407,
"acc_norm_stderr": 0.04236511258094633
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7852760736196319,
"acc_stderr": 0.032262193772867744,
"acc_norm": 0.7852760736196319,
"acc_norm_stderr": 0.032262193772867744
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.45535714285714285,
"acc_stderr": 0.047268355537191,
"acc_norm": 0.45535714285714285,
"acc_norm_stderr": 0.047268355537191
},
"harness|hendrycksTest-management|5": {
"acc": 0.7572815533980582,
"acc_stderr": 0.04245022486384495,
"acc_norm": 0.7572815533980582,
"acc_norm_stderr": 0.04245022486384495
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8632478632478633,
"acc_stderr": 0.022509033937077805,
"acc_norm": 0.8632478632478633,
"acc_norm_stderr": 0.022509033937077805
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.68,
"acc_stderr": 0.046882617226215034,
"acc_norm": 0.68,
"acc_norm_stderr": 0.046882617226215034
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8058748403575989,
"acc_stderr": 0.01414397027665757,
"acc_norm": 0.8058748403575989,
"acc_norm_stderr": 0.01414397027665757
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.708092485549133,
"acc_stderr": 0.024476994076247333,
"acc_norm": 0.708092485549133,
"acc_norm_stderr": 0.024476994076247333
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.45251396648044695,
"acc_stderr": 0.016646914804438778,
"acc_norm": 0.45251396648044695,
"acc_norm_stderr": 0.016646914804438778
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7058823529411765,
"acc_stderr": 0.026090162504279053,
"acc_norm": 0.7058823529411765,
"acc_norm_stderr": 0.026090162504279053
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7041800643086816,
"acc_stderr": 0.025922371788818774,
"acc_norm": 0.7041800643086816,
"acc_norm_stderr": 0.025922371788818774
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7160493827160493,
"acc_stderr": 0.025089478523765137,
"acc_norm": 0.7160493827160493,
"acc_norm_stderr": 0.025089478523765137
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.48226950354609927,
"acc_stderr": 0.02980873964223777,
"acc_norm": 0.48226950354609927,
"acc_norm_stderr": 0.02980873964223777
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.46479791395045633,
"acc_stderr": 0.012738547371303956,
"acc_norm": 0.46479791395045633,
"acc_norm_stderr": 0.012738547371303956
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6507352941176471,
"acc_stderr": 0.028959755196824873,
"acc_norm": 0.6507352941176471,
"acc_norm_stderr": 0.028959755196824873
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6290849673202614,
"acc_stderr": 0.019542101564854125,
"acc_norm": 0.6290849673202614,
"acc_norm_stderr": 0.019542101564854125
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6818181818181818,
"acc_stderr": 0.04461272175910509,
"acc_norm": 0.6818181818181818,
"acc_norm_stderr": 0.04461272175910509
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7142857142857143,
"acc_stderr": 0.028920583220675592,
"acc_norm": 0.7142857142857143,
"acc_norm_stderr": 0.028920583220675592
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.835820895522388,
"acc_stderr": 0.026193923544454125,
"acc_norm": 0.835820895522388,
"acc_norm_stderr": 0.026193923544454125
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.87,
"acc_stderr": 0.03379976689896309,
"acc_norm": 0.87,
"acc_norm_stderr": 0.03379976689896309
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5120481927710844,
"acc_stderr": 0.03891364495835817,
"acc_norm": 0.5120481927710844,
"acc_norm_stderr": 0.03891364495835817
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8538011695906432,
"acc_stderr": 0.027097290118070806,
"acc_norm": 0.8538011695906432,
"acc_norm_stderr": 0.027097290118070806
},
"harness|truthfulqa:mc|0": {
"mc1": 0.5140758873929009,
"mc1_stderr": 0.01749656371704278,
"mc2": 0.6862179059406133,
"mc2_stderr": 0.015230575859702986
},
"harness|winogrande|5": {
"acc": 0.7797947908445146,
"acc_stderr": 0.011646276755089696
},
"harness|gsm8k|5": {
"acc": 0.4783927217589083,
"acc_stderr": 0.013759618667051771
}
}
```
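The "all" block above is an aggregate over the individual tasks. As a rough illustration, a macro-average over the MMLU (hendrycksTest) accuracies can be recomputed from a loaded results dict; this is a sketch using a small subset of the values above, while the leaderboard's own aggregation covers every task:

```python
def macro_avg_acc(results, prefix="harness|hendrycksTest-"):
    """Mean 'acc' over all tasks whose key starts with `prefix`."""
    accs = [v["acc"] for k, v in results.items() if k.startswith(prefix)]
    return sum(accs) / len(accs)

# A small subset of the per-task results shown above.
results = {
    "harness|hendrycksTest-abstract_algebra|5": {"acc": 0.34},
    "harness|hendrycksTest-anatomy|5": {"acc": 0.6296296296296297},
    "harness|hendrycksTest-astronomy|5": {"acc": 0.6710526315789473},
}
print(macro_avg_acc(results))
```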
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
9wimu9/sinhala_eli5 | ---
dataset_info:
features:
- name: id
dtype: int64
- name: title
dtype: string
- name: text
sequence: string
splits:
- name: train
num_bytes: 487152126
num_examples: 109215
download_size: 200153932
dataset_size: 487152126
---
# Dataset Card for "eli5_6"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
guyhadad01/manipulations2 | ---
dataset_info:
features:
- name: text
dtype: string
- name: label
dtype: int64
splits:
- name: train
num_bytes: 26663
num_examples: 248
- name: test
num_bytes: 6976
num_examples: 63
download_size: 21433
dataset_size: 33639
---
# Dataset Card for "manipulations2"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
autoevaluate/autoeval-staging-eval-project-deepset__germanquad-7176bd7d-11875590 | ---
type: predictions
tags:
- autotrain
- evaluation
datasets:
- deepset/germanquad
eval_info:
task: extractive_question_answering
model: deepset/gelectra-large-germanquad
metrics: []
dataset_name: deepset/germanquad
dataset_config: plain_text
dataset_split: test
col_mapping:
context: context
question: question
answers-text: answers.text
answers-answer_start: answers.answer_start
---
# Dataset Card for AutoTrain Evaluator
This repository contains model predictions generated by [AutoTrain](https://huggingface.co/autotrain) for the following task and dataset:
* Task: Question Answering
* Model: deepset/gelectra-large-germanquad
* Dataset: deepset/germanquad
* Config: plain_text
* Split: test
To run new evaluation jobs, visit Hugging Face's [automatic model evaluator](https://huggingface.co/spaces/autoevaluate/model-evaluator).
## Contributions
Thanks to [@sjlree](https://huggingface.co/sjlree) for evaluating this model. |
bys2058/SD_1213 | ---
dataset_info:
features:
- name: image
dtype: image
- name: original_hairmask
dtype: image
- name: result_image
dtype: image
- name: result_hairmask
dtype: image
- name: image_caption
dtype: string
splits:
- name: train
num_bytes: 120067863874.361
num_examples: 69477
download_size: 116951622981
dataset_size: 120067863874.361
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "SD_1213"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
yzhuang/autotree_snnxor_l1_2 | ---
dataset_info:
features:
- name: id
dtype: int64
- name: input_x
sequence:
sequence: float32
- name: input_y
sequence:
sequence: float32
- name: rtg
sequence:
sequence: float64
- name: status
sequence:
sequence: float32
- name: split_threshold
sequence:
sequence: float64
- name: split_dimension
sequence: int64
splits:
- name: train
num_bytes: 1550000000
num_examples: 100000
- name: validation
num_bytes: 155000000
num_examples: 10000
- name: test
num_bytes: 155000000
num_examples: 10000
download_size: 1065205233
dataset_size: 1860000000
---
# Dataset Card for "autotree_snnxor_l1_2"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
arthurmluz/wikilingua_data-wiki_gptextsum2_results | ---
dataset_info:
features:
- name: id
dtype: int64
- name: text
dtype: string
- name: summary
dtype: string
- name: gen_summary
dtype: string
- name: rouge
struct:
- name: rouge1
dtype: float64
- name: rouge2
dtype: float64
- name: rougeL
dtype: float64
- name: rougeLsum
dtype: float64
- name: bert
struct:
- name: f1
sequence: float64
- name: hashcode
dtype: string
- name: precision
sequence: float64
- name: recall
sequence: float64
- name: moverScore
dtype: float64
splits:
- name: validation
num_bytes: 23458671
num_examples: 8165
download_size: 13912456
dataset_size: 23458671
configs:
- config_name: default
data_files:
- split: validation
path: data/validation-*
---
# Dataset Card for "wikilingua_data-wiki-gptextsum2_results"
rouge= {'rouge1': 0.3243773799776409, 'rouge2': 0.1149886262779723, 'rougeL': 0.22747057611087074, 'rougeLsum': 0.22747057611087074}
bert= {'precision': 0.7288737272457045, 'recall': 0.7672372243810269, 'f1': 0.7467090841312023}
mover = 0.6081768530416082 |
kpriyanshu256/MultiTabQA-multitable_pretraining-Salesforce-codet5-base_train-html-122000 | ---
dataset_info:
features:
- name: input_ids
sequence:
sequence: int32
- name: attention_mask
sequence:
sequence: int8
- name: labels
sequence:
sequence: int64
splits:
- name: train
num_bytes: 13336000
num_examples: 1000
download_size: 641595
dataset_size: 13336000
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
CyberHarem/mandragora_arknights | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of Mandragora/マンドラゴラ/蔓德拉 (Arknights)
This is the dataset of Mandragora/マンドラゴラ/蔓德拉 (Arknights), containing 69 images and their tags.
The core tags of this character are `animal_ears, cat_ears, short_hair, black_hair, animal_ear_fluff, breasts, hair_ornament, brown_eyes, cat_girl, yellow_eyes, cat_tail, tail`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:----------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 69 | 118.54 MiB | [Download](https://huggingface.co/datasets/CyberHarem/mandragora_arknights/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 1200 | 69 | 100.44 MiB | [Download](https://huggingface.co/datasets/CyberHarem/mandragora_arknights/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 185 | 193.19 MiB | [Download](https://huggingface.co/datasets/CyberHarem/mandragora_arknights/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need it, just run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/mandragora_arknights',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
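The IMG+TXT packages instead pair every image with a same-named `.txt` file holding its comma-separated tags. After unzipping one of them, the pairs can be walked with the standard library alone (the directory name below is illustrative):

```python
import pathlib

def iter_image_tag_pairs(dataset_dir):
    """Yield (image_path, tag_list) for every image with a matching .txt file."""
    exts = {".png", ".jpg", ".jpeg", ".webp"}
    for img in sorted(pathlib.Path(dataset_dir).glob("*")):
        if img.suffix.lower() in exts:
            txt = img.with_suffix(".txt")
            if txt.exists():
                yield img, [t.strip() for t in txt.read_text().split(",")]

# Print the first few tags of each image in the extracted directory.
for img, tags in iter_image_tag_pairs("dataset_dir"):
    print(img.name, tags[:5])
```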
## List of Clusters
List of tag clustering results; maybe some outfits can be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:-------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 7 |  |  |  |  |  | 1girl, bare_shoulders, black_dress, collarbone, solo, black_choker, looking_at_viewer, off-shoulder_dress, strapless_dress, white_background, simple_background, small_breasts, upper_body |
| 1 | 8 |  |  |  |  |  | 1girl, bare_shoulders, black_choker, black_dress, black_footwear, looking_at_viewer, solo, full_body, long_sleeves, short_dress, white_thighhighs, boots, off-shoulder_dress, parted_bangs, standing, holding, simple_background, closed_mouth, collarbone, open_clothes, strapless_dress, black_jacket, cleavage, medium_breasts, smile, white_background, wide_sleeves |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | bare_shoulders | black_dress | collarbone | solo | black_choker | looking_at_viewer | off-shoulder_dress | strapless_dress | white_background | simple_background | small_breasts | upper_body | black_footwear | full_body | long_sleeves | short_dress | white_thighhighs | boots | parted_bangs | standing | holding | closed_mouth | open_clothes | black_jacket | cleavage | medium_breasts | smile | wide_sleeves |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:-----------------|:--------------|:-------------|:-------|:---------------|:--------------------|:---------------------|:------------------|:-------------------|:--------------------|:----------------|:-------------|:-----------------|:------------|:---------------|:--------------|:-------------------|:--------|:---------------|:-----------|:----------|:---------------|:---------------|:---------------|:-----------|:-----------------|:--------|:---------------|
| 0 | 7 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | |
| 1 | 8 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | | | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X |
|
apollo-research/sae-monology-pile-uncopyrighted-tokenizer-EleutherAI-gpt-neox-20b | ---
dataset_info:
features:
- name: input_ids
sequence: int32
splits:
- name: train
num_bytes: 33819097428.0
num_examples: 4126293
download_size: 14321735206
dataset_size: 33819097428.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
pradeep239/philp_plain_only5Years | ---
license: mit
dataset_info:
features:
- name: image
dtype: image
- name: ground_truth
dtype: string
splits:
- name: train
num_bytes: 377350598.0
num_examples: 793
- name: validation
num_bytes: 43159301.0
num_examples: 94
- name: test
num_bytes: 22114074.0
num_examples: 47
download_size: 320085727
dataset_size: 442623973.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
- split: test
path: data/test-*
---
|
ethansimrm/patTR_biomed_raw | ---
license: cc-by-nc-sa-3.0
---
|
GGital/Signal_Test01 | ---
dataset_info:
features:
- name: image
dtype: image
- name: label
dtype:
class_label:
names:
'0': '0'
'1': '1'
'2': '2'
'3': '3'
'4': '4'
'5': '5'
'6': '6'
splits:
- name: train
num_bytes: 11566389.0
num_examples: 647
download_size: 11525815
dataset_size: 11566389.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
Back-up/facebook_comment_augmentation | ---
dataset_info:
features:
- name: id
dtype: string
- name: text
dtype: string
splits:
- name: train
num_bytes: 216029564
num_examples: 1414049
download_size: 115164533
dataset_size: 216029564
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
lowdewijk/debateavegan_prompts | ---
license: apache-2.0
---
This dataset contains:
* All Reddit posts (submissions) and comments from the subreddit reddit.com/r/debateavegan, from Jan 20, 2011 to Oct 30, 2022.
* A set of prompts extracted for LLM fine-tuning, based on the highest-scoring top-level comments to each Reddit post. |
ASSERT-KTH/APR-single-hunk-fine-tuning | ---
license: mit
---
This dataset is used to fine-tune LLMs for automated program repair at the single-function level, especially for single-hunk bugs. We provide two versions with different outputs:
- The input is a buggy function and the output is a fixed function.
- The input is a buggy function and the output is a unified diff for the bug.
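As a quick illustration of the second output format, a single-hunk fix can be rendered as a unified diff with Python's standard `difflib`; the toy bug below is illustrative only, not drawn from the dataset:

```python
import difflib

# A toy single-hunk bug (illustrative, not from the dataset).
buggy = """def max_of(a, b):
    if a > b:
        return a
    return a  # bug: should return b
"""
fixed = """def max_of(a, b):
    if a > b:
        return a
    return b
"""

# Unified diff: this is the shape of the "diff" output variant.
diff = "".join(difflib.unified_diff(
    buggy.splitlines(keepends=True),
    fixed.splitlines(keepends=True),
    fromfile="buggy.py",
    tofile="fixed.py",
))
print(diff)
```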
|
ZhangShenao/0.0001_idpo_same_nodpo_noreplacerej_6iters_dataset | ---
dataset_info:
features:
- name: prompt
dtype: string
- name: prompt_id
dtype: string
- name: messages
list:
- name: content
dtype: string
- name: role
dtype: string
- name: score_chosen
dtype: float64
- name: score_rejected
dtype: float64
- name: reference_response
dtype: string
- name: chosen
list:
- name: content
dtype: string
- name: role
dtype: string
- name: rejected
list:
- name: content
dtype: string
- name: role
dtype: string
- name: is_better
dtype: bool
splits:
- name: train_prefs_1
num_bytes: 77484177
num_examples: 10189
- name: test_prefs_1
num_bytes: 15163825
num_examples: 2000
- name: train_prefs_2
num_bytes: 96039405
num_examples: 10189
- name: test_prefs_2
num_bytes: 19003874
num_examples: 2000
- name: train_prefs_3
num_bytes: 88532033
num_examples: 10189
- name: test_prefs_3
num_bytes: 17198474
num_examples: 2000
- name: train_prefs_4
num_bytes: 85402447
num_examples: 10189
- name: test_prefs_4
num_bytes: 16850572
num_examples: 2000
- name: train_prefs_5
num_bytes: 87009005
num_examples: 10189
- name: test_prefs_5
num_bytes: 16916697
num_examples: 2000
- name: train_prefs_6
num_bytes: 85701998
num_examples: 10189
- name: test_prefs_6
num_bytes: 16576019
num_examples: 2000
download_size: 341938586
dataset_size: 621878526
configs:
- config_name: default
data_files:
- split: train_prefs_1
path: data/train_prefs_1-*
- split: test_prefs_1
path: data/test_prefs_1-*
- split: train_prefs_2
path: data/train_prefs_2-*
- split: test_prefs_2
path: data/test_prefs_2-*
- split: train_prefs_3
path: data/train_prefs_3-*
- split: test_prefs_3
path: data/test_prefs_3-*
- split: train_prefs_4
path: data/train_prefs_4-*
- split: test_prefs_4
path: data/test_prefs_4-*
- split: train_prefs_5
path: data/train_prefs_5-*
- split: test_prefs_5
path: data/test_prefs_5-*
- split: train_prefs_6
path: data/train_prefs_6-*
- split: test_prefs_6
path: data/test_prefs_6-*
---
# Dataset Card for "0.0001_idpo_same_nodpo_noreplacerej_6iters_dataset"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
aaditya/databricks-dolly-15k-Hindi | ---
dataset_info:
features:
- name: en_instruction
dtype: string
- name: en_input
dtype: string
- name: en_output
dtype: string
- name: id
dtype: string
- name: en_category
dtype: string
- name: hindi_instruction
dtype: string
- name: hindi_input
dtype: string
- name: hindi_output
dtype: string
- name: hindi_category
dtype: string
splits:
- name: train
num_bytes: 38525353
num_examples: 15010
download_size: 18858317
dataset_size: 38525353
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
tags:
- hindi
---
# Summary
`aaditya/databricks-dolly-15k-Hindi` is an open-source Hindi version of the databricks/databricks-dolly-15k dataset.
This dataset can be used for any purpose, whether academic or commercial, under the terms of the
[Creative Commons Attribution-ShareAlike 3.0 Unported License](https://creativecommons.org/licenses/by-sa/3.0/legalcode).
Supported Tasks:
- Training LLMs
- Synthetic Data Generation
- Data Augmentation
Languages: Hindi
Version: 1.0
Original dataset repo:
https://huggingface.co/datasets/databricks/databricks-dolly-15k
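For instruction tuning, each record can be flattened into a single prompt string. A minimal sketch using the field names from the schema above; the Alpaca-style template is our choice (not mandated by the dataset), and the sample record is made up for illustration:

```python
# Format one record into an instruction-tuning prompt.
# Field names follow the dataset schema; the sample record is illustrative.
def format_record(rec: dict) -> str:
    prompt = f"### Instruction:\n{rec['hindi_instruction']}\n"
    if rec.get("hindi_input"):  # the input field may be empty
        prompt += f"### Input:\n{rec['hindi_input']}\n"
    prompt += f"### Response:\n{rec['hindi_output']}"
    return prompt

sample = {
    "hindi_instruction": "फ्रांस की राजधानी क्या है?",
    "hindi_input": "",
    "hindi_output": "फ्रांस की राजधानी पेरिस है।",
}
print(format_record(sample))
```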
# Citation
```
@misc {dolly_hindi,
author = { Pal, Ankit },
title = { databricks-dolly-15k-Hindi},
year = 2024,
url = { https://huggingface.co/datasets/aaditya/databricks-dolly-15k-Hindi },
doi = { 10.57967/hf/1676 },
publisher = { Hugging Face }
}
``` |
CyberHarem/hibiki_azurlane | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of hibiki/響/响 (Azur Lane)
This is the dataset of hibiki/響/响 (Azur Lane), containing 72 images and their tags.
The core tags of this character are `horns, long_hair, red_eyes, grey_hair, hair_ornament, oni_horns, ahoge`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:-----------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 72 | 119.14 MiB | [Download](https://huggingface.co/datasets/CyberHarem/hibiki_azurlane/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 72 | 61.33 MiB | [Download](https://huggingface.co/datasets/CyberHarem/hibiki_azurlane/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 194 | 135.80 MiB | [Download](https://huggingface.co/datasets/CyberHarem/hibiki_azurlane/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 72 | 101.51 MiB | [Download](https://huggingface.co/datasets/CyberHarem/hibiki_azurlane/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 194 | 200.59 MiB | [Download](https://huggingface.co/datasets/CyberHarem/hibiki_azurlane/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
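For the IMG+TXT packages, each image ships with a same-stem `.txt` file holding its comma-separated tags. A minimal pairing sketch; the file layout below is simulated with a throwaway directory, standing in for an extracted `dataset-800.zip`:

```python
import tempfile
from pathlib import Path

def pair_img_txt(dataset_dir):
    """Pair each image with the tag string from its sibling .txt file."""
    pairs = []
    for img in sorted(Path(dataset_dir).iterdir()):
        if img.suffix.lower() in {".png", ".jpg", ".jpeg", ".webp"}:
            txt = img.with_suffix(".txt")
            tags = txt.read_text(encoding="utf-8").strip() if txt.exists() else ""
            pairs.append((img.name, tags))
    return pairs

# Simulated extracted package (illustrative file names and tags).
with tempfile.TemporaryDirectory() as d:
    Path(d, "0001.png").write_bytes(b"")
    Path(d, "0001.txt").write_text("1girl, horns, long_hair", encoding="utf-8")
    pairs = pair_img_txt(d)
print(pairs)
```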
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need it, just run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/hibiki_azurlane',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering results; some outfits may be mined from these clusters.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 12 |  |  |  |  |  | 1girl, fur_trim, open_mouth, red_skirt, sarashi, smile, solo, white_thighhighs, japanese_clothes, looking_at_viewer, midriff, navel, pleated_skirt, cowboy_shot, long_sleeves, white_background, wide_sleeves, blush, miniskirt, zettai_ryouiki, :3, simple_background, stomach, choker, skindentation, thigh_strap, thighs |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | fur_trim | open_mouth | red_skirt | sarashi | smile | solo | white_thighhighs | japanese_clothes | looking_at_viewer | midriff | navel | pleated_skirt | cowboy_shot | long_sleeves | white_background | wide_sleeves | blush | miniskirt | zettai_ryouiki | :3 | simple_background | stomach | choker | skindentation | thigh_strap | thighs |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:-----------|:-------------|:------------|:----------|:--------|:-------|:-------------------|:-------------------|:--------------------|:----------|:--------|:----------------|:--------------|:---------------|:-------------------|:---------------|:--------|:------------|:-----------------|:-----|:--------------------|:----------|:---------|:----------------|:--------------|:---------|
| 0 | 12 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X |
|
kopyl/fucked-icons-dataset-1024 | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
- split: test
path: data/test-*
dataset_info:
features:
- name: image
dtype: image
- name: text
dtype: string
splits:
- name: train
num_bytes: 3914349.0
num_examples: 110
- name: validation
num_bytes: 34051.0
num_examples: 1
- name: test
num_bytes: 811457.0
num_examples: 44
download_size: 3972613
dataset_size: 4759857.0
---
# Dataset Card for "icons-dataset-1024"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
tornadijo/genei | ---
license: mit
---
|
LambdaTests/VQAv2Validation_ViT_H_14_A_T_C_Q_benchmarks_partition_global_5_500 | ---
dataset_info:
features:
- name: id
dtype: int64
- name: response
dtype: string
splits:
- name: train
num_bytes: 1801
num_examples: 63
download_size: 0
dataset_size: 1801
---
# Dataset Card for "VQAv2Validation_ViT_H_14_A_T_C_Q_benchmarks_partition_global_5_500"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
kristmh/rust_testset | ---
configs:
- config_name: default
data_files:
- split: test
path: data/test-*
dataset_info:
features:
- name: text_clean
dtype: string
- name: label
dtype: int64
splits:
- name: test
num_bytes: 4269204
num_examples: 1570
download_size: 1578842
dataset_size: 4269204
---
# Dataset Card for "rust_testset"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
otheng03/test1 | ---
license: apache-2.0
---
# Title 1
hahahoho
|
yzhuang/metatree_fri_c1_1000_5 | ---
dataset_info:
features:
- name: id
dtype: int64
- name: X
sequence: float64
- name: y
dtype: int64
splits:
- name: train
num_bytes: 42120
num_examples: 702
- name: validation
num_bytes: 17880
num_examples: 298
download_size: 56736
dataset_size: 60000
---
# Dataset Card for "metatree_fri_c1_1000_5"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
zardat/Data_red | ---
dataset_info:
features:
- name: x
sequence:
sequence: float32
- name: edge_index
sequence:
sequence: float32
- name: edge_attr
sequence: float32
- name: y
dtype: float32
splits:
- name: train
num_bytes: 151462112
num_examples: 1018
download_size: 3005527
dataset_size: 151462112
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "Data_red"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
CVasNLPExperiments/fairness_chef_google_flan_t5_xl_mode_T_SPECIFIC_A_ns_4800 | ---
dataset_info:
features:
- name: id
dtype: int64
- name: prompt
dtype: string
- name: true_label
dtype: string
- name: scores
sequence: float64
- name: prediction
dtype: string
splits:
- name: fewshot_0__Attributes_LAION_ViT_H_14_2B_descriptors_text_davinci_003_full_clip_tags_LAION_ViT_H_14_2B_simple_specific_rices
num_bytes: 2513915
num_examples: 4800
download_size: 236939
dataset_size: 2513915
---
# Dataset Card for "fairness_chef_google_flan_t5_xl_mode_T_SPECIFIC_A_ns_4800"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
vganesh46/10-Ks | ---
license: mit
---
|
lucadiliello/english_wikipedia | ---
dataset_info:
features:
- name: filename
dtype: string
- name: maintext
dtype: string
- name: source_domain
dtype: string
- name: title
dtype: string
- name: url
dtype: string
splits:
- name: train
num_bytes: 10569005563
num_examples: 4184712
download_size: 6144953788
dataset_size: 10569005563
---
# Dataset Card for "english_wikipedia"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
autoevaluate/autoeval-staging-eval-project-imdb-ed2a920e-12445656 | ---
type: predictions
tags:
- autotrain
- evaluation
datasets:
- imdb
eval_info:
task: binary_classification
model: lvwerra/distilbert-imdb
metrics: []
dataset_name: imdb
dataset_config: plain_text
dataset_split: test
col_mapping:
text: text
target: label
---
# Dataset Card for AutoTrain Evaluator
This repository contains model predictions generated by [AutoTrain](https://huggingface.co/autotrain) for the following task and dataset:
* Task: Binary Text Classification
* Model: lvwerra/distilbert-imdb
* Dataset: imdb
* Config: plain_text
* Split: test
To run new evaluation jobs, visit Hugging Face's [automatic model evaluator](https://huggingface.co/spaces/autoevaluate/model-evaluator).
## Contributions
Thanks to [@lvwerra](https://huggingface.co/lvwerra) for evaluating this model. |
thanhduycao/data_soict_train_with_entity | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
dataset_info:
features:
- name: id
dtype: string
- name: audio
struct:
- name: array
sequence: float64
- name: path
dtype: string
- name: sampling_rate
dtype: int64
- name: sentence_norm
dtype: string
splits:
- name: train
num_bytes: 3955020961
num_examples: 11629
- name: test
num_bytes: 389981876
num_examples: 748
download_size: 1036033410
dataset_size: 4345002837
---
# Dataset Card for "data_soict_train_with_entity"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Akass2002/smartvoice | ---
license: creativeml-openrail-m
---
|
huggingartists/the-notorious-big | ---
language:
- en
tags:
- huggingartists
- lyrics
---
# Dataset Card for "huggingartists/the-notorious-big"
## Table of Contents
- [Dataset Description](#dataset-description)
- [Dataset Summary](#dataset-summary)
- [Supported Tasks and Leaderboards](#supported-tasks-and-leaderboards)
- [Languages](#languages)
- [How to use](#how-to-use)
- [Dataset Structure](#dataset-structure)
- [Data Fields](#data-fields)
- [Data Splits](#data-splits)
- [Dataset Creation](#dataset-creation)
- [Curation Rationale](#curation-rationale)
- [Source Data](#source-data)
- [Annotations](#annotations)
- [Personal and Sensitive Information](#personal-and-sensitive-information)
- [Considerations for Using the Data](#considerations-for-using-the-data)
- [Social Impact of Dataset](#social-impact-of-dataset)
- [Discussion of Biases](#discussion-of-biases)
- [Other Known Limitations](#other-known-limitations)
- [Additional Information](#additional-information)
- [Dataset Curators](#dataset-curators)
- [Licensing Information](#licensing-information)
- [Citation Information](#citation-information)
- [About](#about)
## Dataset Description
- **Homepage:** [https://github.com/AlekseyKorshuk/huggingartists](https://github.com/AlekseyKorshuk/huggingartists)
- **Repository:** [https://github.com/AlekseyKorshuk/huggingartists](https://github.com/AlekseyKorshuk/huggingartists)
- **Paper:** [More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
- **Point of Contact:** [More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
- **Size of the generated dataset:** 1.676645 MB
<div class="inline-flex flex-col" style="line-height: 1.5;">
<div class="flex">
<div style="display:DISPLAY_1; margin-left: auto; margin-right: auto; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url('https://images.genius.com/664976b54a605d6ac0df2415a8ccac16.564x564x1.jpg')">
</div>
</div>
<a href="https://huggingface.co/huggingartists/the-notorious-big">
<div style="text-align: center; margin-top: 3px; font-size: 16px; font-weight: 800">🤖 HuggingArtists Model 🤖</div>
</a>
<div style="text-align: center; font-size: 16px; font-weight: 800">The Notorious B.I.G.</div>
<a href="https://genius.com/artists/the-notorious-big">
<div style="text-align: center; font-size: 14px;">@the-notorious-big</div>
</a>
</div>
### Dataset Summary
A lyrics dataset parsed from Genius, designed for generating lyrics with HuggingArtists.
Model is available [here](https://huggingface.co/huggingartists/the-notorious-big).
### Supported Tasks and Leaderboards
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Languages
en
## How to use
How to load this dataset directly with the datasets library:
```python
from datasets import load_dataset
dataset = load_dataset("huggingartists/the-notorious-big")
```
## Dataset Structure
An example of 'train' looks as follows.
```
This example was too long and was cropped:
{
"text": "Look, I was gonna go easy on you\nNot to hurt your feelings\nBut I'm only going to get this one chance\nSomething's wrong, I can feel it..."
}
```
### Data Fields
The data fields are the same among all splits.
- `text`: a `string` feature.
### Data Splits
| train |validation|test|
|------:|---------:|---:|
|592| -| -|
The 'train' split can easily be divided into 'train', 'validation', and 'test' with a few lines of code:
```python
from datasets import load_dataset, Dataset, DatasetDict
import numpy as np
datasets = load_dataset("huggingartists/the-notorious-big")
train_percentage = 0.9
validation_percentage = 0.07
test_percentage = 0.03
split_points = [
    int(len(datasets['train']['text']) * train_percentage),
    int(len(datasets['train']['text']) * (train_percentage + validation_percentage)),
]
train, validation, test = np.split(datasets['train']['text'], split_points)
datasets = DatasetDict(
{
'train': Dataset.from_dict({'text': list(train)}),
'validation': Dataset.from_dict({'text': list(validation)}),
'test': Dataset.from_dict({'text': list(test)})
}
)
```
## Dataset Creation
### Curation Rationale
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
#### Who are the source language producers?
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Annotations
#### Annotation process
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
#### Who are the annotators?
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Personal and Sensitive Information
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Discussion of Biases
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Other Known Limitations
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
## Additional Information
### Dataset Curators
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Licensing Information
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Citation Information
```
@InProceedings{huggingartists,
  author = {Aleksey Korshuk},
  year   = {2021}
}
```
## About
*Built by Aleksey Korshuk*
[](https://github.com/AlekseyKorshuk)
[](https://twitter.com/intent/follow?screen_name=alekseykorshuk)
[](https://t.me/joinchat/_CQ04KjcJ-4yZTky)
For more details, visit the project repository.
[](https://github.com/AlekseyKorshuk/huggingartists)
|