| datasetId | card |
|---|---|
leftattention/DST_MultiWOZ_seq2seq | ---
license: apache-2.0
---
|
CyberHarem/yuki_touhou | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of yuki/ユキ (Touhou)
This is the dataset of yuki/ユキ (Touhou), containing 112 images and their tags.
The core tags of this character are `blonde_hair, hat, yellow_eyes, short_hair, bow, ribbon, hat_bow, black_headwear`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:-------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 112 | 75.55 MiB | [Download](https://huggingface.co/datasets/CyberHarem/yuki_touhou/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 112 | 55.88 MiB | [Download](https://huggingface.co/datasets/CyberHarem/yuki_touhou/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 178 | 97.07 MiB | [Download](https://huggingface.co/datasets/CyberHarem/yuki_touhou/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 112 | 70.36 MiB | [Download](https://huggingface.co/datasets/CyberHarem/yuki_touhou/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 178 | 120.85 MiB | [Download](https://huggingface.co/datasets/CyberHarem/yuki_touhou/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
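Each Download link in the table follows the same `resolve/main` pattern, so the direct URL for any package can be built programmatically. A minimal sketch (the `package_url` helper is our own, inferred from the links above):

```python
def package_url(repo_id: str, name: str) -> str:
    """Build the direct download URL for a dataset package archive,
    following the pattern of the Download links in the table above
    (name is e.g. 'raw', '800', 'stage3-p480-1200')."""
    return (f'https://huggingface.co/datasets/{repo_id}'
            f'/resolve/main/dataset-{name}.zip')

# For example, the 800px IMG+TXT package:
print(package_url('CyberHarem/yuki_touhou', '800'))
# → https://huggingface.co/datasets/CyberHarem/yuki_touhou/resolve/main/dataset-800.zip
```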
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for loading with [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html). If you need it, just run the following code:
```python
import os
import zipfile

from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource

# download raw archive file
zip_file = hf_hub_download(
    repo_id='CyberHarem/yuki_touhou',
    repo_type='dataset',
    filename='dataset-raw.zip',
)

# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
    zf.extractall(dataset_dir)

# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
    print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering results; some outfits may be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:-----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 10 |  |  |  |  |  | black_skirt, black_vest, puffy_short_sleeves, white_bow, white_shirt, smile, fedora, white_ribbon, 1girl, bangs, solo, collared_shirt, open_mouth, simple_background, shoes, socks, back_bow, black_footwear, breasts, closed_mouth, collared_vest, frills, full_body, hand_on_hip, white_background |
| 1 | 7 |  |  |  |  |  | 1girl, solo, grin |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | black_skirt | black_vest | puffy_short_sleeves | white_bow | white_shirt | smile | fedora | white_ribbon | 1girl | bangs | solo | collared_shirt | open_mouth | simple_background | shoes | socks | back_bow | black_footwear | breasts | closed_mouth | collared_vest | frills | full_body | hand_on_hip | white_background | grin |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------|:-------------|:----------------------|:------------|:--------------|:--------|:---------|:---------------|:--------|:--------|:-------|:-----------------|:-------------|:--------------------|:--------|:--------|:-----------|:-----------------|:----------|:---------------|:----------------|:---------|:------------|:--------------|:-------------------|:-------|
| 0 | 10 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | |
| 1 | 7 |  |  |  |  |  | | | | | | | | | X | | X | | | | | | | | | | | | | | | X |
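Since the IMG+TXT packages pair each image with a same-named `.txt` file of comma-separated tags, the clusters above can be recovered (or refined) locally by filtering on tags. A minimal sketch under that assumption (`find_images_with_tags` is a hypothetical helper, not part of waifuc):

```python
import os

def find_images_with_tags(dataset_dir: str, required_tags) -> list:
    """Return the stems of all images in an extracted IMG+TXT package
    whose tag file contains every tag in required_tags."""
    matches = []
    for fname in os.listdir(dataset_dir):
        if not fname.endswith('.txt'):
            continue
        with open(os.path.join(dataset_dir, fname), encoding='utf-8') as f:
            tags = {t.strip() for t in f.read().split(',')}
        if set(required_tags) <= tags:
            matches.append(fname[:-4])  # strip the '.txt' suffix
    return sorted(matches)

# e.g. images matching cluster 1 above:
# find_images_with_tags('dataset_dir', ['1girl', 'solo', 'grin'])
```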
|
open-llm-leaderboard/details_Sao10K__Medusa-13b | ---
pretty_name: Evaluation run of Sao10K/Medusa-13b
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [Sao10K/Medusa-13b](https://huggingface.co/Sao10K/Medusa-13b) on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 64 configurations, each one corresponding to one of\
\ the evaluated tasks.\n\nThe dataset has been created from 2 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Sao10K__Medusa-13b\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-09-22T23:00:36.340269](https://huggingface.co/datasets/open-llm-leaderboard/details_Sao10K__Medusa-13b/blob/main/results_2023-09-22T23-00-36.340269.json) (note\
\ that there might be results for other tasks in the repo if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.08682885906040269,\n\
\ \"em_stderr\": 0.0028836847948924805,\n \"f1\": 0.20613359899328837,\n\
\ \"f1_stderr\": 0.003265939806465616,\n \"acc\": 0.4007308040520042,\n\
\ \"acc_stderr\": 0.009687702523105881\n },\n \"harness|drop|3\": {\n\
\ \"em\": 0.08682885906040269,\n \"em_stderr\": 0.0028836847948924805,\n\
\ \"f1\": 0.20613359899328837,\n \"f1_stderr\": 0.003265939806465616\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.06823351023502654,\n \
\ \"acc_stderr\": 0.006945358944067429\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7332280978689818,\n \"acc_stderr\": 0.012430046102144333\n\
\ }\n}\n```"
repo_url: https://huggingface.co/Sao10K/Medusa-13b
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_08_28T23_11_54.790657
path:
- '**/details_harness|arc:challenge|25_2023-08-28T23:11:54.790657.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-08-28T23:11:54.790657.parquet'
- config_name: harness_drop_3
data_files:
- split: 2023_09_22T23_00_36.340269
path:
- '**/details_harness|drop|3_2023-09-22T23-00-36.340269.parquet'
- split: latest
path:
- '**/details_harness|drop|3_2023-09-22T23-00-36.340269.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_09_22T23_00_36.340269
path:
- '**/details_harness|gsm8k|5_2023-09-22T23-00-36.340269.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-09-22T23-00-36.340269.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_08_28T23_11_54.790657
path:
- '**/details_harness|hellaswag|10_2023-08-28T23:11:54.790657.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-08-28T23:11:54.790657.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_08_28T23_11_54.790657
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-28T23:11:54.790657.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-28T23:11:54.790657.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-28T23:11:54.790657.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-28T23:11:54.790657.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-28T23:11:54.790657.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-28T23:11:54.790657.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-28T23:11:54.790657.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-28T23:11:54.790657.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-28T23:11:54.790657.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-28T23:11:54.790657.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-28T23:11:54.790657.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-28T23:11:54.790657.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-28T23:11:54.790657.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-28T23:11:54.790657.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-28T23:11:54.790657.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-28T23:11:54.790657.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-28T23:11:54.790657.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-28T23:11:54.790657.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-28T23:11:54.790657.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-28T23:11:54.790657.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-28T23:11:54.790657.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-28T23:11:54.790657.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-28T23:11:54.790657.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-28T23:11:54.790657.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-28T23:11:54.790657.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-28T23:11:54.790657.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-28T23:11:54.790657.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-28T23:11:54.790657.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-28T23:11:54.790657.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-28T23:11:54.790657.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-28T23:11:54.790657.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-28T23:11:54.790657.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-28T23:11:54.790657.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-28T23:11:54.790657.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-28T23:11:54.790657.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-28T23:11:54.790657.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-28T23:11:54.790657.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-28T23:11:54.790657.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-28T23:11:54.790657.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-28T23:11:54.790657.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-28T23:11:54.790657.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-28T23:11:54.790657.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-28T23:11:54.790657.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-28T23:11:54.790657.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-28T23:11:54.790657.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-28T23:11:54.790657.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-28T23:11:54.790657.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-28T23:11:54.790657.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-28T23:11:54.790657.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-28T23:11:54.790657.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-28T23:11:54.790657.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-28T23:11:54.790657.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-28T23:11:54.790657.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-28T23:11:54.790657.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-28T23:11:54.790657.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-28T23:11:54.790657.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-28T23:11:54.790657.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-28T23:11:54.790657.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-28T23:11:54.790657.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-28T23:11:54.790657.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-28T23:11:54.790657.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-28T23:11:54.790657.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-28T23:11:54.790657.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-28T23:11:54.790657.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-28T23:11:54.790657.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-28T23:11:54.790657.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-28T23:11:54.790657.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-28T23:11:54.790657.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-28T23:11:54.790657.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-28T23:11:54.790657.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-28T23:11:54.790657.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-28T23:11:54.790657.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-28T23:11:54.790657.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-28T23:11:54.790657.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-28T23:11:54.790657.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-28T23:11:54.790657.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-28T23:11:54.790657.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-28T23:11:54.790657.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-28T23:11:54.790657.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-28T23:11:54.790657.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-28T23:11:54.790657.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-28T23:11:54.790657.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-28T23:11:54.790657.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-28T23:11:54.790657.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-28T23:11:54.790657.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-28T23:11:54.790657.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-28T23:11:54.790657.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-28T23:11:54.790657.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-28T23:11:54.790657.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-28T23:11:54.790657.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-28T23:11:54.790657.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-28T23:11:54.790657.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-28T23:11:54.790657.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-28T23:11:54.790657.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-28T23:11:54.790657.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-28T23:11:54.790657.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-28T23:11:54.790657.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-28T23:11:54.790657.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-28T23:11:54.790657.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-28T23:11:54.790657.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-28T23:11:54.790657.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-28T23:11:54.790657.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-28T23:11:54.790657.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-28T23:11:54.790657.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-28T23:11:54.790657.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-28T23:11:54.790657.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-28T23:11:54.790657.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-28T23:11:54.790657.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-28T23:11:54.790657.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-28T23:11:54.790657.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-28T23:11:54.790657.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-28T23:11:54.790657.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-28T23:11:54.790657.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-28T23:11:54.790657.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_08_28T23_11_54.790657
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-28T23:11:54.790657.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-28T23:11:54.790657.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_08_28T23_11_54.790657
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-28T23:11:54.790657.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-28T23:11:54.790657.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_08_28T23_11_54.790657
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-28T23:11:54.790657.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-28T23:11:54.790657.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_08_28T23_11_54.790657
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-28T23:11:54.790657.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-28T23:11:54.790657.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_08_28T23_11_54.790657
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-28T23:11:54.790657.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-28T23:11:54.790657.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_08_28T23_11_54.790657
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-28T23:11:54.790657.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-28T23:11:54.790657.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_08_28T23_11_54.790657
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-28T23:11:54.790657.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-28T23:11:54.790657.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_08_28T23_11_54.790657
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-28T23:11:54.790657.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-28T23:11:54.790657.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_08_28T23_11_54.790657
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-28T23:11:54.790657.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-28T23:11:54.790657.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_08_28T23_11_54.790657
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-28T23:11:54.790657.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-28T23:11:54.790657.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_08_28T23_11_54.790657
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-28T23:11:54.790657.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-28T23:11:54.790657.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_08_28T23_11_54.790657
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-28T23:11:54.790657.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-28T23:11:54.790657.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_08_28T23_11_54.790657
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-28T23:11:54.790657.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-28T23:11:54.790657.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_08_28T23_11_54.790657
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-28T23:11:54.790657.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-28T23:11:54.790657.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_08_28T23_11_54.790657
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-28T23:11:54.790657.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-28T23:11:54.790657.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_08_28T23_11_54.790657
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-28T23:11:54.790657.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-28T23:11:54.790657.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_08_28T23_11_54.790657
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-28T23:11:54.790657.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-28T23:11:54.790657.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_08_28T23_11_54.790657
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-28T23:11:54.790657.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-28T23:11:54.790657.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_08_28T23_11_54.790657
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-28T23:11:54.790657.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-28T23:11:54.790657.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_08_28T23_11_54.790657
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-28T23:11:54.790657.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-28T23:11:54.790657.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_08_28T23_11_54.790657
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-28T23:11:54.790657.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-28T23:11:54.790657.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_08_28T23_11_54.790657
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-28T23:11:54.790657.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-28T23:11:54.790657.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_08_28T23_11_54.790657
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-28T23:11:54.790657.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-28T23:11:54.790657.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_08_28T23_11_54.790657
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-28T23:11:54.790657.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-28T23:11:54.790657.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_08_28T23_11_54.790657
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-28T23:11:54.790657.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-28T23:11:54.790657.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_08_28T23_11_54.790657
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-28T23:11:54.790657.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-28T23:11:54.790657.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_08_28T23_11_54.790657
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-28T23:11:54.790657.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-28T23:11:54.790657.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_08_28T23_11_54.790657
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-28T23:11:54.790657.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-28T23:11:54.790657.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_08_28T23_11_54.790657
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-28T23:11:54.790657.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-28T23:11:54.790657.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_08_28T23_11_54.790657
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-28T23:11:54.790657.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-28T23:11:54.790657.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_08_28T23_11_54.790657
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-28T23:11:54.790657.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-28T23:11:54.790657.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_08_28T23_11_54.790657
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-28T23:11:54.790657.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-28T23:11:54.790657.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_08_28T23_11_54.790657
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-28T23:11:54.790657.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-28T23:11:54.790657.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_08_28T23_11_54.790657
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-28T23:11:54.790657.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-28T23:11:54.790657.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_08_28T23_11_54.790657
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-28T23:11:54.790657.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-28T23:11:54.790657.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_08_28T23_11_54.790657
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-28T23:11:54.790657.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-28T23:11:54.790657.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_08_28T23_11_54.790657
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-28T23:11:54.790657.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-28T23:11:54.790657.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_08_28T23_11_54.790657
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-28T23:11:54.790657.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-28T23:11:54.790657.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_08_28T23_11_54.790657
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-28T23:11:54.790657.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-28T23:11:54.790657.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_08_28T23_11_54.790657
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-28T23:11:54.790657.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-28T23:11:54.790657.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_08_28T23_11_54.790657
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-28T23:11:54.790657.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-28T23:11:54.790657.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_08_28T23_11_54.790657
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-28T23:11:54.790657.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-28T23:11:54.790657.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_08_28T23_11_54.790657
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-28T23:11:54.790657.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-28T23:11:54.790657.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_08_28T23_11_54.790657
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-28T23:11:54.790657.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-28T23:11:54.790657.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_08_28T23_11_54.790657
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-28T23:11:54.790657.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-28T23:11:54.790657.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_08_28T23_11_54.790657
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-28T23:11:54.790657.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-28T23:11:54.790657.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_08_28T23_11_54.790657
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-28T23:11:54.790657.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-28T23:11:54.790657.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_08_28T23_11_54.790657
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-28T23:11:54.790657.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-28T23:11:54.790657.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_08_28T23_11_54.790657
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-28T23:11:54.790657.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-28T23:11:54.790657.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_08_28T23_11_54.790657
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-28T23:11:54.790657.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-28T23:11:54.790657.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_08_28T23_11_54.790657
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-28T23:11:54.790657.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-28T23:11:54.790657.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_08_28T23_11_54.790657
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-28T23:11:54.790657.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-28T23:11:54.790657.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_08_28T23_11_54.790657
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-28T23:11:54.790657.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-28T23:11:54.790657.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_08_28T23_11_54.790657
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-28T23:11:54.790657.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-28T23:11:54.790657.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_08_28T23_11_54.790657
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-28T23:11:54.790657.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-28T23:11:54.790657.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_08_28T23_11_54.790657
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-28T23:11:54.790657.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-28T23:11:54.790657.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_08_28T23_11_54.790657
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-28T23:11:54.790657.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-28T23:11:54.790657.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_08_28T23_11_54.790657
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-28T23:11:54.790657.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-28T23:11:54.790657.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_09_22T23_00_36.340269
path:
- '**/details_harness|winogrande|5_2023-09-22T23-00-36.340269.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-09-22T23-00-36.340269.parquet'
- config_name: results
data_files:
- split: 2023_08_28T23_11_54.790657
path:
- results_2023-08-28T23:11:54.790657.parquet
- split: 2023_09_22T23_00_36.340269
path:
- results_2023-09-22T23-00-36.340269.parquet
- split: latest
path:
- results_2023-09-22T23-00-36.340269.parquet
---
# Dataset Card for Evaluation run of Sao10K/Medusa-13b
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/Sao10K/Medusa-13b
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [Sao10K/Medusa-13b](https://huggingface.co/Sao10K/Medusa-13b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the most recent results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_Sao10K__Medusa-13b",
"harness_winogrande_5",
split="latest")
```
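The split names follow the run timestamps visible in the configs above: a run timestamp such as `2023-09-22T23:00:36.340269` maps to the split name `2023_09_22T23_00_36.340269`. A minimal helper sketching this convention (inferred from this card's configs, not an official `datasets` API):

```python
def run_timestamp_to_split(ts: str) -> str:
    """Map a run timestamp to its split name in this dataset.

    Per the naming convention in the configs above, dashes and colons
    become underscores; the fractional seconds are kept as-is.
    """
    return ts.replace("-", "_").replace(":", "_")

print(run_timestamp_to_split("2023-09-22T23:00:36.340269"))
# -> 2023_09_22T23_00_36.340269
```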
## Latest results
These are the [latest results from run 2023-09-22T23:00:36.340269](https://huggingface.co/datasets/open-llm-leaderboard/details_Sao10K__Medusa-13b/blob/main/results_2023-09-22T23-00-36.340269.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"em": 0.08682885906040269,
"em_stderr": 0.0028836847948924805,
"f1": 0.20613359899328837,
"f1_stderr": 0.003265939806465616,
"acc": 0.4007308040520042,
"acc_stderr": 0.009687702523105881
},
"harness|drop|3": {
"em": 0.08682885906040269,
"em_stderr": 0.0028836847948924805,
"f1": 0.20613359899328837,
"f1_stderr": 0.003265939806465616
},
"harness|gsm8k|5": {
"acc": 0.06823351023502654,
"acc_stderr": 0.006945358944067429
},
"harness|winogrande|5": {
"acc": 0.7332280978689818,
"acc_stderr": 0.012430046102144333
}
}
```
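Judging from the numbers above, the `acc` reported under `all` is the unweighted mean of the per-task `acc` values (here gsm8k and winogrande). A quick sketch checking that, using the values from this run:

```python
# Per-task "acc" values copied from the results above.
task_acc = {
    "harness|gsm8k|5": 0.06823351023502654,
    "harness|winogrande|5": 0.7332280978689818,
}

# The "all" aggregate appears to be the plain mean over tasks.
mean_acc = sum(task_acc.values()) / len(task_acc)
print(mean_acc)  # ~0.4007308040520042, matching the "all" acc above
```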
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
HuggingFaceH4/cai-conversation-harmless-old | ---
configs:
- config_name: default
data_files:
- split: test_sft
path: data/test_sft-*
- split: test_prefs
path: data/test_prefs-*
- split: train_sft
path: data/train_sft-*
- split: train_prefs
path: data/train_prefs-*
dataset_info:
features:
- name: index
dtype: int64
- name: prompt
dtype: string
- name: init_prompt
struct:
- name: content
dtype: string
- name: role
dtype: string
- name: init_response
struct:
- name: content
dtype: string
- name: role
dtype: string
- name: critic_prompt
struct:
- name: content
dtype: string
- name: role
dtype: string
- name: critic_response
struct:
- name: content
dtype: string
- name: role
dtype: string
- name: revision_prompt
struct:
- name: content
dtype: string
- name: role
dtype: string
- name: revision_response
struct:
- name: content
dtype: string
- name: role
dtype: string
- name: messages
list:
- name: content
dtype: string
- name: role
dtype: string
- name: chosen
list:
- name: content
dtype: string
- name: role
dtype: string
- name: rejected
list:
- name: content
dtype: string
- name: role
dtype: string
splits:
- name: test_sft
num_bytes: 3784085
num_examples: 1156
- name: test_prefs
num_bytes: 3816979
num_examples: 1156
- name: train_sft
num_bytes: 68593562
num_examples: 21268
- name: train_prefs
num_bytes: 68442112
num_examples: 21269
download_size: 61229214
dataset_size: 144636738
---
# Dataset Card for "cai-conversation-harmless"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
open-llm-leaderboard/details_Azazelle__Yuna-7b-Merge | ---
pretty_name: Evaluation run of Azazelle/Yuna-7b-Merge
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [Azazelle/Yuna-7b-Merge](https://huggingface.co/Azazelle/Yuna-7b-Merge) on the\
\ [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"latest\" split always points to the most recent results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Azazelle__Yuna-7b-Merge\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-01-06T01:45:22.523771](https://huggingface.co/datasets/open-llm-leaderboard/details_Azazelle__Yuna-7b-Merge/blob/main/results_2024-01-06T01-45-22.523771.json) (note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6522995558970252,\n\
\ \"acc_stderr\": 0.03212085691453914,\n \"acc_norm\": 0.6527565578888866,\n\
\ \"acc_norm_stderr\": 0.03277773803115889,\n \"mc1\": 0.4455324357405141,\n\
\ \"mc1_stderr\": 0.017399335280140357,\n \"mc2\": 0.6120244185533295,\n\
\ \"mc2_stderr\": 0.015634404794374702\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.6476109215017065,\n \"acc_stderr\": 0.013960142600598672,\n\
\ \"acc_norm\": 0.6749146757679181,\n \"acc_norm_stderr\": 0.013688147309729129\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6896036646086438,\n\
\ \"acc_stderr\": 0.004617103280372032,\n \"acc_norm\": 0.8683529177454691,\n\
\ \"acc_norm_stderr\": 0.0033741568675916727\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.38,\n \"acc_stderr\": 0.04878317312145632,\n \
\ \"acc_norm\": 0.38,\n \"acc_norm_stderr\": 0.04878317312145632\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6296296296296297,\n\
\ \"acc_stderr\": 0.041716541613545426,\n \"acc_norm\": 0.6296296296296297,\n\
\ \"acc_norm_stderr\": 0.041716541613545426\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.6644736842105263,\n \"acc_stderr\": 0.038424985593952694,\n\
\ \"acc_norm\": 0.6644736842105263,\n \"acc_norm_stderr\": 0.038424985593952694\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.6,\n\
\ \"acc_stderr\": 0.04923659639173309,\n \"acc_norm\": 0.6,\n \
\ \"acc_norm_stderr\": 0.04923659639173309\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.7132075471698113,\n \"acc_stderr\": 0.027834912527544067,\n\
\ \"acc_norm\": 0.7132075471698113,\n \"acc_norm_stderr\": 0.027834912527544067\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.75,\n\
\ \"acc_stderr\": 0.03621034121889507,\n \"acc_norm\": 0.75,\n \
\ \"acc_norm_stderr\": 0.03621034121889507\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.48,\n \"acc_stderr\": 0.050211673156867795,\n \
\ \"acc_norm\": 0.48,\n \"acc_norm_stderr\": 0.050211673156867795\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"\
acc\": 0.51,\n \"acc_stderr\": 0.05024183937956911,\n \"acc_norm\"\
: 0.51,\n \"acc_norm_stderr\": 0.05024183937956911\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.32,\n \"acc_stderr\": 0.04688261722621504,\n \
\ \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.04688261722621504\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6647398843930635,\n\
\ \"acc_stderr\": 0.03599586301247078,\n \"acc_norm\": 0.6647398843930635,\n\
\ \"acc_norm_stderr\": 0.03599586301247078\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.43137254901960786,\n \"acc_stderr\": 0.04928099597287534,\n\
\ \"acc_norm\": 0.43137254901960786,\n \"acc_norm_stderr\": 0.04928099597287534\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.73,\n \"acc_stderr\": 0.044619604333847394,\n \"acc_norm\": 0.73,\n\
\ \"acc_norm_stderr\": 0.044619604333847394\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.5914893617021276,\n \"acc_stderr\": 0.032134180267015755,\n\
\ \"acc_norm\": 0.5914893617021276,\n \"acc_norm_stderr\": 0.032134180267015755\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.5,\n\
\ \"acc_stderr\": 0.047036043419179864,\n \"acc_norm\": 0.5,\n \
\ \"acc_norm_stderr\": 0.047036043419179864\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5448275862068965,\n \"acc_stderr\": 0.04149886942192117,\n\
\ \"acc_norm\": 0.5448275862068965,\n \"acc_norm_stderr\": 0.04149886942192117\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.3941798941798942,\n \"acc_stderr\": 0.02516798233389414,\n \"\
acc_norm\": 0.3941798941798942,\n \"acc_norm_stderr\": 0.02516798233389414\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.49206349206349204,\n\
\ \"acc_stderr\": 0.044715725362943486,\n \"acc_norm\": 0.49206349206349204,\n\
\ \"acc_norm_stderr\": 0.044715725362943486\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.33,\n \"acc_stderr\": 0.047258156262526045,\n \
\ \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.047258156262526045\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\"\
: 0.7967741935483871,\n \"acc_stderr\": 0.022891687984554963,\n \"\
acc_norm\": 0.7967741935483871,\n \"acc_norm_stderr\": 0.022891687984554963\n\
\ },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\"\
: 0.5024630541871922,\n \"acc_stderr\": 0.035179450386910616,\n \"\
acc_norm\": 0.5024630541871922,\n \"acc_norm_stderr\": 0.035179450386910616\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.7,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\"\
: 0.7,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7636363636363637,\n \"acc_stderr\": 0.03317505930009181,\n\
\ \"acc_norm\": 0.7636363636363637,\n \"acc_norm_stderr\": 0.03317505930009181\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.7929292929292929,\n \"acc_stderr\": 0.028869778460267042,\n \"\
acc_norm\": 0.7929292929292929,\n \"acc_norm_stderr\": 0.028869778460267042\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.8963730569948186,\n \"acc_stderr\": 0.021995311963644237,\n\
\ \"acc_norm\": 0.8963730569948186,\n \"acc_norm_stderr\": 0.021995311963644237\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.6666666666666666,\n \"acc_stderr\": 0.023901157979402534,\n\
\ \"acc_norm\": 0.6666666666666666,\n \"acc_norm_stderr\": 0.023901157979402534\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.3592592592592593,\n \"acc_stderr\": 0.029252905927251972,\n \
\ \"acc_norm\": 0.3592592592592593,\n \"acc_norm_stderr\": 0.029252905927251972\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.7142857142857143,\n \"acc_stderr\": 0.02934457250063434,\n \
\ \"acc_norm\": 0.7142857142857143,\n \"acc_norm_stderr\": 0.02934457250063434\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.33112582781456956,\n \"acc_stderr\": 0.038425817186598696,\n \"\
acc_norm\": 0.33112582781456956,\n \"acc_norm_stderr\": 0.038425817186598696\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8440366972477065,\n \"acc_stderr\": 0.01555580271359017,\n \"\
acc_norm\": 0.8440366972477065,\n \"acc_norm_stderr\": 0.01555580271359017\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.5555555555555556,\n \"acc_stderr\": 0.03388857118502325,\n \"\
acc_norm\": 0.5555555555555556,\n \"acc_norm_stderr\": 0.03388857118502325\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.8480392156862745,\n \"acc_stderr\": 0.0251956584289318,\n \"acc_norm\"\
: 0.8480392156862745,\n \"acc_norm_stderr\": 0.0251956584289318\n },\n\
\ \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\":\
\ 0.8185654008438819,\n \"acc_stderr\": 0.025085961144579654,\n \"\
acc_norm\": 0.8185654008438819,\n \"acc_norm_stderr\": 0.025085961144579654\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.7085201793721974,\n\
\ \"acc_stderr\": 0.03050028317654585,\n \"acc_norm\": 0.7085201793721974,\n\
\ \"acc_norm_stderr\": 0.03050028317654585\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.7786259541984732,\n \"acc_stderr\": 0.036412970813137276,\n\
\ \"acc_norm\": 0.7786259541984732,\n \"acc_norm_stderr\": 0.036412970813137276\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.7768595041322314,\n \"acc_stderr\": 0.03800754475228732,\n \"\
acc_norm\": 0.7768595041322314,\n \"acc_norm_stderr\": 0.03800754475228732\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7962962962962963,\n\
\ \"acc_stderr\": 0.03893542518824847,\n \"acc_norm\": 0.7962962962962963,\n\
\ \"acc_norm_stderr\": 0.03893542518824847\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7852760736196319,\n \"acc_stderr\": 0.032262193772867744,\n\
\ \"acc_norm\": 0.7852760736196319,\n \"acc_norm_stderr\": 0.032262193772867744\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.48214285714285715,\n\
\ \"acc_stderr\": 0.047427623612430116,\n \"acc_norm\": 0.48214285714285715,\n\
\ \"acc_norm_stderr\": 0.047427623612430116\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.8058252427184466,\n \"acc_stderr\": 0.03916667762822584,\n\
\ \"acc_norm\": 0.8058252427184466,\n \"acc_norm_stderr\": 0.03916667762822584\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8760683760683761,\n\
\ \"acc_stderr\": 0.021586494001281376,\n \"acc_norm\": 0.8760683760683761,\n\
\ \"acc_norm_stderr\": 0.021586494001281376\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.69,\n \"acc_stderr\": 0.04648231987117316,\n \
\ \"acc_norm\": 0.69,\n \"acc_norm_stderr\": 0.04648231987117316\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8416347381864623,\n\
\ \"acc_stderr\": 0.013055346753516729,\n \"acc_norm\": 0.8416347381864623,\n\
\ \"acc_norm_stderr\": 0.013055346753516729\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.7485549132947977,\n \"acc_stderr\": 0.02335736578587403,\n\
\ \"acc_norm\": 0.7485549132947977,\n \"acc_norm_stderr\": 0.02335736578587403\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.441340782122905,\n\
\ \"acc_stderr\": 0.016607021781050876,\n \"acc_norm\": 0.441340782122905,\n\
\ \"acc_norm_stderr\": 0.016607021781050876\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.738562091503268,\n \"acc_stderr\": 0.025160998214292456,\n\
\ \"acc_norm\": 0.738562091503268,\n \"acc_norm_stderr\": 0.025160998214292456\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6977491961414791,\n\
\ \"acc_stderr\": 0.02608270069539966,\n \"acc_norm\": 0.6977491961414791,\n\
\ \"acc_norm_stderr\": 0.02608270069539966\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.7345679012345679,\n \"acc_stderr\": 0.024569223600460845,\n\
\ \"acc_norm\": 0.7345679012345679,\n \"acc_norm_stderr\": 0.024569223600460845\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.4787234042553192,\n \"acc_stderr\": 0.029800481645628693,\n \
\ \"acc_norm\": 0.4787234042553192,\n \"acc_norm_stderr\": 0.029800481645628693\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4667535853976532,\n\
\ \"acc_stderr\": 0.012741974333897227,\n \"acc_norm\": 0.4667535853976532,\n\
\ \"acc_norm_stderr\": 0.012741974333897227\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.6948529411764706,\n \"acc_stderr\": 0.027971541370170598,\n\
\ \"acc_norm\": 0.6948529411764706,\n \"acc_norm_stderr\": 0.027971541370170598\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6584967320261438,\n \"acc_stderr\": 0.019184639328092487,\n \
\ \"acc_norm\": 0.6584967320261438,\n \"acc_norm_stderr\": 0.019184639328092487\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6545454545454545,\n\
\ \"acc_stderr\": 0.04554619617541054,\n \"acc_norm\": 0.6545454545454545,\n\
\ \"acc_norm_stderr\": 0.04554619617541054\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.7224489795918367,\n \"acc_stderr\": 0.028666857790274648,\n\
\ \"acc_norm\": 0.7224489795918367,\n \"acc_norm_stderr\": 0.028666857790274648\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8407960199004975,\n\
\ \"acc_stderr\": 0.02587064676616913,\n \"acc_norm\": 0.8407960199004975,\n\
\ \"acc_norm_stderr\": 0.02587064676616913\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.88,\n \"acc_stderr\": 0.03265986323710906,\n \
\ \"acc_norm\": 0.88,\n \"acc_norm_stderr\": 0.03265986323710906\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5421686746987951,\n\
\ \"acc_stderr\": 0.0387862677100236,\n \"acc_norm\": 0.5421686746987951,\n\
\ \"acc_norm_stderr\": 0.0387862677100236\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8128654970760234,\n \"acc_stderr\": 0.02991312723236804,\n\
\ \"acc_norm\": 0.8128654970760234,\n \"acc_norm_stderr\": 0.02991312723236804\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.4455324357405141,\n\
\ \"mc1_stderr\": 0.017399335280140357,\n \"mc2\": 0.6120244185533295,\n\
\ \"mc2_stderr\": 0.015634404794374702\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.8074191002367798,\n \"acc_stderr\": 0.011082538847491908\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.6762699014404853,\n \
\ \"acc_stderr\": 0.012888247397371143\n }\n}\n```"
repo_url: https://huggingface.co/Azazelle/Yuna-7b-Merge
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_01_06T01_45_22.523771
path:
- '**/details_harness|arc:challenge|25_2024-01-06T01-45-22.523771.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-01-06T01-45-22.523771.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_01_06T01_45_22.523771
path:
- '**/details_harness|gsm8k|5_2024-01-06T01-45-22.523771.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-01-06T01-45-22.523771.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_01_06T01_45_22.523771
path:
- '**/details_harness|hellaswag|10_2024-01-06T01-45-22.523771.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-01-06T01-45-22.523771.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_01_06T01_45_22.523771
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-06T01-45-22.523771.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-06T01-45-22.523771.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-06T01-45-22.523771.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-06T01-45-22.523771.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-06T01-45-22.523771.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-06T01-45-22.523771.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-06T01-45-22.523771.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-06T01-45-22.523771.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-06T01-45-22.523771.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-06T01-45-22.523771.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-06T01-45-22.523771.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-06T01-45-22.523771.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-06T01-45-22.523771.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-06T01-45-22.523771.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-06T01-45-22.523771.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-06T01-45-22.523771.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-06T01-45-22.523771.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-06T01-45-22.523771.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-06T01-45-22.523771.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-06T01-45-22.523771.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-06T01-45-22.523771.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-06T01-45-22.523771.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-06T01-45-22.523771.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-06T01-45-22.523771.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-06T01-45-22.523771.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-06T01-45-22.523771.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-06T01-45-22.523771.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-06T01-45-22.523771.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-06T01-45-22.523771.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-06T01-45-22.523771.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-06T01-45-22.523771.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-06T01-45-22.523771.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-06T01-45-22.523771.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-06T01-45-22.523771.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-01-06T01-45-22.523771.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-06T01-45-22.523771.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-06T01-45-22.523771.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-06T01-45-22.523771.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-01-06T01-45-22.523771.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-01-06T01-45-22.523771.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-06T01-45-22.523771.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-06T01-45-22.523771.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-06T01-45-22.523771.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-06T01-45-22.523771.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-06T01-45-22.523771.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-06T01-45-22.523771.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-06T01-45-22.523771.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-06T01-45-22.523771.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-06T01-45-22.523771.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-06T01-45-22.523771.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-06T01-45-22.523771.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-06T01-45-22.523771.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-06T01-45-22.523771.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-01-06T01-45-22.523771.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-06T01-45-22.523771.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-01-06T01-45-22.523771.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-06T01-45-22.523771.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-06T01-45-22.523771.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-06T01-45-22.523771.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-06T01-45-22.523771.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-06T01-45-22.523771.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-06T01-45-22.523771.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-06T01-45-22.523771.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-06T01-45-22.523771.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-06T01-45-22.523771.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-06T01-45-22.523771.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-06T01-45-22.523771.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-06T01-45-22.523771.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-06T01-45-22.523771.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-06T01-45-22.523771.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-06T01-45-22.523771.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-06T01-45-22.523771.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-06T01-45-22.523771.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-06T01-45-22.523771.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-06T01-45-22.523771.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-06T01-45-22.523771.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-06T01-45-22.523771.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-06T01-45-22.523771.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-06T01-45-22.523771.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-06T01-45-22.523771.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-06T01-45-22.523771.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-06T01-45-22.523771.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-06T01-45-22.523771.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-06T01-45-22.523771.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-06T01-45-22.523771.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-06T01-45-22.523771.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-06T01-45-22.523771.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-06T01-45-22.523771.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-06T01-45-22.523771.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-06T01-45-22.523771.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-06T01-45-22.523771.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-01-06T01-45-22.523771.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-06T01-45-22.523771.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-06T01-45-22.523771.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-06T01-45-22.523771.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-01-06T01-45-22.523771.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-01-06T01-45-22.523771.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-06T01-45-22.523771.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-06T01-45-22.523771.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-06T01-45-22.523771.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-06T01-45-22.523771.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-06T01-45-22.523771.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-06T01-45-22.523771.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-06T01-45-22.523771.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-06T01-45-22.523771.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-06T01-45-22.523771.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-06T01-45-22.523771.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-06T01-45-22.523771.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-06T01-45-22.523771.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-06T01-45-22.523771.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-01-06T01-45-22.523771.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-06T01-45-22.523771.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-01-06T01-45-22.523771.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-06T01-45-22.523771.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_01_06T01_45_22.523771
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-06T01-45-22.523771.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-06T01-45-22.523771.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_01_06T01_45_22.523771
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-06T01-45-22.523771.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-06T01-45-22.523771.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_01_06T01_45_22.523771
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-06T01-45-22.523771.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-06T01-45-22.523771.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_01_06T01_45_22.523771
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-06T01-45-22.523771.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-06T01-45-22.523771.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_01_06T01_45_22.523771
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-06T01-45-22.523771.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-06T01-45-22.523771.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_01_06T01_45_22.523771
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-06T01-45-22.523771.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-06T01-45-22.523771.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_01_06T01_45_22.523771
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-06T01-45-22.523771.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-06T01-45-22.523771.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_01_06T01_45_22.523771
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-06T01-45-22.523771.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-06T01-45-22.523771.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_01_06T01_45_22.523771
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-06T01-45-22.523771.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-06T01-45-22.523771.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_01_06T01_45_22.523771
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-06T01-45-22.523771.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-06T01-45-22.523771.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_01_06T01_45_22.523771
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-06T01-45-22.523771.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-06T01-45-22.523771.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_01_06T01_45_22.523771
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-06T01-45-22.523771.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-06T01-45-22.523771.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_01_06T01_45_22.523771
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-06T01-45-22.523771.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-06T01-45-22.523771.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_01_06T01_45_22.523771
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-06T01-45-22.523771.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-06T01-45-22.523771.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_01_06T01_45_22.523771
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-06T01-45-22.523771.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-06T01-45-22.523771.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_01_06T01_45_22.523771
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-06T01-45-22.523771.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-06T01-45-22.523771.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_01_06T01_45_22.523771
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-06T01-45-22.523771.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-06T01-45-22.523771.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_01_06T01_45_22.523771
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-06T01-45-22.523771.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-06T01-45-22.523771.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_01_06T01_45_22.523771
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-06T01-45-22.523771.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-06T01-45-22.523771.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_01_06T01_45_22.523771
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-06T01-45-22.523771.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-06T01-45-22.523771.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_01_06T01_45_22.523771
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-06T01-45-22.523771.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-06T01-45-22.523771.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_01_06T01_45_22.523771
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-06T01-45-22.523771.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-06T01-45-22.523771.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_01_06T01_45_22.523771
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-06T01-45-22.523771.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-06T01-45-22.523771.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_01_06T01_45_22.523771
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-06T01-45-22.523771.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-06T01-45-22.523771.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_01_06T01_45_22.523771
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-06T01-45-22.523771.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-06T01-45-22.523771.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_01_06T01_45_22.523771
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-06T01-45-22.523771.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-06T01-45-22.523771.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_01_06T01_45_22.523771
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-06T01-45-22.523771.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-06T01-45-22.523771.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_01_06T01_45_22.523771
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-06T01-45-22.523771.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-06T01-45-22.523771.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_01_06T01_45_22.523771
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-06T01-45-22.523771.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-06T01-45-22.523771.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_01_06T01_45_22.523771
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-06T01-45-22.523771.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-06T01-45-22.523771.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_01_06T01_45_22.523771
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-06T01-45-22.523771.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-06T01-45-22.523771.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_01_06T01_45_22.523771
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-06T01-45-22.523771.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-06T01-45-22.523771.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_01_06T01_45_22.523771
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-06T01-45-22.523771.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-06T01-45-22.523771.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_01_06T01_45_22.523771
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-06T01-45-22.523771.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-06T01-45-22.523771.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_01_06T01_45_22.523771
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-01-06T01-45-22.523771.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-01-06T01-45-22.523771.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_01_06T01_45_22.523771
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-06T01-45-22.523771.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-06T01-45-22.523771.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_01_06T01_45_22.523771
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-06T01-45-22.523771.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-06T01-45-22.523771.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_01_06T01_45_22.523771
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-06T01-45-22.523771.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-06T01-45-22.523771.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_01_06T01_45_22.523771
path:
- '**/details_harness|hendrycksTest-management|5_2024-01-06T01-45-22.523771.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-01-06T01-45-22.523771.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_01_06T01_45_22.523771
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-01-06T01-45-22.523771.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-01-06T01-45-22.523771.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_01_06T01_45_22.523771
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-06T01-45-22.523771.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-06T01-45-22.523771.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_01_06T01_45_22.523771
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-06T01-45-22.523771.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-06T01-45-22.523771.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_01_06T01_45_22.523771
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-06T01-45-22.523771.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-06T01-45-22.523771.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_01_06T01_45_22.523771
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-06T01-45-22.523771.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-06T01-45-22.523771.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_01_06T01_45_22.523771
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-06T01-45-22.523771.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-06T01-45-22.523771.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_01_06T01_45_22.523771
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-06T01-45-22.523771.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-06T01-45-22.523771.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_01_06T01_45_22.523771
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-06T01-45-22.523771.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-06T01-45-22.523771.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_01_06T01_45_22.523771
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-06T01-45-22.523771.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-06T01-45-22.523771.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_01_06T01_45_22.523771
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-06T01-45-22.523771.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-06T01-45-22.523771.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_01_06T01_45_22.523771
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-06T01-45-22.523771.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-06T01-45-22.523771.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_01_06T01_45_22.523771
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-06T01-45-22.523771.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-06T01-45-22.523771.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_01_06T01_45_22.523771
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-06T01-45-22.523771.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-06T01-45-22.523771.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_01_06T01_45_22.523771
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-06T01-45-22.523771.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-06T01-45-22.523771.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_01_06T01_45_22.523771
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-01-06T01-45-22.523771.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-01-06T01-45-22.523771.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_01_06T01_45_22.523771
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-06T01-45-22.523771.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-06T01-45-22.523771.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_01_06T01_45_22.523771
path:
- '**/details_harness|hendrycksTest-virology|5_2024-01-06T01-45-22.523771.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-01-06T01-45-22.523771.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_01_06T01_45_22.523771
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-06T01-45-22.523771.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-06T01-45-22.523771.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_01_06T01_45_22.523771
path:
- '**/details_harness|truthfulqa:mc|0_2024-01-06T01-45-22.523771.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-01-06T01-45-22.523771.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_01_06T01_45_22.523771
path:
- '**/details_harness|winogrande|5_2024-01-06T01-45-22.523771.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-01-06T01-45-22.523771.parquet'
- config_name: results
data_files:
- split: 2024_01_06T01_45_22.523771
path:
- results_2024-01-06T01-45-22.523771.parquet
- split: latest
path:
- results_2024-01-06T01-45-22.523771.parquet
---
# Dataset Card for Evaluation run of Azazelle/Yuna-7b-Merge
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [Azazelle/Yuna-7b-Merge](https://huggingface.co/Azazelle/Yuna-7b-Merge) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_Azazelle__Yuna-7b-Merge",
"harness_winogrande_5",
split="train")
```
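Because the timestamped split names are zero-padded, their lexicographic order matches chronological order, so the most recent run can also be resolved by sorting the split names directly. A minimal sketch (the split names below are illustrative examples of the naming scheme, not an exhaustive listing):

```python
# Zero-padded timestamp split names sort chronologically, so the
# newest run is simply the lexicographic maximum of the names.
# The split names below are illustrative examples only.
splits = [
    "2024_01_05T22_10_00.000000",
    "2024_01_06T01_45_22.523771",
]
latest_split = max(splits)
print(latest_split)
```

This is why the dataset can expose a stable `latest` alias alongside the per-run timestamped splits.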
## Latest results
These are the [latest results from run 2024-01-06T01:45:22.523771](https://huggingface.co/datasets/open-llm-leaderboard/details_Azazelle__Yuna-7b-Merge/blob/main/results_2024-01-06T01-45-22.523771.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each one in the results and in the "latest" split of each eval):
```python
{
"all": {
"acc": 0.6522995558970252,
"acc_stderr": 0.03212085691453914,
"acc_norm": 0.6527565578888866,
"acc_norm_stderr": 0.03277773803115889,
"mc1": 0.4455324357405141,
"mc1_stderr": 0.017399335280140357,
"mc2": 0.6120244185533295,
"mc2_stderr": 0.015634404794374702
},
"harness|arc:challenge|25": {
"acc": 0.6476109215017065,
"acc_stderr": 0.013960142600598672,
"acc_norm": 0.6749146757679181,
"acc_norm_stderr": 0.013688147309729129
},
"harness|hellaswag|10": {
"acc": 0.6896036646086438,
"acc_stderr": 0.004617103280372032,
"acc_norm": 0.8683529177454691,
"acc_norm_stderr": 0.0033741568675916727
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.38,
"acc_stderr": 0.04878317312145632,
"acc_norm": 0.38,
"acc_norm_stderr": 0.04878317312145632
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6296296296296297,
"acc_stderr": 0.041716541613545426,
"acc_norm": 0.6296296296296297,
"acc_norm_stderr": 0.041716541613545426
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6644736842105263,
"acc_stderr": 0.038424985593952694,
"acc_norm": 0.6644736842105263,
"acc_norm_stderr": 0.038424985593952694
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.6,
"acc_stderr": 0.04923659639173309,
"acc_norm": 0.6,
"acc_norm_stderr": 0.04923659639173309
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.7132075471698113,
"acc_stderr": 0.027834912527544067,
"acc_norm": 0.7132075471698113,
"acc_norm_stderr": 0.027834912527544067
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.75,
"acc_stderr": 0.03621034121889507,
"acc_norm": 0.75,
"acc_norm_stderr": 0.03621034121889507
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.48,
"acc_stderr": 0.050211673156867795,
"acc_norm": 0.48,
"acc_norm_stderr": 0.050211673156867795
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.51,
"acc_stderr": 0.05024183937956911,
"acc_norm": 0.51,
"acc_norm_stderr": 0.05024183937956911
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.32,
"acc_stderr": 0.04688261722621504,
"acc_norm": 0.32,
"acc_norm_stderr": 0.04688261722621504
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6647398843930635,
"acc_stderr": 0.03599586301247078,
"acc_norm": 0.6647398843930635,
"acc_norm_stderr": 0.03599586301247078
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.43137254901960786,
"acc_stderr": 0.04928099597287534,
"acc_norm": 0.43137254901960786,
"acc_norm_stderr": 0.04928099597287534
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.73,
"acc_stderr": 0.044619604333847394,
"acc_norm": 0.73,
"acc_norm_stderr": 0.044619604333847394
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5914893617021276,
"acc_stderr": 0.032134180267015755,
"acc_norm": 0.5914893617021276,
"acc_norm_stderr": 0.032134180267015755
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.5,
"acc_stderr": 0.047036043419179864,
"acc_norm": 0.5,
"acc_norm_stderr": 0.047036043419179864
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5448275862068965,
"acc_stderr": 0.04149886942192117,
"acc_norm": 0.5448275862068965,
"acc_norm_stderr": 0.04149886942192117
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.3941798941798942,
"acc_stderr": 0.02516798233389414,
"acc_norm": 0.3941798941798942,
"acc_norm_stderr": 0.02516798233389414
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.49206349206349204,
"acc_stderr": 0.044715725362943486,
"acc_norm": 0.49206349206349204,
"acc_norm_stderr": 0.044715725362943486
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.33,
"acc_stderr": 0.047258156262526045,
"acc_norm": 0.33,
"acc_norm_stderr": 0.047258156262526045
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7967741935483871,
"acc_stderr": 0.022891687984554963,
"acc_norm": 0.7967741935483871,
"acc_norm_stderr": 0.022891687984554963
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5024630541871922,
"acc_stderr": 0.035179450386910616,
"acc_norm": 0.5024630541871922,
"acc_norm_stderr": 0.035179450386910616
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.7,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.7,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7636363636363637,
"acc_stderr": 0.03317505930009181,
"acc_norm": 0.7636363636363637,
"acc_norm_stderr": 0.03317505930009181
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7929292929292929,
"acc_stderr": 0.028869778460267042,
"acc_norm": 0.7929292929292929,
"acc_norm_stderr": 0.028869778460267042
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8963730569948186,
"acc_stderr": 0.021995311963644237,
"acc_norm": 0.8963730569948186,
"acc_norm_stderr": 0.021995311963644237
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6666666666666666,
"acc_stderr": 0.023901157979402534,
"acc_norm": 0.6666666666666666,
"acc_norm_stderr": 0.023901157979402534
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.3592592592592593,
"acc_stderr": 0.029252905927251972,
"acc_norm": 0.3592592592592593,
"acc_norm_stderr": 0.029252905927251972
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.7142857142857143,
"acc_stderr": 0.02934457250063434,
"acc_norm": 0.7142857142857143,
"acc_norm_stderr": 0.02934457250063434
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.33112582781456956,
"acc_stderr": 0.038425817186598696,
"acc_norm": 0.33112582781456956,
"acc_norm_stderr": 0.038425817186598696
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8440366972477065,
"acc_stderr": 0.01555580271359017,
"acc_norm": 0.8440366972477065,
"acc_norm_stderr": 0.01555580271359017
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5555555555555556,
"acc_stderr": 0.03388857118502325,
"acc_norm": 0.5555555555555556,
"acc_norm_stderr": 0.03388857118502325
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8480392156862745,
"acc_stderr": 0.0251956584289318,
"acc_norm": 0.8480392156862745,
"acc_norm_stderr": 0.0251956584289318
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.8185654008438819,
"acc_stderr": 0.025085961144579654,
"acc_norm": 0.8185654008438819,
"acc_norm_stderr": 0.025085961144579654
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.7085201793721974,
"acc_stderr": 0.03050028317654585,
"acc_norm": 0.7085201793721974,
"acc_norm_stderr": 0.03050028317654585
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7786259541984732,
"acc_stderr": 0.036412970813137276,
"acc_norm": 0.7786259541984732,
"acc_norm_stderr": 0.036412970813137276
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7768595041322314,
"acc_stderr": 0.03800754475228732,
"acc_norm": 0.7768595041322314,
"acc_norm_stderr": 0.03800754475228732
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7962962962962963,
"acc_stderr": 0.03893542518824847,
"acc_norm": 0.7962962962962963,
"acc_norm_stderr": 0.03893542518824847
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7852760736196319,
"acc_stderr": 0.032262193772867744,
"acc_norm": 0.7852760736196319,
"acc_norm_stderr": 0.032262193772867744
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.48214285714285715,
"acc_stderr": 0.047427623612430116,
"acc_norm": 0.48214285714285715,
"acc_norm_stderr": 0.047427623612430116
},
"harness|hendrycksTest-management|5": {
"acc": 0.8058252427184466,
"acc_stderr": 0.03916667762822584,
"acc_norm": 0.8058252427184466,
"acc_norm_stderr": 0.03916667762822584
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8760683760683761,
"acc_stderr": 0.021586494001281376,
"acc_norm": 0.8760683760683761,
"acc_norm_stderr": 0.021586494001281376
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.69,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.69,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8416347381864623,
"acc_stderr": 0.013055346753516729,
"acc_norm": 0.8416347381864623,
"acc_norm_stderr": 0.013055346753516729
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7485549132947977,
"acc_stderr": 0.02335736578587403,
"acc_norm": 0.7485549132947977,
"acc_norm_stderr": 0.02335736578587403
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.441340782122905,
"acc_stderr": 0.016607021781050876,
"acc_norm": 0.441340782122905,
"acc_norm_stderr": 0.016607021781050876
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.738562091503268,
"acc_stderr": 0.025160998214292456,
"acc_norm": 0.738562091503268,
"acc_norm_stderr": 0.025160998214292456
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6977491961414791,
"acc_stderr": 0.02608270069539966,
"acc_norm": 0.6977491961414791,
"acc_norm_stderr": 0.02608270069539966
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7345679012345679,
"acc_stderr": 0.024569223600460845,
"acc_norm": 0.7345679012345679,
"acc_norm_stderr": 0.024569223600460845
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.4787234042553192,
"acc_stderr": 0.029800481645628693,
"acc_norm": 0.4787234042553192,
"acc_norm_stderr": 0.029800481645628693
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4667535853976532,
"acc_stderr": 0.012741974333897227,
"acc_norm": 0.4667535853976532,
"acc_norm_stderr": 0.012741974333897227
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6948529411764706,
"acc_stderr": 0.027971541370170598,
"acc_norm": 0.6948529411764706,
"acc_norm_stderr": 0.027971541370170598
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6584967320261438,
"acc_stderr": 0.019184639328092487,
"acc_norm": 0.6584967320261438,
"acc_norm_stderr": 0.019184639328092487
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6545454545454545,
"acc_stderr": 0.04554619617541054,
"acc_norm": 0.6545454545454545,
"acc_norm_stderr": 0.04554619617541054
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7224489795918367,
"acc_stderr": 0.028666857790274648,
"acc_norm": 0.7224489795918367,
"acc_norm_stderr": 0.028666857790274648
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8407960199004975,
"acc_stderr": 0.02587064676616913,
"acc_norm": 0.8407960199004975,
"acc_norm_stderr": 0.02587064676616913
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.88,
"acc_stderr": 0.03265986323710906,
"acc_norm": 0.88,
"acc_norm_stderr": 0.03265986323710906
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5421686746987951,
"acc_stderr": 0.0387862677100236,
"acc_norm": 0.5421686746987951,
"acc_norm_stderr": 0.0387862677100236
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8128654970760234,
"acc_stderr": 0.02991312723236804,
"acc_norm": 0.8128654970760234,
"acc_norm_stderr": 0.02991312723236804
},
"harness|truthfulqa:mc|0": {
"mc1": 0.4455324357405141,
"mc1_stderr": 0.017399335280140357,
"mc2": 0.6120244185533295,
"mc2_stderr": 0.015634404794374702
},
"harness|winogrande|5": {
"acc": 0.8074191002367798,
"acc_stderr": 0.011082538847491908
},
"harness|gsm8k|5": {
"acc": 0.6762699014404853,
"acc_stderr": 0.012888247397371143
}
}
```
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
ai2lumos/lumos_complex_qa_ground_onetime | ---
license: apache-2.0
task_categories:
- text-generation
- question-answering
language:
- en
tags:
- language-agent
- reasoning
- question-answering
- grounding
size_categories:
- 10K<n<100K
---
# 🪄 Agent Lumos: Unified and Modular Training for Open-Source Language Agents
<p align="center">
🌐<a href="https://allenai.github.io/lumos">[Website]</a>
📝<a href="https://arxiv.org/abs/2311.05657">[Paper]</a>
🤗<a href="https://huggingface.co/datasets?sort=trending&search=ai2lumos">[Data]</a>
🤗<a href="https://huggingface.co/models?sort=trending&search=ai2lumos">[Model]</a>
🤗<a href="https://huggingface.co/spaces/ai2lumos/lumos_data_demo">[Demo]</a>
</p>
We introduce 🪄**Lumos**, Language Agents with **Unified** Formats, **Modular** Design, and **Open-Source** LLMs. **Lumos** unifies a suite of complex interactive tasks and achieves competitive performance with GPT-4/3.5-based and larger open-source agents.
**Lumos** has the following features:
* 🧩 **Modular Architecture**:
- 🧩 **Lumos** consists of planning, grounding, and execution modules built on LLAMA-2-7B/13B and off-the-shelf APIs.
- 🤗 **Lumos** utilizes a unified data format that encompasses multiple task types, thereby enabling the developed agent framework to conveniently support a range of interactive tasks.
* 🌍 **Diverse Training Data**:
- 🌍 **Lumos** is trained with ~56K diverse, high-quality subgoal/action annotations converted from ground-truth reasoning steps in existing benchmarks using GPT-4.
- ⚒️ **Lumos** data can be instrumental for future research in developing open-source agents for complex interactive tasks.
* 🚀 **Competitive Performance**:
- 🚀 **Lumos** matches or even outperforms **GPT-series** agents on the web and complex QA tasks Mind2Web and HotpotQA, and **larger open agents** on math and multimodal tasks.
- 🚀 **Lumos** exceeds contemporaneous agents that have been **fine-tuned** with in-domain HotpotQA, Mind2Web, and ScienceQA annotations, such as **FiReAct**, **AgentLM**, and **AutoAct**.
- 🚀 **Lumos** performs better than open agent baseline formulations, including **chain-of-thoughts** and **integrated** training.
- 🚀 **Lumos** surpasses larger open LLM agents and domain-specific agents on the unseen tasks WebShop and InterCode_SQL.
## Data Overview
`lumos_complex_qa_ground_onetime` is the data for training **grounding** module on **complex QA** task in **Lumos-Onetime (Lumos-O)** formulation.
The sources of the training annotations are shown below:
| Datasets | Number |
|---|---|
|StrategyQA|1777|
|Musique|17632|
## Models Trained with the Data
`lumos_complex_qa_ground_onetime` is used to train the following models.
|Model|Huggingface Repo|
|---|---|
|`lumos_complex_qa_ground_onetime`| [🤗Huggingface Repo](https://huggingface.co/ai2lumos/lumos_complex_qa_ground_onetime) |
## Citation
If you find this work relevant to your research, please feel free to cite it!
```
@article{yin2023lumos,
title={Agent Lumos: Unified and Modular Training for Open-Source Language Agents},
author={Yin, Da and Brahman, Faeze and Ravichander, Abhilasha and Chandu, Khyathi and Chang, Kai-Wei and Choi, Yejin and Lin, Bill Yuchen},
journal={arXiv preprint arXiv:2311.05657},
year={2023}
}
``` |
Oivalf23/Jhow | ---
license: openrail
---
|
alaeddinehamroun/invoices-donut-data-v1 | ---
dataset_info:
features:
- name: pixel_values
sequence:
sequence:
sequence: float32
- name: labels
sequence: int64
- name: target_sequence
dtype: string
splits:
- name: train
num_bytes: 2243764243.8
num_examples: 270
- name: test
num_bytes: 249307138.2
num_examples: 30
download_size: 176117453
dataset_size: 2493071382.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
---
|
assmosis/apassf | ---
license: creativeml-openrail-m
---
|
muhtasham/tj-corpus-poem | ---
license: apache-2.0
task_categories:
- text-generation
language:
- tg
--- |
lsb/poetaexmachina-recitations-milli-onegrams | ---
dataset_info:
features:
- name: audio
dtype: audio
- name: sentence
dtype: string
splits:
- name: train
num_bytes: 7916446.0
num_examples: 177
download_size: 6517487
dataset_size: 7916446.0
---
# Dataset Card for "poetaexmachina-recitations-milli-onegrams"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
open-llm-leaderboard/details_jisukim8873__mistral-7B-alpaca-case-2-2 | ---
pretty_name: Evaluation run of jisukim8873/mistral-7B-alpaca-case-2-2
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [jisukim8873/mistral-7B-alpaca-case-2-2](https://huggingface.co/jisukim8873/mistral-7B-alpaca-case-2-2)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_jisukim8873__mistral-7B-alpaca-case-2-2\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-04-02T16:05:16.716909](https://huggingface.co/datasets/open-llm-leaderboard/details_jisukim8873__mistral-7B-alpaca-case-2-2/blob/main/results_2024-04-02T16-05-16.716909.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6175614052653129,\n\
\ \"acc_stderr\": 0.03279810871327092,\n \"acc_norm\": 0.6249501315552384,\n\
\ \"acc_norm_stderr\": 0.03348050134869184,\n \"mc1\": 0.2802937576499388,\n\
\ \"mc1_stderr\": 0.01572313952460876,\n \"mc2\": 0.451692394438764,\n\
\ \"mc2_stderr\": 0.014487915528150938\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.5870307167235495,\n \"acc_stderr\": 0.014388344935398326,\n\
\ \"acc_norm\": 0.6348122866894198,\n \"acc_norm_stderr\": 0.014070265519268802\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6362278430591516,\n\
\ \"acc_stderr\": 0.00480100965769044,\n \"acc_norm\": 0.8327026488747261,\n\
\ \"acc_norm_stderr\": 0.0037247833892533225\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.33,\n \"acc_stderr\": 0.04725815626252606,\n \
\ \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.04725815626252606\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.5777777777777777,\n\
\ \"acc_stderr\": 0.04266763404099583,\n \"acc_norm\": 0.5777777777777777,\n\
\ \"acc_norm_stderr\": 0.04266763404099583\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.631578947368421,\n \"acc_stderr\": 0.03925523381052932,\n\
\ \"acc_norm\": 0.631578947368421,\n \"acc_norm_stderr\": 0.03925523381052932\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.55,\n\
\ \"acc_stderr\": 0.05,\n \"acc_norm\": 0.55,\n \"acc_norm_stderr\"\
: 0.05\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"\
acc\": 0.6754716981132075,\n \"acc_stderr\": 0.028815615713432115,\n \
\ \"acc_norm\": 0.6754716981132075,\n \"acc_norm_stderr\": 0.028815615713432115\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.6736111111111112,\n\
\ \"acc_stderr\": 0.03921067198982266,\n \"acc_norm\": 0.6736111111111112,\n\
\ \"acc_norm_stderr\": 0.03921067198982266\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.46,\n \"acc_stderr\": 0.05009082659620333,\n \
\ \"acc_norm\": 0.46,\n \"acc_norm_stderr\": 0.05009082659620333\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.53,\n \"acc_stderr\": 0.05016135580465919,\n \"acc_norm\": 0.53,\n\
\ \"acc_norm_stderr\": 0.05016135580465919\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.39,\n \"acc_stderr\": 0.04902071300001974,\n \
\ \"acc_norm\": 0.39,\n \"acc_norm_stderr\": 0.04902071300001974\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6127167630057804,\n\
\ \"acc_stderr\": 0.03714325906302065,\n \"acc_norm\": 0.6127167630057804,\n\
\ \"acc_norm_stderr\": 0.03714325906302065\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.3627450980392157,\n \"acc_stderr\": 0.047840607041056527,\n\
\ \"acc_norm\": 0.3627450980392157,\n \"acc_norm_stderr\": 0.047840607041056527\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.76,\n \"acc_stderr\": 0.042923469599092816,\n \"acc_norm\": 0.76,\n\
\ \"acc_norm_stderr\": 0.042923469599092816\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.5446808510638298,\n \"acc_stderr\": 0.03255525359340355,\n\
\ \"acc_norm\": 0.5446808510638298,\n \"acc_norm_stderr\": 0.03255525359340355\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.45614035087719296,\n\
\ \"acc_stderr\": 0.04685473041907789,\n \"acc_norm\": 0.45614035087719296,\n\
\ \"acc_norm_stderr\": 0.04685473041907789\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5379310344827586,\n \"acc_stderr\": 0.04154659671707548,\n\
\ \"acc_norm\": 0.5379310344827586,\n \"acc_norm_stderr\": 0.04154659671707548\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.42328042328042326,\n \"acc_stderr\": 0.02544636563440679,\n \"\
acc_norm\": 0.42328042328042326,\n \"acc_norm_stderr\": 0.02544636563440679\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.38095238095238093,\n\
\ \"acc_stderr\": 0.04343525428949098,\n \"acc_norm\": 0.38095238095238093,\n\
\ \"acc_norm_stderr\": 0.04343525428949098\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.38,\n \"acc_stderr\": 0.04878317312145633,\n \
\ \"acc_norm\": 0.38,\n \"acc_norm_stderr\": 0.04878317312145633\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7193548387096774,\n\
\ \"acc_stderr\": 0.02556060472102289,\n \"acc_norm\": 0.7193548387096774,\n\
\ \"acc_norm_stderr\": 0.02556060472102289\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.49261083743842365,\n \"acc_stderr\": 0.035176035403610084,\n\
\ \"acc_norm\": 0.49261083743842365,\n \"acc_norm_stderr\": 0.035176035403610084\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.68,\n \"acc_stderr\": 0.04688261722621505,\n \"acc_norm\"\
: 0.68,\n \"acc_norm_stderr\": 0.04688261722621505\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7515151515151515,\n \"acc_stderr\": 0.033744026441394036,\n\
\ \"acc_norm\": 0.7515151515151515,\n \"acc_norm_stderr\": 0.033744026441394036\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.7929292929292929,\n \"acc_stderr\": 0.028869778460267042,\n \"\
acc_norm\": 0.7929292929292929,\n \"acc_norm_stderr\": 0.028869778460267042\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.8497409326424871,\n \"acc_stderr\": 0.025787723180723872,\n\
\ \"acc_norm\": 0.8497409326424871,\n \"acc_norm_stderr\": 0.025787723180723872\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.6307692307692307,\n \"acc_stderr\": 0.024468615241478926,\n\
\ \"acc_norm\": 0.6307692307692307,\n \"acc_norm_stderr\": 0.024468615241478926\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.32592592592592595,\n \"acc_stderr\": 0.028578348365473072,\n \
\ \"acc_norm\": 0.32592592592592595,\n \"acc_norm_stderr\": 0.028578348365473072\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.6428571428571429,\n \"acc_stderr\": 0.031124619309328177,\n\
\ \"acc_norm\": 0.6428571428571429,\n \"acc_norm_stderr\": 0.031124619309328177\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.2980132450331126,\n \"acc_stderr\": 0.037345356767871984,\n \"\
acc_norm\": 0.2980132450331126,\n \"acc_norm_stderr\": 0.037345356767871984\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8018348623853211,\n \"acc_stderr\": 0.0170905738042179,\n \"acc_norm\"\
: 0.8018348623853211,\n \"acc_norm_stderr\": 0.0170905738042179\n },\n\
\ \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.49074074074074076,\n\
\ \"acc_stderr\": 0.034093869469927006,\n \"acc_norm\": 0.49074074074074076,\n\
\ \"acc_norm_stderr\": 0.034093869469927006\n },\n \"harness|hendrycksTest-high_school_us_history|5\"\
: {\n \"acc\": 0.7843137254901961,\n \"acc_stderr\": 0.028867431449849316,\n\
\ \"acc_norm\": 0.7843137254901961,\n \"acc_norm_stderr\": 0.028867431449849316\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.7383966244725738,\n \"acc_stderr\": 0.028609516716994927,\n \
\ \"acc_norm\": 0.7383966244725738,\n \"acc_norm_stderr\": 0.028609516716994927\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.672645739910314,\n\
\ \"acc_stderr\": 0.03149384670994131,\n \"acc_norm\": 0.672645739910314,\n\
\ \"acc_norm_stderr\": 0.03149384670994131\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.7557251908396947,\n \"acc_stderr\": 0.037683359597287434,\n\
\ \"acc_norm\": 0.7557251908396947,\n \"acc_norm_stderr\": 0.037683359597287434\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.743801652892562,\n \"acc_stderr\": 0.03984979653302872,\n \"acc_norm\"\
: 0.743801652892562,\n \"acc_norm_stderr\": 0.03984979653302872\n },\n\
\ \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7314814814814815,\n\
\ \"acc_stderr\": 0.042844679680521934,\n \"acc_norm\": 0.7314814814814815,\n\
\ \"acc_norm_stderr\": 0.042844679680521934\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.754601226993865,\n \"acc_stderr\": 0.03380939813943354,\n\
\ \"acc_norm\": 0.754601226993865,\n \"acc_norm_stderr\": 0.03380939813943354\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.4642857142857143,\n\
\ \"acc_stderr\": 0.04733667890053756,\n \"acc_norm\": 0.4642857142857143,\n\
\ \"acc_norm_stderr\": 0.04733667890053756\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7572815533980582,\n \"acc_stderr\": 0.04245022486384495,\n\
\ \"acc_norm\": 0.7572815533980582,\n \"acc_norm_stderr\": 0.04245022486384495\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8675213675213675,\n\
\ \"acc_stderr\": 0.022209309073165612,\n \"acc_norm\": 0.8675213675213675,\n\
\ \"acc_norm_stderr\": 0.022209309073165612\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.74,\n \"acc_stderr\": 0.04408440022768078,\n \
\ \"acc_norm\": 0.74,\n \"acc_norm_stderr\": 0.04408440022768078\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.80970625798212,\n\
\ \"acc_stderr\": 0.014036945850381396,\n \"acc_norm\": 0.80970625798212,\n\
\ \"acc_norm_stderr\": 0.014036945850381396\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.6965317919075145,\n \"acc_stderr\": 0.02475241196091721,\n\
\ \"acc_norm\": 0.6965317919075145,\n \"acc_norm_stderr\": 0.02475241196091721\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.3575418994413408,\n\
\ \"acc_stderr\": 0.016029394474894886,\n \"acc_norm\": 0.3575418994413408,\n\
\ \"acc_norm_stderr\": 0.016029394474894886\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.7222222222222222,\n \"acc_stderr\": 0.025646863097137904,\n\
\ \"acc_norm\": 0.7222222222222222,\n \"acc_norm_stderr\": 0.025646863097137904\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6913183279742765,\n\
\ \"acc_stderr\": 0.02623696588115327,\n \"acc_norm\": 0.6913183279742765,\n\
\ \"acc_norm_stderr\": 0.02623696588115327\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.7314814814814815,\n \"acc_stderr\": 0.024659685185967284,\n\
\ \"acc_norm\": 0.7314814814814815,\n \"acc_norm_stderr\": 0.024659685185967284\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.450354609929078,\n \"acc_stderr\": 0.02968010556502904,\n \
\ \"acc_norm\": 0.450354609929078,\n \"acc_norm_stderr\": 0.02968010556502904\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4380704041720991,\n\
\ \"acc_stderr\": 0.012671902782567648,\n \"acc_norm\": 0.4380704041720991,\n\
\ \"acc_norm_stderr\": 0.012671902782567648\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.6470588235294118,\n \"acc_stderr\": 0.029029422815681404,\n\
\ \"acc_norm\": 0.6470588235294118,\n \"acc_norm_stderr\": 0.029029422815681404\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6617647058823529,\n \"acc_stderr\": 0.019139943748487032,\n \
\ \"acc_norm\": 0.6617647058823529,\n \"acc_norm_stderr\": 0.019139943748487032\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.7,\n\
\ \"acc_stderr\": 0.04389311454644287,\n \"acc_norm\": 0.7,\n \
\ \"acc_norm_stderr\": 0.04389311454644287\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.6979591836734694,\n \"acc_stderr\": 0.0293936093198798,\n\
\ \"acc_norm\": 0.6979591836734694,\n \"acc_norm_stderr\": 0.0293936093198798\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8507462686567164,\n\
\ \"acc_stderr\": 0.025196929874827075,\n \"acc_norm\": 0.8507462686567164,\n\
\ \"acc_norm_stderr\": 0.025196929874827075\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.82,\n \"acc_stderr\": 0.038612291966536955,\n \
\ \"acc_norm\": 0.82,\n \"acc_norm_stderr\": 0.038612291966536955\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.536144578313253,\n\
\ \"acc_stderr\": 0.038823108508905954,\n \"acc_norm\": 0.536144578313253,\n\
\ \"acc_norm_stderr\": 0.038823108508905954\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8304093567251462,\n \"acc_stderr\": 0.02878210810540171,\n\
\ \"acc_norm\": 0.8304093567251462,\n \"acc_norm_stderr\": 0.02878210810540171\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.2802937576499388,\n\
\ \"mc1_stderr\": 0.01572313952460876,\n \"mc2\": 0.451692394438764,\n\
\ \"mc2_stderr\": 0.014487915528150938\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7750591949486977,\n \"acc_stderr\": 0.011735043564126735\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.26838514025777105,\n \
\ \"acc_stderr\": 0.012205702688013674\n }\n}\n```"
repo_url: https://huggingface.co/jisukim8873/mistral-7B-alpaca-case-2-2
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_04_02T16_05_16.716909
path:
- '**/details_harness|arc:challenge|25_2024-04-02T16-05-16.716909.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-04-02T16-05-16.716909.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_04_02T16_05_16.716909
path:
- '**/details_harness|gsm8k|5_2024-04-02T16-05-16.716909.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-04-02T16-05-16.716909.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_04_02T16_05_16.716909
path:
- '**/details_harness|hellaswag|10_2024-04-02T16-05-16.716909.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-04-02T16-05-16.716909.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_04_02T16_05_16.716909
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-02T16-05-16.716909.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-04-02T16-05-16.716909.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-04-02T16-05-16.716909.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-04-02T16-05-16.716909.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-02T16-05-16.716909.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-04-02T16-05-16.716909.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-02T16-05-16.716909.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-02T16-05-16.716909.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-02T16-05-16.716909.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-04-02T16-05-16.716909.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-04-02T16-05-16.716909.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-04-02T16-05-16.716909.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-02T16-05-16.716909.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-04-02T16-05-16.716909.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-02T16-05-16.716909.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-02T16-05-16.716909.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-04-02T16-05-16.716909.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-04-02T16-05-16.716909.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-02T16-05-16.716909.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-02T16-05-16.716909.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-02T16-05-16.716909.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-02T16-05-16.716909.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-02T16-05-16.716909.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-02T16-05-16.716909.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-02T16-05-16.716909.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-02T16-05-16.716909.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-02T16-05-16.716909.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-02T16-05-16.716909.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-02T16-05-16.716909.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-02T16-05-16.716909.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-02T16-05-16.716909.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-02T16-05-16.716909.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-04-02T16-05-16.716909.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-02T16-05-16.716909.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-04-02T16-05-16.716909.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-02T16-05-16.716909.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-02T16-05-16.716909.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-04-02T16-05-16.716909.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-04-02T16-05-16.716909.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-04-02T16-05-16.716909.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-02T16-05-16.716909.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-02T16-05-16.716909.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-02T16-05-16.716909.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-02T16-05-16.716909.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-04-02T16-05-16.716909.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-04-02T16-05-16.716909.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-04-02T16-05-16.716909.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-02T16-05-16.716909.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-04-02T16-05-16.716909.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-02T16-05-16.716909.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-02T16-05-16.716909.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-04-02T16-05-16.716909.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-04-02T16-05-16.716909.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-04-02T16-05-16.716909.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-02T16-05-16.716909.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-04-02T16-05-16.716909.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-04-02T16-05-16.716909.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-02T16-05-16.716909.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-04-02T16-05-16.716909.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-04-02T16-05-16.716909.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-04-02T16-05-16.716909.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-02T16-05-16.716909.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-04-02T16-05-16.716909.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-02T16-05-16.716909.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-02T16-05-16.716909.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-02T16-05-16.716909.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-04-02T16-05-16.716909.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-04-02T16-05-16.716909.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-04-02T16-05-16.716909.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-02T16-05-16.716909.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-04-02T16-05-16.716909.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-02T16-05-16.716909.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-02T16-05-16.716909.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-04-02T16-05-16.716909.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-04-02T16-05-16.716909.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-02T16-05-16.716909.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-02T16-05-16.716909.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-02T16-05-16.716909.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-02T16-05-16.716909.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-02T16-05-16.716909.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-02T16-05-16.716909.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-02T16-05-16.716909.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-02T16-05-16.716909.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-02T16-05-16.716909.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-02T16-05-16.716909.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-02T16-05-16.716909.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-02T16-05-16.716909.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-02T16-05-16.716909.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-02T16-05-16.716909.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-04-02T16-05-16.716909.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-02T16-05-16.716909.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-04-02T16-05-16.716909.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-02T16-05-16.716909.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-02T16-05-16.716909.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-04-02T16-05-16.716909.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-04-02T16-05-16.716909.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-04-02T16-05-16.716909.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-02T16-05-16.716909.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-02T16-05-16.716909.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-02T16-05-16.716909.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-02T16-05-16.716909.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-04-02T16-05-16.716909.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-04-02T16-05-16.716909.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-04-02T16-05-16.716909.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-02T16-05-16.716909.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-04-02T16-05-16.716909.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-02T16-05-16.716909.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-02T16-05-16.716909.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-04-02T16-05-16.716909.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-04-02T16-05-16.716909.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-04-02T16-05-16.716909.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-02T16-05-16.716909.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-04-02T16-05-16.716909.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-04-02T16-05-16.716909.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_04_02T16_05_16.716909
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-02T16-05-16.716909.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-02T16-05-16.716909.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_04_02T16_05_16.716909
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-04-02T16-05-16.716909.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-04-02T16-05-16.716909.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_04_02T16_05_16.716909
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-04-02T16-05-16.716909.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-04-02T16-05-16.716909.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_04_02T16_05_16.716909
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-04-02T16-05-16.716909.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-04-02T16-05-16.716909.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_04_02T16_05_16.716909
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-02T16-05-16.716909.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-02T16-05-16.716909.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_04_02T16_05_16.716909
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-04-02T16-05-16.716909.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-04-02T16-05-16.716909.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_04_02T16_05_16.716909
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-02T16-05-16.716909.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-02T16-05-16.716909.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_04_02T16_05_16.716909
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-02T16-05-16.716909.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-02T16-05-16.716909.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_04_02T16_05_16.716909
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-02T16-05-16.716909.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-02T16-05-16.716909.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_04_02T16_05_16.716909
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-04-02T16-05-16.716909.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-04-02T16-05-16.716909.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_04_02T16_05_16.716909
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-04-02T16-05-16.716909.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-04-02T16-05-16.716909.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_04_02T16_05_16.716909
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-04-02T16-05-16.716909.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-04-02T16-05-16.716909.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_04_02T16_05_16.716909
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-02T16-05-16.716909.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-02T16-05-16.716909.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_04_02T16_05_16.716909
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-04-02T16-05-16.716909.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-04-02T16-05-16.716909.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_04_02T16_05_16.716909
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-02T16-05-16.716909.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-02T16-05-16.716909.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_04_02T16_05_16.716909
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-02T16-05-16.716909.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-02T16-05-16.716909.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_04_02T16_05_16.716909
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-04-02T16-05-16.716909.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-04-02T16-05-16.716909.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_04_02T16_05_16.716909
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-04-02T16-05-16.716909.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-04-02T16-05-16.716909.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_04_02T16_05_16.716909
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-02T16-05-16.716909.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-02T16-05-16.716909.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_04_02T16_05_16.716909
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-02T16-05-16.716909.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-02T16-05-16.716909.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_04_02T16_05_16.716909
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-02T16-05-16.716909.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-02T16-05-16.716909.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_04_02T16_05_16.716909
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-02T16-05-16.716909.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-02T16-05-16.716909.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_04_02T16_05_16.716909
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-02T16-05-16.716909.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-02T16-05-16.716909.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_04_02T16_05_16.716909
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-02T16-05-16.716909.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-02T16-05-16.716909.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_04_02T16_05_16.716909
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-02T16-05-16.716909.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-02T16-05-16.716909.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_04_02T16_05_16.716909
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-02T16-05-16.716909.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-02T16-05-16.716909.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_04_02T16_05_16.716909
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-02T16-05-16.716909.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-02T16-05-16.716909.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_04_02T16_05_16.716909
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-02T16-05-16.716909.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-02T16-05-16.716909.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_04_02T16_05_16.716909
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-02T16-05-16.716909.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-02T16-05-16.716909.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_04_02T16_05_16.716909
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-02T16-05-16.716909.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-02T16-05-16.716909.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_04_02T16_05_16.716909
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-02T16-05-16.716909.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-02T16-05-16.716909.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_04_02T16_05_16.716909
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-02T16-05-16.716909.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-02T16-05-16.716909.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_04_02T16_05_16.716909
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-04-02T16-05-16.716909.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-04-02T16-05-16.716909.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_04_02T16_05_16.716909
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-02T16-05-16.716909.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-02T16-05-16.716909.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_04_02T16_05_16.716909
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-04-02T16-05-16.716909.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-04-02T16-05-16.716909.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_04_02T16_05_16.716909
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-02T16-05-16.716909.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-02T16-05-16.716909.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_04_02T16_05_16.716909
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-02T16-05-16.716909.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-02T16-05-16.716909.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_04_02T16_05_16.716909
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-04-02T16-05-16.716909.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-04-02T16-05-16.716909.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_04_02T16_05_16.716909
path:
- '**/details_harness|hendrycksTest-management|5_2024-04-02T16-05-16.716909.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-04-02T16-05-16.716909.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_04_02T16_05_16.716909
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-04-02T16-05-16.716909.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-04-02T16-05-16.716909.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_04_02T16_05_16.716909
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-02T16-05-16.716909.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-02T16-05-16.716909.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_04_02T16_05_16.716909
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-02T16-05-16.716909.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-02T16-05-16.716909.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_04_02T16_05_16.716909
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-02T16-05-16.716909.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-02T16-05-16.716909.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_04_02T16_05_16.716909
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-02T16-05-16.716909.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-02T16-05-16.716909.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_04_02T16_05_16.716909
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-04-02T16-05-16.716909.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-04-02T16-05-16.716909.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_04_02T16_05_16.716909
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-04-02T16-05-16.716909.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-04-02T16-05-16.716909.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_04_02T16_05_16.716909
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-04-02T16-05-16.716909.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-04-02T16-05-16.716909.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_04_02T16_05_16.716909
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-02T16-05-16.716909.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-02T16-05-16.716909.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_04_02T16_05_16.716909
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-04-02T16-05-16.716909.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-04-02T16-05-16.716909.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_04_02T16_05_16.716909
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-02T16-05-16.716909.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-02T16-05-16.716909.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_04_02T16_05_16.716909
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-02T16-05-16.716909.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-02T16-05-16.716909.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_04_02T16_05_16.716909
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-04-02T16-05-16.716909.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-04-02T16-05-16.716909.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_04_02T16_05_16.716909
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-04-02T16-05-16.716909.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-04-02T16-05-16.716909.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_04_02T16_05_16.716909
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-04-02T16-05-16.716909.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-04-02T16-05-16.716909.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_04_02T16_05_16.716909
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-02T16-05-16.716909.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-02T16-05-16.716909.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_04_02T16_05_16.716909
path:
- '**/details_harness|hendrycksTest-virology|5_2024-04-02T16-05-16.716909.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-04-02T16-05-16.716909.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_04_02T16_05_16.716909
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-04-02T16-05-16.716909.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-04-02T16-05-16.716909.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_04_02T16_05_16.716909
path:
- '**/details_harness|truthfulqa:mc|0_2024-04-02T16-05-16.716909.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-04-02T16-05-16.716909.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_04_02T16_05_16.716909
path:
- '**/details_harness|winogrande|5_2024-04-02T16-05-16.716909.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-04-02T16-05-16.716909.parquet'
- config_name: results
data_files:
- split: 2024_04_02T16_05_16.716909
path:
- results_2024-04-02T16-05-16.716909.parquet
- split: latest
path:
- results_2024-04-02T16-05-16.716909.parquet
---
# Dataset Card for Evaluation run of jisukim8873/mistral-7B-alpaca-case-2-2
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [jisukim8873/mistral-7B-alpaca-case-2-2](https://huggingface.co/jisukim8873/mistral-7B-alpaca-case-2-2) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_jisukim8873__mistral-7B-alpaca-case-2-2",
	"harness_winogrande_5",
	split="latest")
```
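Each configuration name in the YAML header follows a simple pattern derived from the harness task name and the few-shot count. As a purely illustrative sketch (the `config_name` helper below is not part of the dataset tooling, just an assumption about the naming convention visible in this card):

```python
def config_name(task: str, n_shot: int) -> str:
    """Derive the dataset config name for a harness task and few-shot count.

    Illustrative only: assumes the convention used in this card, where '-'
    and ':' in the task name become '_' and the shot count is appended.
    e.g. "hendrycksTest-anatomy" with 5 shots -> "harness_hendrycksTest_anatomy_5"
    """
    return "harness_" + task.replace("-", "_").replace(":", "_") + f"_{n_shot}"


print(config_name("hendrycksTest-anatomy", 5))  # harness_hendrycksTest_anatomy_5
print(config_name("truthfulqa:mc", 0))          # harness_truthfulqa_mc_0
```

Any of the resulting names can be passed as the second argument to `load_dataset`, as in the example above.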
## Latest results
These are the [latest results from run 2024-04-02T16:05:16.716909](https://huggingface.co/datasets/open-llm-leaderboard/details_jisukim8873__mistral-7B-alpaca-case-2-2/blob/main/results_2024-04-02T16-05-16.716909.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6175614052653129,
"acc_stderr": 0.03279810871327092,
"acc_norm": 0.6249501315552384,
"acc_norm_stderr": 0.03348050134869184,
"mc1": 0.2802937576499388,
"mc1_stderr": 0.01572313952460876,
"mc2": 0.451692394438764,
"mc2_stderr": 0.014487915528150938
},
"harness|arc:challenge|25": {
"acc": 0.5870307167235495,
"acc_stderr": 0.014388344935398326,
"acc_norm": 0.6348122866894198,
"acc_norm_stderr": 0.014070265519268802
},
"harness|hellaswag|10": {
"acc": 0.6362278430591516,
"acc_stderr": 0.00480100965769044,
"acc_norm": 0.8327026488747261,
"acc_norm_stderr": 0.0037247833892533225
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.33,
"acc_stderr": 0.04725815626252606,
"acc_norm": 0.33,
"acc_norm_stderr": 0.04725815626252606
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.5777777777777777,
"acc_stderr": 0.04266763404099583,
"acc_norm": 0.5777777777777777,
"acc_norm_stderr": 0.04266763404099583
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.631578947368421,
"acc_stderr": 0.03925523381052932,
"acc_norm": 0.631578947368421,
"acc_norm_stderr": 0.03925523381052932
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.55,
"acc_stderr": 0.05,
"acc_norm": 0.55,
"acc_norm_stderr": 0.05
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6754716981132075,
"acc_stderr": 0.028815615713432115,
"acc_norm": 0.6754716981132075,
"acc_norm_stderr": 0.028815615713432115
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.6736111111111112,
"acc_stderr": 0.03921067198982266,
"acc_norm": 0.6736111111111112,
"acc_norm_stderr": 0.03921067198982266
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.46,
"acc_stderr": 0.05009082659620333,
"acc_norm": 0.46,
"acc_norm_stderr": 0.05009082659620333
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.53,
"acc_stderr": 0.05016135580465919,
"acc_norm": 0.53,
"acc_norm_stderr": 0.05016135580465919
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.39,
"acc_stderr": 0.04902071300001974,
"acc_norm": 0.39,
"acc_norm_stderr": 0.04902071300001974
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6127167630057804,
"acc_stderr": 0.03714325906302065,
"acc_norm": 0.6127167630057804,
"acc_norm_stderr": 0.03714325906302065
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.3627450980392157,
"acc_stderr": 0.047840607041056527,
"acc_norm": 0.3627450980392157,
"acc_norm_stderr": 0.047840607041056527
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.76,
"acc_stderr": 0.042923469599092816,
"acc_norm": 0.76,
"acc_norm_stderr": 0.042923469599092816
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5446808510638298,
"acc_stderr": 0.03255525359340355,
"acc_norm": 0.5446808510638298,
"acc_norm_stderr": 0.03255525359340355
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.45614035087719296,
"acc_stderr": 0.04685473041907789,
"acc_norm": 0.45614035087719296,
"acc_norm_stderr": 0.04685473041907789
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5379310344827586,
"acc_stderr": 0.04154659671707548,
"acc_norm": 0.5379310344827586,
"acc_norm_stderr": 0.04154659671707548
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.42328042328042326,
"acc_stderr": 0.02544636563440679,
"acc_norm": 0.42328042328042326,
"acc_norm_stderr": 0.02544636563440679
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.38095238095238093,
"acc_stderr": 0.04343525428949098,
"acc_norm": 0.38095238095238093,
"acc_norm_stderr": 0.04343525428949098
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.38,
"acc_stderr": 0.04878317312145633,
"acc_norm": 0.38,
"acc_norm_stderr": 0.04878317312145633
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7193548387096774,
"acc_stderr": 0.02556060472102289,
"acc_norm": 0.7193548387096774,
"acc_norm_stderr": 0.02556060472102289
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.49261083743842365,
"acc_stderr": 0.035176035403610084,
"acc_norm": 0.49261083743842365,
"acc_norm_stderr": 0.035176035403610084
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.68,
"acc_stderr": 0.04688261722621505,
"acc_norm": 0.68,
"acc_norm_stderr": 0.04688261722621505
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7515151515151515,
"acc_stderr": 0.033744026441394036,
"acc_norm": 0.7515151515151515,
"acc_norm_stderr": 0.033744026441394036
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7929292929292929,
"acc_stderr": 0.028869778460267042,
"acc_norm": 0.7929292929292929,
"acc_norm_stderr": 0.028869778460267042
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8497409326424871,
"acc_stderr": 0.025787723180723872,
"acc_norm": 0.8497409326424871,
"acc_norm_stderr": 0.025787723180723872
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6307692307692307,
"acc_stderr": 0.024468615241478926,
"acc_norm": 0.6307692307692307,
"acc_norm_stderr": 0.024468615241478926
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.32592592592592595,
"acc_stderr": 0.028578348365473072,
"acc_norm": 0.32592592592592595,
"acc_norm_stderr": 0.028578348365473072
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6428571428571429,
"acc_stderr": 0.031124619309328177,
"acc_norm": 0.6428571428571429,
"acc_norm_stderr": 0.031124619309328177
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.2980132450331126,
"acc_stderr": 0.037345356767871984,
"acc_norm": 0.2980132450331126,
"acc_norm_stderr": 0.037345356767871984
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8018348623853211,
"acc_stderr": 0.0170905738042179,
"acc_norm": 0.8018348623853211,
"acc_norm_stderr": 0.0170905738042179
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.49074074074074076,
"acc_stderr": 0.034093869469927006,
"acc_norm": 0.49074074074074076,
"acc_norm_stderr": 0.034093869469927006
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7843137254901961,
"acc_stderr": 0.028867431449849316,
"acc_norm": 0.7843137254901961,
"acc_norm_stderr": 0.028867431449849316
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7383966244725738,
"acc_stderr": 0.028609516716994927,
"acc_norm": 0.7383966244725738,
"acc_norm_stderr": 0.028609516716994927
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.672645739910314,
"acc_stderr": 0.03149384670994131,
"acc_norm": 0.672645739910314,
"acc_norm_stderr": 0.03149384670994131
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7557251908396947,
"acc_stderr": 0.037683359597287434,
"acc_norm": 0.7557251908396947,
"acc_norm_stderr": 0.037683359597287434
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.743801652892562,
"acc_stderr": 0.03984979653302872,
"acc_norm": 0.743801652892562,
"acc_norm_stderr": 0.03984979653302872
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7314814814814815,
"acc_stderr": 0.042844679680521934,
"acc_norm": 0.7314814814814815,
"acc_norm_stderr": 0.042844679680521934
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.754601226993865,
"acc_stderr": 0.03380939813943354,
"acc_norm": 0.754601226993865,
"acc_norm_stderr": 0.03380939813943354
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.4642857142857143,
"acc_stderr": 0.04733667890053756,
"acc_norm": 0.4642857142857143,
"acc_norm_stderr": 0.04733667890053756
},
"harness|hendrycksTest-management|5": {
"acc": 0.7572815533980582,
"acc_stderr": 0.04245022486384495,
"acc_norm": 0.7572815533980582,
"acc_norm_stderr": 0.04245022486384495
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8675213675213675,
"acc_stderr": 0.022209309073165612,
"acc_norm": 0.8675213675213675,
"acc_norm_stderr": 0.022209309073165612
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.74,
"acc_stderr": 0.04408440022768078,
"acc_norm": 0.74,
"acc_norm_stderr": 0.04408440022768078
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.80970625798212,
"acc_stderr": 0.014036945850381396,
"acc_norm": 0.80970625798212,
"acc_norm_stderr": 0.014036945850381396
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.6965317919075145,
"acc_stderr": 0.02475241196091721,
"acc_norm": 0.6965317919075145,
"acc_norm_stderr": 0.02475241196091721
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.3575418994413408,
"acc_stderr": 0.016029394474894886,
"acc_norm": 0.3575418994413408,
"acc_norm_stderr": 0.016029394474894886
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7222222222222222,
"acc_stderr": 0.025646863097137904,
"acc_norm": 0.7222222222222222,
"acc_norm_stderr": 0.025646863097137904
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6913183279742765,
"acc_stderr": 0.02623696588115327,
"acc_norm": 0.6913183279742765,
"acc_norm_stderr": 0.02623696588115327
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7314814814814815,
"acc_stderr": 0.024659685185967284,
"acc_norm": 0.7314814814814815,
"acc_norm_stderr": 0.024659685185967284
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.450354609929078,
"acc_stderr": 0.02968010556502904,
"acc_norm": 0.450354609929078,
"acc_norm_stderr": 0.02968010556502904
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4380704041720991,
"acc_stderr": 0.012671902782567648,
"acc_norm": 0.4380704041720991,
"acc_norm_stderr": 0.012671902782567648
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6470588235294118,
"acc_stderr": 0.029029422815681404,
"acc_norm": 0.6470588235294118,
"acc_norm_stderr": 0.029029422815681404
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6617647058823529,
"acc_stderr": 0.019139943748487032,
"acc_norm": 0.6617647058823529,
"acc_norm_stderr": 0.019139943748487032
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.7,
"acc_stderr": 0.04389311454644287,
"acc_norm": 0.7,
"acc_norm_stderr": 0.04389311454644287
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.6979591836734694,
"acc_stderr": 0.0293936093198798,
"acc_norm": 0.6979591836734694,
"acc_norm_stderr": 0.0293936093198798
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8507462686567164,
"acc_stderr": 0.025196929874827075,
"acc_norm": 0.8507462686567164,
"acc_norm_stderr": 0.025196929874827075
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.82,
"acc_stderr": 0.038612291966536955,
"acc_norm": 0.82,
"acc_norm_stderr": 0.038612291966536955
},
"harness|hendrycksTest-virology|5": {
"acc": 0.536144578313253,
"acc_stderr": 0.038823108508905954,
"acc_norm": 0.536144578313253,
"acc_norm_stderr": 0.038823108508905954
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8304093567251462,
"acc_stderr": 0.02878210810540171,
"acc_norm": 0.8304093567251462,
"acc_norm_stderr": 0.02878210810540171
},
"harness|truthfulqa:mc|0": {
"mc1": 0.2802937576499388,
"mc1_stderr": 0.01572313952460876,
"mc2": 0.451692394438764,
"mc2_stderr": 0.014487915528150938
},
"harness|winogrande|5": {
"acc": 0.7750591949486977,
"acc_stderr": 0.011735043564126735
},
"harness|gsm8k|5": {
"acc": 0.26838514025777105,
"acc_stderr": 0.012205702688013674
}
}
```
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
2A2I/Arabic-OpenHermes-2.5 | ---
language:
- ar
license: apache-2.0
size_categories:
- 100K<n<1M
dataset_info:
features:
- name: title
dtype: string
- name: category
dtype: string
- name: system_prompt
dtype: string
- name: topic
dtype: string
- name: avatarUrl
dtype: string
- name: model
dtype: string
- name: hash
dtype: string
- name: skip_prompt_formatting
dtype: bool
- name: custom_instruction
dtype: bool
- name: idx
dtype: string
- name: language
dtype: string
- name: views
dtype: float64
- name: source
dtype: string
- name: model_name
dtype: string
- name: id
dtype: string
- name: user
dtype: string
- name: gpt
dtype: string
- name: conversations
dtype: string
splits:
- name: train
num_bytes: 3878191096
num_examples: 981618
download_size: 1685705250
dataset_size: 3878191096
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
tags:
- synthetic
- GPT-4
- Distillation
- Compilation
---
# Dataset Card for "Arabic-OpenHermes-2.5"
<img src="./Arabic-OpenHermes-2.5.png" width="350" alt="Original Dataset Card of Arabic-OpenHermes-2.5 by 2A2I">
### Dataset Sources & Infos
- **Data Origin**: Derived from the original OpenHermes dataset : [teknium/OpenHermes-2.5](https://huggingface.co/datasets/teknium/OpenHermes-2.5).
- **Languages**: Modern Standard Arabic (MSA)
- **Applications**: `Language Modeling`
- **Maintainer:** [Marwa El Kamil](https://huggingface.co/maghwa) & [Mohammed Machrouh](https://huggingface.co/medmac01)
- **License:** Apache-2.0
### Overview
`Arabic-OpenHermes-2.5` is a carefully curated dataset, extracted and translated into Arabic from the OpenHermes-2.5 collection provided by [teknium](https://huggingface.co/teknium).
### Purpose
`Arabic-OpenHermes-2.5` streamlines Arabic language research and applications by offering a high-quality, conversational-style text resource that supports better alignment of Arabic base LLMs, saving time and effort for researchers, technologists, and linguists working on Arabic NLP/AI projects.
- Enjoy using the Arabic-OpenHermes-2.5 dataset directly in your Arabic applications and research! 😀
### Usage
This dataset serves as an essential resource for Arabic language projects, from academic research to commercial applications. By providing a ready-made source of Arabic text, `Arabic-OpenHermes-2.5` lets users move directly to model `finetuning`, analysis, and application development, removing the initial burden of synthetic data creation.
#### Use with HuggingFace
To load this dataset with 🤗 Datasets, first install the library with `pip install datasets --upgrade`, then run the following code:
```python
from datasets import load_dataset
dataset = load_dataset("2A2I/Arabic-OpenHermes-2.5")
```
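Each row exposes `user` and `gpt` string columns (plus an optional `system_prompt`), per the feature list in the metadata above. As a minimal sketch of how a row might be turned into a chat-style training string, here is a hypothetical helper (the `to_chat` function and the `<|system|>`/`<|user|>`/`<|assistant|>` markers are illustrative, not part of the dataset):

```python
def to_chat(example, system_key="system_prompt"):
    """Format one dataset row as a simple chat transcript.

    Assumes the row carries `user` and `gpt` string fields, plus an
    optional system prompt, as declared in the dataset's feature list.
    The turn markers used here are illustrative placeholders.
    """
    parts = []
    if example.get(system_key):
        parts.append(f"<|system|>\n{example[system_key]}")
    parts.append(f"<|user|>\n{example['user']}")
    parts.append(f"<|assistant|>\n{example['gpt']}")
    return "\n".join(parts)

# Typical use with the hub dataset (streaming avoids a full ~1.7 GB download):
# from datasets import load_dataset
# ds = load_dataset("2A2I/Arabic-OpenHermes-2.5", split="train", streaming=True)
# print(to_chat(next(iter(ds))))
```

Adapt the markers to whatever chat template your target model expects.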
### Contribution and Collaborative Engagement
Find 'Arabic-OpenHermes-2.5' on the Hugging Face Hub at [2A2I/Arabic-OpenHermes-2.5](https://huggingface.co/datasets/2A2I/Arabic-OpenHermes-2.5), where community contributions are welcomed. Users are invited to share feedback and propose enhancements.
### Support and Collaborate
We are dedicated to cultivating an inclusive and encouraging space for Arabic AI and NLP research. For assistance, collaboration opportunities, or inquiries related to the dataset, please connect with us through the Hugging Face Hub's discussion section or contact us via [2A2I Contact Email](mailto:arabic.ai.initiative@gmail.com).
---
# Original Dataset Card of OpenHermes-2.5 by teknium
<img src="https://cdn-uploads.huggingface.co/production/uploads/64d5698102e58cc1fdd0b585/nWQ7oqq4fUSaGsvmNAsr2.png" width="350" alt="Original Dataset Card of OpenHermes by teknium">
## Dataset Summary
The Open Hermes 2/2.5 and Nous Hermes 2 models represent notable recent progress among state-of-the-art large language models (LLMs). These advancements stem from training on large-scale data curated specifically for language modeling tasks.
For further information, please visit [teknium/OpenHermes-2.5](https://huggingface.co/datasets/teknium/OpenHermes-2.5).
We hope the `Arabic-OpenHermes-2.5` dataset serves your needs well and propels your Arabic NLP endeavors to new heights!
## Citation
```bibtex
@misc{OpenHermes-2.5,
title = {OpenHermes 2.5: An Open Dataset of Synthetic Data for Generalist LLM Assistants},
author = {Teknium},
year = {2023},
publisher = {HuggingFace},
url = {https://huggingface.co/datasets/teknium/OpenHermes-2.5}
}
```
```bibtex
@misc{Arabic-OpenHermes-2.5,
  title = {Arabic OpenHermes 2.5: An Arabic version of Synthetic Data for Generalist Arabic LLM Assistants},
  author = {Marwa El Kamil and Mohammed Machrouh},
year = {2024},
publisher = {HuggingFace},
url = {https://huggingface.co/datasets/2A2I/Arabic-OpenHermes-2.5}
}
```
|
firtiadi19/Database | ---
license: apache-2.0
task_categories:
- text-classification
- token-classification
--- |
sakshat98/mistral_data | ---
license: apache-2.0
---
|
WillyArdiyanto/12-cat-breed-OxfordIIIT | ---
license: apache-2.0
---
|
baohuynhbk14/vin100h-preprocessed-16k-whisper-large-v3 | ---
dataset_info:
features:
- name: audio
dtype:
audio:
sampling_rate: 16000
- name: sentence
dtype: string
splits:
- name: train
num_bytes: 11612168195.068806
num_examples: 55298
- name: test
num_bytes: 233579703.02419397
num_examples: 1129
download_size: 11655040082
dataset_size: 11845747898.093
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
---
|
open-llm-leaderboard/details_YeungNLP__firefly-gemma-7b | ---
pretty_name: Evaluation run of YeungNLP/firefly-gemma-7b
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [YeungNLP/firefly-gemma-7b](https://huggingface.co/YeungNLP/firefly-gemma-7b)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configuration, each one coresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_YeungNLP__firefly-gemma-7b\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-03-01T01:53:59.085722](https://huggingface.co/datasets/open-llm-leaderboard/details_YeungNLP__firefly-gemma-7b/blob/main/results_2024-03-01T01-53-59.085722.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6152646951811004,\n\
\ \"acc_stderr\": 0.03288154638232489,\n \"acc_norm\": 0.6188816393957486,\n\
\ \"acc_norm_stderr\": 0.03353929852447101,\n \"mc1\": 0.3463892288861689,\n\
\ \"mc1_stderr\": 0.01665699710912514,\n \"mc2\": 0.49407922997214876,\n\
\ \"mc2_stderr\": 0.015395102370742033\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.5870307167235495,\n \"acc_stderr\": 0.014388344935398326,\n\
\ \"acc_norm\": 0.621160409556314,\n \"acc_norm_stderr\": 0.014175915490000324\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6016729735112527,\n\
\ \"acc_stderr\": 0.004885529674958336,\n \"acc_norm\": 0.7977494523003386,\n\
\ \"acc_norm_stderr\": 0.004008571431483689\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.21,\n \"acc_stderr\": 0.040936018074033256,\n \
\ \"acc_norm\": 0.21,\n \"acc_norm_stderr\": 0.040936018074033256\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.5333333333333333,\n\
\ \"acc_stderr\": 0.043097329010363554,\n \"acc_norm\": 0.5333333333333333,\n\
\ \"acc_norm_stderr\": 0.043097329010363554\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.6907894736842105,\n \"acc_stderr\": 0.037610708698674805,\n\
\ \"acc_norm\": 0.6907894736842105,\n \"acc_norm_stderr\": 0.037610708698674805\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.64,\n\
\ \"acc_stderr\": 0.04824181513244218,\n \"acc_norm\": 0.64,\n \
\ \"acc_norm_stderr\": 0.04824181513244218\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.6792452830188679,\n \"acc_stderr\": 0.028727502957880274,\n\
\ \"acc_norm\": 0.6792452830188679,\n \"acc_norm_stderr\": 0.028727502957880274\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7291666666666666,\n\
\ \"acc_stderr\": 0.03716177437566017,\n \"acc_norm\": 0.7291666666666666,\n\
\ \"acc_norm_stderr\": 0.03716177437566017\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.46,\n \"acc_stderr\": 0.05009082659620332,\n \
\ \"acc_norm\": 0.46,\n \"acc_norm_stderr\": 0.05009082659620332\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.41,\n \"acc_stderr\": 0.049431107042371025,\n \"acc_norm\": 0.41,\n\
\ \"acc_norm_stderr\": 0.049431107042371025\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.41,\n \"acc_stderr\": 0.049431107042371025,\n \
\ \"acc_norm\": 0.41,\n \"acc_norm_stderr\": 0.049431107042371025\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6589595375722543,\n\
\ \"acc_stderr\": 0.036146654241808254,\n \"acc_norm\": 0.6589595375722543,\n\
\ \"acc_norm_stderr\": 0.036146654241808254\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.35294117647058826,\n \"acc_stderr\": 0.04755129616062946,\n\
\ \"acc_norm\": 0.35294117647058826,\n \"acc_norm_stderr\": 0.04755129616062946\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.71,\n \"acc_stderr\": 0.04560480215720685,\n \"acc_norm\": 0.71,\n\
\ \"acc_norm_stderr\": 0.04560480215720685\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.574468085106383,\n \"acc_stderr\": 0.03232146916224468,\n\
\ \"acc_norm\": 0.574468085106383,\n \"acc_norm_stderr\": 0.03232146916224468\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.49122807017543857,\n\
\ \"acc_stderr\": 0.047028804320496165,\n \"acc_norm\": 0.49122807017543857,\n\
\ \"acc_norm_stderr\": 0.047028804320496165\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5724137931034483,\n \"acc_stderr\": 0.041227371113703316,\n\
\ \"acc_norm\": 0.5724137931034483,\n \"acc_norm_stderr\": 0.041227371113703316\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.4470899470899471,\n \"acc_stderr\": 0.025606723995777025,\n \"\
acc_norm\": 0.4470899470899471,\n \"acc_norm_stderr\": 0.025606723995777025\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.35714285714285715,\n\
\ \"acc_stderr\": 0.04285714285714281,\n \"acc_norm\": 0.35714285714285715,\n\
\ \"acc_norm_stderr\": 0.04285714285714281\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.42,\n \"acc_stderr\": 0.049604496374885836,\n \
\ \"acc_norm\": 0.42,\n \"acc_norm_stderr\": 0.049604496374885836\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\"\
: 0.7774193548387097,\n \"acc_stderr\": 0.02366421667164251,\n \"\
acc_norm\": 0.7774193548387097,\n \"acc_norm_stderr\": 0.02366421667164251\n\
\ },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\"\
: 0.49261083743842365,\n \"acc_stderr\": 0.035176035403610084,\n \"\
acc_norm\": 0.49261083743842365,\n \"acc_norm_stderr\": 0.035176035403610084\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.62,\n \"acc_stderr\": 0.048783173121456316,\n \"acc_norm\"\
: 0.62,\n \"acc_norm_stderr\": 0.048783173121456316\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7393939393939394,\n \"acc_stderr\": 0.034277431758165236,\n\
\ \"acc_norm\": 0.7393939393939394,\n \"acc_norm_stderr\": 0.034277431758165236\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.8131313131313131,\n \"acc_stderr\": 0.027772533334218964,\n \"\
acc_norm\": 0.8131313131313131,\n \"acc_norm_stderr\": 0.027772533334218964\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.8652849740932642,\n \"acc_stderr\": 0.024639789097709443,\n\
\ \"acc_norm\": 0.8652849740932642,\n \"acc_norm_stderr\": 0.024639789097709443\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.6076923076923076,\n \"acc_stderr\": 0.024756000382130952,\n\
\ \"acc_norm\": 0.6076923076923076,\n \"acc_norm_stderr\": 0.024756000382130952\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.3333333333333333,\n \"acc_stderr\": 0.0287420409039485,\n \
\ \"acc_norm\": 0.3333333333333333,\n \"acc_norm_stderr\": 0.0287420409039485\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.5966386554621849,\n \"acc_stderr\": 0.031866081214088314,\n\
\ \"acc_norm\": 0.5966386554621849,\n \"acc_norm_stderr\": 0.031866081214088314\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.3708609271523179,\n \"acc_stderr\": 0.03943966699183629,\n \"\
acc_norm\": 0.3708609271523179,\n \"acc_norm_stderr\": 0.03943966699183629\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8366972477064221,\n \"acc_stderr\": 0.015848255806501534,\n \"\
acc_norm\": 0.8366972477064221,\n \"acc_norm_stderr\": 0.015848255806501534\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.5509259259259259,\n \"acc_stderr\": 0.03392238405321617,\n \"\
acc_norm\": 0.5509259259259259,\n \"acc_norm_stderr\": 0.03392238405321617\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.7745098039215687,\n \"acc_stderr\": 0.02933116229425173,\n \"\
acc_norm\": 0.7745098039215687,\n \"acc_norm_stderr\": 0.02933116229425173\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.7805907172995781,\n \"acc_stderr\": 0.026939106581553945,\n \
\ \"acc_norm\": 0.7805907172995781,\n \"acc_norm_stderr\": 0.026939106581553945\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.672645739910314,\n\
\ \"acc_stderr\": 0.03149384670994131,\n \"acc_norm\": 0.672645739910314,\n\
\ \"acc_norm_stderr\": 0.03149384670994131\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.6641221374045801,\n \"acc_stderr\": 0.041423137719966634,\n\
\ \"acc_norm\": 0.6641221374045801,\n \"acc_norm_stderr\": 0.041423137719966634\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.768595041322314,\n \"acc_stderr\": 0.03849856098794088,\n \"acc_norm\"\
: 0.768595041322314,\n \"acc_norm_stderr\": 0.03849856098794088\n },\n\
\ \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7407407407407407,\n\
\ \"acc_stderr\": 0.042365112580946315,\n \"acc_norm\": 0.7407407407407407,\n\
\ \"acc_norm_stderr\": 0.042365112580946315\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7607361963190185,\n \"acc_stderr\": 0.03351953879521269,\n\
\ \"acc_norm\": 0.7607361963190185,\n \"acc_norm_stderr\": 0.03351953879521269\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.41964285714285715,\n\
\ \"acc_stderr\": 0.04684099321077106,\n \"acc_norm\": 0.41964285714285715,\n\
\ \"acc_norm_stderr\": 0.04684099321077106\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7669902912621359,\n \"acc_stderr\": 0.04185832598928315,\n\
\ \"acc_norm\": 0.7669902912621359,\n \"acc_norm_stderr\": 0.04185832598928315\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8717948717948718,\n\
\ \"acc_stderr\": 0.021901905115073325,\n \"acc_norm\": 0.8717948717948718,\n\
\ \"acc_norm_stderr\": 0.021901905115073325\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.69,\n \"acc_stderr\": 0.04648231987117316,\n \
\ \"acc_norm\": 0.69,\n \"acc_norm_stderr\": 0.04648231987117316\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8173690932311622,\n\
\ \"acc_stderr\": 0.013816335389973147,\n \"acc_norm\": 0.8173690932311622,\n\
\ \"acc_norm_stderr\": 0.013816335389973147\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.661849710982659,\n \"acc_stderr\": 0.025469770149400175,\n\
\ \"acc_norm\": 0.661849710982659,\n \"acc_norm_stderr\": 0.025469770149400175\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.2759776536312849,\n\
\ \"acc_stderr\": 0.014950103002475356,\n \"acc_norm\": 0.2759776536312849,\n\
\ \"acc_norm_stderr\": 0.014950103002475356\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.6699346405228758,\n \"acc_stderr\": 0.0269256546536157,\n\
\ \"acc_norm\": 0.6699346405228758,\n \"acc_norm_stderr\": 0.0269256546536157\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6752411575562701,\n\
\ \"acc_stderr\": 0.026596782287697043,\n \"acc_norm\": 0.6752411575562701,\n\
\ \"acc_norm_stderr\": 0.026596782287697043\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.7407407407407407,\n \"acc_stderr\": 0.024383665531035468,\n\
\ \"acc_norm\": 0.7407407407407407,\n \"acc_norm_stderr\": 0.024383665531035468\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.48936170212765956,\n \"acc_stderr\": 0.02982074719142248,\n \
\ \"acc_norm\": 0.48936170212765956,\n \"acc_norm_stderr\": 0.02982074719142248\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.44328552803129073,\n\
\ \"acc_stderr\": 0.012687818419599924,\n \"acc_norm\": 0.44328552803129073,\n\
\ \"acc_norm_stderr\": 0.012687818419599924\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.5625,\n \"acc_stderr\": 0.030134614954403924,\n \
\ \"acc_norm\": 0.5625,\n \"acc_norm_stderr\": 0.030134614954403924\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6601307189542484,\n \"acc_stderr\": 0.01916241858862356,\n \
\ \"acc_norm\": 0.6601307189542484,\n \"acc_norm_stderr\": 0.01916241858862356\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6727272727272727,\n\
\ \"acc_stderr\": 0.04494290866252091,\n \"acc_norm\": 0.6727272727272727,\n\
\ \"acc_norm_stderr\": 0.04494290866252091\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.6612244897959184,\n \"acc_stderr\": 0.030299506562154185,\n\
\ \"acc_norm\": 0.6612244897959184,\n \"acc_norm_stderr\": 0.030299506562154185\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8009950248756219,\n\
\ \"acc_stderr\": 0.028231365092758406,\n \"acc_norm\": 0.8009950248756219,\n\
\ \"acc_norm_stderr\": 0.028231365092758406\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.76,\n \"acc_stderr\": 0.04292346959909281,\n \
\ \"acc_norm\": 0.76,\n \"acc_norm_stderr\": 0.04292346959909281\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5180722891566265,\n\
\ \"acc_stderr\": 0.03889951252827216,\n \"acc_norm\": 0.5180722891566265,\n\
\ \"acc_norm_stderr\": 0.03889951252827216\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8245614035087719,\n \"acc_stderr\": 0.029170885500727665,\n\
\ \"acc_norm\": 0.8245614035087719,\n \"acc_norm_stderr\": 0.029170885500727665\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.3463892288861689,\n\
\ \"mc1_stderr\": 0.01665699710912514,\n \"mc2\": 0.49407922997214876,\n\
\ \"mc2_stderr\": 0.015395102370742033\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7545382794001578,\n \"acc_stderr\": 0.012095272937183649\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.4927975739196361,\n \
\ \"acc_stderr\": 0.013771055751972872\n }\n}\n```"
repo_url: https://huggingface.co/YeungNLP/firefly-gemma-7b
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_03_01T01_53_59.085722
path:
- '**/details_harness|arc:challenge|25_2024-03-01T01-53-59.085722.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-03-01T01-53-59.085722.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_03_01T01_53_59.085722
path:
- '**/details_harness|gsm8k|5_2024-03-01T01-53-59.085722.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-03-01T01-53-59.085722.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_03_01T01_53_59.085722
path:
- '**/details_harness|hellaswag|10_2024-03-01T01-53-59.085722.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-03-01T01-53-59.085722.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_03_01T01_53_59.085722
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-01T01-53-59.085722.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-01T01-53-59.085722.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-01T01-53-59.085722.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-01T01-53-59.085722.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-01T01-53-59.085722.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-01T01-53-59.085722.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-01T01-53-59.085722.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-01T01-53-59.085722.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-01T01-53-59.085722.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-01T01-53-59.085722.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-01T01-53-59.085722.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-01T01-53-59.085722.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-01T01-53-59.085722.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-01T01-53-59.085722.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-01T01-53-59.085722.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-01T01-53-59.085722.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-01T01-53-59.085722.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-01T01-53-59.085722.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-01T01-53-59.085722.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-01T01-53-59.085722.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-01T01-53-59.085722.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-01T01-53-59.085722.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-01T01-53-59.085722.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-01T01-53-59.085722.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-01T01-53-59.085722.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-01T01-53-59.085722.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-01T01-53-59.085722.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-01T01-53-59.085722.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-01T01-53-59.085722.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-01T01-53-59.085722.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-01T01-53-59.085722.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-01T01-53-59.085722.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-01T01-53-59.085722.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-01T01-53-59.085722.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-01T01-53-59.085722.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-01T01-53-59.085722.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-01T01-53-59.085722.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-01T01-53-59.085722.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-01T01-53-59.085722.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-01T01-53-59.085722.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-01T01-53-59.085722.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-01T01-53-59.085722.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-01T01-53-59.085722.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-01T01-53-59.085722.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-01T01-53-59.085722.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-01T01-53-59.085722.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-01T01-53-59.085722.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-01T01-53-59.085722.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-01T01-53-59.085722.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-01T01-53-59.085722.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-01T01-53-59.085722.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-01T01-53-59.085722.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-01T01-53-59.085722.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-01T01-53-59.085722.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-01T01-53-59.085722.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-01T01-53-59.085722.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-01T01-53-59.085722.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-01T01-53-59.085722.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-01T01-53-59.085722.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-01T01-53-59.085722.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-01T01-53-59.085722.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-01T01-53-59.085722.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-01T01-53-59.085722.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-01T01-53-59.085722.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-01T01-53-59.085722.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-01T01-53-59.085722.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-01T01-53-59.085722.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-01T01-53-59.085722.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-01T01-53-59.085722.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-01T01-53-59.085722.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-01T01-53-59.085722.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-01T01-53-59.085722.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-01T01-53-59.085722.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-01T01-53-59.085722.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-01T01-53-59.085722.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-01T01-53-59.085722.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-01T01-53-59.085722.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-01T01-53-59.085722.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-01T01-53-59.085722.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-01T01-53-59.085722.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-01T01-53-59.085722.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-01T01-53-59.085722.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-01T01-53-59.085722.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-01T01-53-59.085722.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-01T01-53-59.085722.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-01T01-53-59.085722.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-01T01-53-59.085722.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-01T01-53-59.085722.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-01T01-53-59.085722.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-01T01-53-59.085722.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-01T01-53-59.085722.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-01T01-53-59.085722.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-01T01-53-59.085722.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-01T01-53-59.085722.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-01T01-53-59.085722.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-01T01-53-59.085722.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-01T01-53-59.085722.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-01T01-53-59.085722.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-01T01-53-59.085722.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-01T01-53-59.085722.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-01T01-53-59.085722.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-01T01-53-59.085722.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-01T01-53-59.085722.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-01T01-53-59.085722.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-01T01-53-59.085722.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-01T01-53-59.085722.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-01T01-53-59.085722.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-01T01-53-59.085722.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-01T01-53-59.085722.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-01T01-53-59.085722.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-01T01-53-59.085722.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-01T01-53-59.085722.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-01T01-53-59.085722.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-01T01-53-59.085722.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_03_01T01_53_59.085722
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-01T01-53-59.085722.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-01T01-53-59.085722.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_03_01T01_53_59.085722
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-01T01-53-59.085722.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-01T01-53-59.085722.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_03_01T01_53_59.085722
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-01T01-53-59.085722.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-01T01-53-59.085722.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_03_01T01_53_59.085722
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-01T01-53-59.085722.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-01T01-53-59.085722.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_03_01T01_53_59.085722
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-01T01-53-59.085722.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-01T01-53-59.085722.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_03_01T01_53_59.085722
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-01T01-53-59.085722.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-01T01-53-59.085722.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_03_01T01_53_59.085722
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-01T01-53-59.085722.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-01T01-53-59.085722.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_03_01T01_53_59.085722
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-01T01-53-59.085722.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-01T01-53-59.085722.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_03_01T01_53_59.085722
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-01T01-53-59.085722.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-01T01-53-59.085722.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_03_01T01_53_59.085722
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-01T01-53-59.085722.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-01T01-53-59.085722.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_03_01T01_53_59.085722
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-01T01-53-59.085722.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-01T01-53-59.085722.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_03_01T01_53_59.085722
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-01T01-53-59.085722.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-01T01-53-59.085722.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_03_01T01_53_59.085722
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-01T01-53-59.085722.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-01T01-53-59.085722.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_03_01T01_53_59.085722
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-01T01-53-59.085722.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-01T01-53-59.085722.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_03_01T01_53_59.085722
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-01T01-53-59.085722.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-01T01-53-59.085722.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_03_01T01_53_59.085722
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-01T01-53-59.085722.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-01T01-53-59.085722.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_03_01T01_53_59.085722
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-01T01-53-59.085722.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-01T01-53-59.085722.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_03_01T01_53_59.085722
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-01T01-53-59.085722.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-01T01-53-59.085722.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_03_01T01_53_59.085722
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-01T01-53-59.085722.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-01T01-53-59.085722.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_03_01T01_53_59.085722
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-01T01-53-59.085722.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-01T01-53-59.085722.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_03_01T01_53_59.085722
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-01T01-53-59.085722.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-01T01-53-59.085722.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_03_01T01_53_59.085722
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-01T01-53-59.085722.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-01T01-53-59.085722.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_03_01T01_53_59.085722
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-01T01-53-59.085722.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-01T01-53-59.085722.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_03_01T01_53_59.085722
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-01T01-53-59.085722.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-01T01-53-59.085722.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_03_01T01_53_59.085722
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-01T01-53-59.085722.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-01T01-53-59.085722.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_03_01T01_53_59.085722
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-01T01-53-59.085722.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-01T01-53-59.085722.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_03_01T01_53_59.085722
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-01T01-53-59.085722.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-01T01-53-59.085722.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_03_01T01_53_59.085722
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-01T01-53-59.085722.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-01T01-53-59.085722.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_03_01T01_53_59.085722
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-01T01-53-59.085722.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-01T01-53-59.085722.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_03_01T01_53_59.085722
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-01T01-53-59.085722.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-01T01-53-59.085722.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_03_01T01_53_59.085722
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-01T01-53-59.085722.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-01T01-53-59.085722.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_03_01T01_53_59.085722
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-01T01-53-59.085722.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-01T01-53-59.085722.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_03_01T01_53_59.085722
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-01T01-53-59.085722.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-01T01-53-59.085722.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_03_01T01_53_59.085722
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-01T01-53-59.085722.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-01T01-53-59.085722.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_03_01T01_53_59.085722
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-01T01-53-59.085722.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-01T01-53-59.085722.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_03_01T01_53_59.085722
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-01T01-53-59.085722.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-01T01-53-59.085722.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_03_01T01_53_59.085722
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-01T01-53-59.085722.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-01T01-53-59.085722.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_03_01T01_53_59.085722
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-01T01-53-59.085722.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-01T01-53-59.085722.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_03_01T01_53_59.085722
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-01T01-53-59.085722.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-01T01-53-59.085722.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_03_01T01_53_59.085722
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-01T01-53-59.085722.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-01T01-53-59.085722.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_03_01T01_53_59.085722
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-01T01-53-59.085722.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-01T01-53-59.085722.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_03_01T01_53_59.085722
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-01T01-53-59.085722.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-01T01-53-59.085722.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_03_01T01_53_59.085722
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-01T01-53-59.085722.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-01T01-53-59.085722.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_03_01T01_53_59.085722
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-01T01-53-59.085722.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-01T01-53-59.085722.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_03_01T01_53_59.085722
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-01T01-53-59.085722.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-01T01-53-59.085722.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_03_01T01_53_59.085722
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-01T01-53-59.085722.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-01T01-53-59.085722.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_03_01T01_53_59.085722
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-01T01-53-59.085722.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-01T01-53-59.085722.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_03_01T01_53_59.085722
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-01T01-53-59.085722.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-01T01-53-59.085722.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_03_01T01_53_59.085722
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-01T01-53-59.085722.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-01T01-53-59.085722.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_03_01T01_53_59.085722
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-01T01-53-59.085722.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-01T01-53-59.085722.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_03_01T01_53_59.085722
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-01T01-53-59.085722.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-01T01-53-59.085722.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_03_01T01_53_59.085722
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-01T01-53-59.085722.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-01T01-53-59.085722.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_03_01T01_53_59.085722
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-01T01-53-59.085722.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-01T01-53-59.085722.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_03_01T01_53_59.085722
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-01T01-53-59.085722.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-01T01-53-59.085722.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_03_01T01_53_59.085722
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-01T01-53-59.085722.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-01T01-53-59.085722.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_03_01T01_53_59.085722
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-01T01-53-59.085722.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-01T01-53-59.085722.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_03_01T01_53_59.085722
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-01T01-53-59.085722.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-01T01-53-59.085722.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_03_01T01_53_59.085722
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-01T01-53-59.085722.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-01T01-53-59.085722.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_03_01T01_53_59.085722
path:
- '**/details_harness|winogrande|5_2024-03-01T01-53-59.085722.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-03-01T01-53-59.085722.parquet'
- config_name: results
data_files:
- split: 2024_03_01T01_53_59.085722
path:
- results_2024-03-01T01-53-59.085722.parquet
- split: latest
path:
- results_2024-03-01T01-53-59.085722.parquet
---
# Dataset Card for Evaluation run of YeungNLP/firefly-gemma-7b
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [YeungNLP/firefly-gemma-7b](https://huggingface.co/YeungNLP/firefly-gemma-7b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_YeungNLP__firefly-gemma-7b",
"harness_winogrande_5",
	split="latest")
```
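Each harness task name in the results JSON (e.g. `harness|truthfulqa:mc|0`) maps to a config name in this dataset by replacing the `|`, `:`, and `-` separators with underscores. A small helper (hypothetical, not part of the `datasets` API) sketching this observed mapping:

```python
def task_to_config(task_name: str) -> str:
    """Map a harness task name from the results JSON to the corresponding
    dataset config name, based on the naming pattern observed in this card.

    e.g. "harness|truthfulqa:mc|0" -> "harness_truthfulqa_mc_0"
    """
    for sep in "|:-":
        task_name = task_name.replace(sep, "_")
    return task_name

print(task_to_config("harness|hendrycksTest-world_religions|5"))
# harness_hendrycksTest_world_religions_5
```

This is only a convenience for constructing the second argument to `load_dataset`; the authoritative list of config names is the `configs` section of this card's YAML header.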
## Latest results
These are the [latest results from run 2024-03-01T01:53:59.085722](https://huggingface.co/datasets/open-llm-leaderboard/details_YeungNLP__firefly-gemma-7b/blob/main/results_2024-03-01T01-53-59.085722.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6152646951811004,
"acc_stderr": 0.03288154638232489,
"acc_norm": 0.6188816393957486,
"acc_norm_stderr": 0.03353929852447101,
"mc1": 0.3463892288861689,
"mc1_stderr": 0.01665699710912514,
"mc2": 0.49407922997214876,
"mc2_stderr": 0.015395102370742033
},
"harness|arc:challenge|25": {
"acc": 0.5870307167235495,
"acc_stderr": 0.014388344935398326,
"acc_norm": 0.621160409556314,
"acc_norm_stderr": 0.014175915490000324
},
"harness|hellaswag|10": {
"acc": 0.6016729735112527,
"acc_stderr": 0.004885529674958336,
"acc_norm": 0.7977494523003386,
"acc_norm_stderr": 0.004008571431483689
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.21,
"acc_stderr": 0.040936018074033256,
"acc_norm": 0.21,
"acc_norm_stderr": 0.040936018074033256
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.5333333333333333,
"acc_stderr": 0.043097329010363554,
"acc_norm": 0.5333333333333333,
"acc_norm_stderr": 0.043097329010363554
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6907894736842105,
"acc_stderr": 0.037610708698674805,
"acc_norm": 0.6907894736842105,
"acc_norm_stderr": 0.037610708698674805
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.64,
"acc_stderr": 0.04824181513244218,
"acc_norm": 0.64,
"acc_norm_stderr": 0.04824181513244218
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6792452830188679,
"acc_stderr": 0.028727502957880274,
"acc_norm": 0.6792452830188679,
"acc_norm_stderr": 0.028727502957880274
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7291666666666666,
"acc_stderr": 0.03716177437566017,
"acc_norm": 0.7291666666666666,
"acc_norm_stderr": 0.03716177437566017
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.46,
"acc_stderr": 0.05009082659620332,
"acc_norm": 0.46,
"acc_norm_stderr": 0.05009082659620332
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.41,
"acc_stderr": 0.049431107042371025,
"acc_norm": 0.41,
"acc_norm_stderr": 0.049431107042371025
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.41,
"acc_stderr": 0.049431107042371025,
"acc_norm": 0.41,
"acc_norm_stderr": 0.049431107042371025
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6589595375722543,
"acc_stderr": 0.036146654241808254,
"acc_norm": 0.6589595375722543,
"acc_norm_stderr": 0.036146654241808254
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.35294117647058826,
"acc_stderr": 0.04755129616062946,
"acc_norm": 0.35294117647058826,
"acc_norm_stderr": 0.04755129616062946
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.71,
"acc_stderr": 0.04560480215720685,
"acc_norm": 0.71,
"acc_norm_stderr": 0.04560480215720685
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.574468085106383,
"acc_stderr": 0.03232146916224468,
"acc_norm": 0.574468085106383,
"acc_norm_stderr": 0.03232146916224468
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.49122807017543857,
"acc_stderr": 0.047028804320496165,
"acc_norm": 0.49122807017543857,
"acc_norm_stderr": 0.047028804320496165
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5724137931034483,
"acc_stderr": 0.041227371113703316,
"acc_norm": 0.5724137931034483,
"acc_norm_stderr": 0.041227371113703316
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.4470899470899471,
"acc_stderr": 0.025606723995777025,
"acc_norm": 0.4470899470899471,
"acc_norm_stderr": 0.025606723995777025
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.35714285714285715,
"acc_stderr": 0.04285714285714281,
"acc_norm": 0.35714285714285715,
"acc_norm_stderr": 0.04285714285714281
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.42,
"acc_stderr": 0.049604496374885836,
"acc_norm": 0.42,
"acc_norm_stderr": 0.049604496374885836
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7774193548387097,
"acc_stderr": 0.02366421667164251,
"acc_norm": 0.7774193548387097,
"acc_norm_stderr": 0.02366421667164251
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.49261083743842365,
"acc_stderr": 0.035176035403610084,
"acc_norm": 0.49261083743842365,
"acc_norm_stderr": 0.035176035403610084
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.62,
"acc_stderr": 0.048783173121456316,
"acc_norm": 0.62,
"acc_norm_stderr": 0.048783173121456316
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7393939393939394,
"acc_stderr": 0.034277431758165236,
"acc_norm": 0.7393939393939394,
"acc_norm_stderr": 0.034277431758165236
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.8131313131313131,
"acc_stderr": 0.027772533334218964,
"acc_norm": 0.8131313131313131,
"acc_norm_stderr": 0.027772533334218964
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8652849740932642,
"acc_stderr": 0.024639789097709443,
"acc_norm": 0.8652849740932642,
"acc_norm_stderr": 0.024639789097709443
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6076923076923076,
"acc_stderr": 0.024756000382130952,
"acc_norm": 0.6076923076923076,
"acc_norm_stderr": 0.024756000382130952
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.3333333333333333,
"acc_stderr": 0.0287420409039485,
"acc_norm": 0.3333333333333333,
"acc_norm_stderr": 0.0287420409039485
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.5966386554621849,
"acc_stderr": 0.031866081214088314,
"acc_norm": 0.5966386554621849,
"acc_norm_stderr": 0.031866081214088314
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.3708609271523179,
"acc_stderr": 0.03943966699183629,
"acc_norm": 0.3708609271523179,
"acc_norm_stderr": 0.03943966699183629
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8366972477064221,
"acc_stderr": 0.015848255806501534,
"acc_norm": 0.8366972477064221,
"acc_norm_stderr": 0.015848255806501534
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5509259259259259,
"acc_stderr": 0.03392238405321617,
"acc_norm": 0.5509259259259259,
"acc_norm_stderr": 0.03392238405321617
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7745098039215687,
"acc_stderr": 0.02933116229425173,
"acc_norm": 0.7745098039215687,
"acc_norm_stderr": 0.02933116229425173
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7805907172995781,
"acc_stderr": 0.026939106581553945,
"acc_norm": 0.7805907172995781,
"acc_norm_stderr": 0.026939106581553945
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.672645739910314,
"acc_stderr": 0.03149384670994131,
"acc_norm": 0.672645739910314,
"acc_norm_stderr": 0.03149384670994131
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.6641221374045801,
"acc_stderr": 0.041423137719966634,
"acc_norm": 0.6641221374045801,
"acc_norm_stderr": 0.041423137719966634
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.768595041322314,
"acc_stderr": 0.03849856098794088,
"acc_norm": 0.768595041322314,
"acc_norm_stderr": 0.03849856098794088
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7407407407407407,
"acc_stderr": 0.042365112580946315,
"acc_norm": 0.7407407407407407,
"acc_norm_stderr": 0.042365112580946315
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7607361963190185,
"acc_stderr": 0.03351953879521269,
"acc_norm": 0.7607361963190185,
"acc_norm_stderr": 0.03351953879521269
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.41964285714285715,
"acc_stderr": 0.04684099321077106,
"acc_norm": 0.41964285714285715,
"acc_norm_stderr": 0.04684099321077106
},
"harness|hendrycksTest-management|5": {
"acc": 0.7669902912621359,
"acc_stderr": 0.04185832598928315,
"acc_norm": 0.7669902912621359,
"acc_norm_stderr": 0.04185832598928315
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8717948717948718,
"acc_stderr": 0.021901905115073325,
"acc_norm": 0.8717948717948718,
"acc_norm_stderr": 0.021901905115073325
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.69,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.69,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8173690932311622,
"acc_stderr": 0.013816335389973147,
"acc_norm": 0.8173690932311622,
"acc_norm_stderr": 0.013816335389973147
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.661849710982659,
"acc_stderr": 0.025469770149400175,
"acc_norm": 0.661849710982659,
"acc_norm_stderr": 0.025469770149400175
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.2759776536312849,
"acc_stderr": 0.014950103002475356,
"acc_norm": 0.2759776536312849,
"acc_norm_stderr": 0.014950103002475356
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.6699346405228758,
"acc_stderr": 0.0269256546536157,
"acc_norm": 0.6699346405228758,
"acc_norm_stderr": 0.0269256546536157
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6752411575562701,
"acc_stderr": 0.026596782287697043,
"acc_norm": 0.6752411575562701,
"acc_norm_stderr": 0.026596782287697043
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7407407407407407,
"acc_stderr": 0.024383665531035468,
"acc_norm": 0.7407407407407407,
"acc_norm_stderr": 0.024383665531035468
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.48936170212765956,
"acc_stderr": 0.02982074719142248,
"acc_norm": 0.48936170212765956,
"acc_norm_stderr": 0.02982074719142248
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.44328552803129073,
"acc_stderr": 0.012687818419599924,
"acc_norm": 0.44328552803129073,
"acc_norm_stderr": 0.012687818419599924
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.5625,
"acc_stderr": 0.030134614954403924,
"acc_norm": 0.5625,
"acc_norm_stderr": 0.030134614954403924
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6601307189542484,
"acc_stderr": 0.01916241858862356,
"acc_norm": 0.6601307189542484,
"acc_norm_stderr": 0.01916241858862356
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6727272727272727,
"acc_stderr": 0.04494290866252091,
"acc_norm": 0.6727272727272727,
"acc_norm_stderr": 0.04494290866252091
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.6612244897959184,
"acc_stderr": 0.030299506562154185,
"acc_norm": 0.6612244897959184,
"acc_norm_stderr": 0.030299506562154185
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8009950248756219,
"acc_stderr": 0.028231365092758406,
"acc_norm": 0.8009950248756219,
"acc_norm_stderr": 0.028231365092758406
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.76,
"acc_stderr": 0.04292346959909281,
"acc_norm": 0.76,
"acc_norm_stderr": 0.04292346959909281
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5180722891566265,
"acc_stderr": 0.03889951252827216,
"acc_norm": 0.5180722891566265,
"acc_norm_stderr": 0.03889951252827216
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8245614035087719,
"acc_stderr": 0.029170885500727665,
"acc_norm": 0.8245614035087719,
"acc_norm_stderr": 0.029170885500727665
},
"harness|truthfulqa:mc|0": {
"mc1": 0.3463892288861689,
"mc1_stderr": 0.01665699710912514,
"mc2": 0.49407922997214876,
"mc2_stderr": 0.015395102370742033
},
"harness|winogrande|5": {
"acc": 0.7545382794001578,
"acc_stderr": 0.012095272937183649
},
"harness|gsm8k|5": {
"acc": 0.4927975739196361,
"acc_stderr": 0.013771055751972872
}
}
```
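As a quick sanity check, the headline leaderboard score can be recomputed from the metrics above. This sketch assumes the leaderboard's plain six-way mean of ARC `acc_norm`, HellaSwag `acc_norm`, the MMLU average, TruthfulQA `mc2`, Winogrande `acc`, and GSM8K `acc`; the `"all"` `acc` value is used here as a rough MMLU proxy, so both the weighting and that proxy are assumptions:

```python
# Recompute the headline average from the six benchmark scores
# reported in the results JSON above. The plain arithmetic mean
# is an assumption about the leaderboard's weighting.
scores = {
    "arc_challenge_acc_norm": 0.621160409556314,
    "hellaswag_acc_norm": 0.7977494523003386,
    "mmlu_acc": 0.6152646951811004,  # "all" acc, used as an MMLU proxy
    "truthfulqa_mc2": 0.49407922997214876,
    "winogrande_acc": 0.7545382794001578,
    "gsm8k_acc": 0.4927975739196361,
}

average = sum(scores.values()) / len(scores)
print(f"headline average: {average:.4f}")  # → headline average: 0.6293
```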
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed]
---
pretty_name: Evaluation run of chargoddard/average-dolphin-8x7B
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [chargoddard/average-dolphin-8x7B](https://huggingface.co/chargoddard/average-dolphin-8x7B)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one\
\ of the evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_chargoddard__average-dolphin-8x7B\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-01-06T07:27:11.992896](https://huggingface.co/datasets/open-llm-leaderboard/details_chargoddard__average-dolphin-8x7B/blob/main/results_2024-01-06T07-27-11.992896.json)\
\ (note that there might be results for other tasks in the repo if successive evals\
\ didn't cover the same tasks; you can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.7063086568039458,\n\
\ \"acc_stderr\": 0.030431562081853093,\n \"acc_norm\": 0.7105668691204242,\n\
\ \"acc_norm_stderr\": 0.03102008354677035,\n \"mc1\": 0.3769889840881273,\n\
\ \"mc1_stderr\": 0.016965517578930354,\n \"mc2\": 0.5450592796762483,\n\
\ \"mc2_stderr\": 0.015077693207960957\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.6552901023890785,\n \"acc_stderr\": 0.01388881628678211,\n\
\ \"acc_norm\": 0.6860068259385665,\n \"acc_norm_stderr\": 0.0135626912247263\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6726747659828719,\n\
\ \"acc_stderr\": 0.004682780790508314,\n \"acc_norm\": 0.8598884684325832,\n\
\ \"acc_norm_stderr\": 0.0034639332860638833\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.37,\n \"acc_stderr\": 0.04852365870939099,\n \
\ \"acc_norm\": 0.37,\n \"acc_norm_stderr\": 0.04852365870939099\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.674074074074074,\n\
\ \"acc_stderr\": 0.040491220417025055,\n \"acc_norm\": 0.674074074074074,\n\
\ \"acc_norm_stderr\": 0.040491220417025055\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.7960526315789473,\n \"acc_stderr\": 0.0327900040631005,\n\
\ \"acc_norm\": 0.7960526315789473,\n \"acc_norm_stderr\": 0.0327900040631005\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.72,\n\
\ \"acc_stderr\": 0.04512608598542127,\n \"acc_norm\": 0.72,\n \
\ \"acc_norm_stderr\": 0.04512608598542127\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.7886792452830189,\n \"acc_stderr\": 0.025125766484827845,\n\
\ \"acc_norm\": 0.7886792452830189,\n \"acc_norm_stderr\": 0.025125766484827845\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.8333333333333334,\n\
\ \"acc_stderr\": 0.0311648996669486,\n \"acc_norm\": 0.8333333333333334,\n\
\ \"acc_norm_stderr\": 0.0311648996669486\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.51,\n \"acc_stderr\": 0.05024183937956912,\n \
\ \"acc_norm\": 0.51,\n \"acc_norm_stderr\": 0.05024183937956912\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.68,\n \"acc_stderr\": 0.04688261722621504,\n \"acc_norm\": 0.68,\n\
\ \"acc_norm_stderr\": 0.04688261722621504\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.46,\n \"acc_stderr\": 0.05009082659620332,\n \
\ \"acc_norm\": 0.46,\n \"acc_norm_stderr\": 0.05009082659620332\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6936416184971098,\n\
\ \"acc_stderr\": 0.035149425512674394,\n \"acc_norm\": 0.6936416184971098,\n\
\ \"acc_norm_stderr\": 0.035149425512674394\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.47058823529411764,\n \"acc_stderr\": 0.049665709039785295,\n\
\ \"acc_norm\": 0.47058823529411764,\n \"acc_norm_stderr\": 0.049665709039785295\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.8,\n \"acc_stderr\": 0.04020151261036845,\n \"acc_norm\": 0.8,\n\
\ \"acc_norm_stderr\": 0.04020151261036845\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.676595744680851,\n \"acc_stderr\": 0.030579442773610334,\n\
\ \"acc_norm\": 0.676595744680851,\n \"acc_norm_stderr\": 0.030579442773610334\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.6578947368421053,\n\
\ \"acc_stderr\": 0.044629175353369376,\n \"acc_norm\": 0.6578947368421053,\n\
\ \"acc_norm_stderr\": 0.044629175353369376\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.6482758620689655,\n \"acc_stderr\": 0.0397923663749741,\n\
\ \"acc_norm\": 0.6482758620689655,\n \"acc_norm_stderr\": 0.0397923663749741\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.5,\n \"acc_stderr\": 0.025751310131230234,\n \"acc_norm\": 0.5,\n\
\ \"acc_norm_stderr\": 0.025751310131230234\n },\n \"harness|hendrycksTest-formal_logic|5\"\
: {\n \"acc\": 0.5634920634920635,\n \"acc_stderr\": 0.04435932892851466,\n\
\ \"acc_norm\": 0.5634920634920635,\n \"acc_norm_stderr\": 0.04435932892851466\n\
\ },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.43,\n\
\ \"acc_stderr\": 0.049756985195624284,\n \"acc_norm\": 0.43,\n \
\ \"acc_norm_stderr\": 0.049756985195624284\n },\n \"harness|hendrycksTest-high_school_biology|5\"\
: {\n \"acc\": 0.8387096774193549,\n \"acc_stderr\": 0.020923327006423298,\n\
\ \"acc_norm\": 0.8387096774193549,\n \"acc_norm_stderr\": 0.020923327006423298\n\
\ },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\"\
: 0.5911330049261084,\n \"acc_stderr\": 0.03459058815883232,\n \"\
acc_norm\": 0.5911330049261084,\n \"acc_norm_stderr\": 0.03459058815883232\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.72,\n \"acc_stderr\": 0.04512608598542128,\n \"acc_norm\"\
: 0.72,\n \"acc_norm_stderr\": 0.04512608598542128\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.8,\n \"acc_stderr\": 0.03123475237772117,\n \
\ \"acc_norm\": 0.8,\n \"acc_norm_stderr\": 0.03123475237772117\n },\n\
\ \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.8484848484848485,\n\
\ \"acc_stderr\": 0.02554565042660362,\n \"acc_norm\": 0.8484848484848485,\n\
\ \"acc_norm_stderr\": 0.02554565042660362\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\"\
: {\n \"acc\": 0.9430051813471503,\n \"acc_stderr\": 0.016731085293607555,\n\
\ \"acc_norm\": 0.9430051813471503,\n \"acc_norm_stderr\": 0.016731085293607555\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.7051282051282052,\n \"acc_stderr\": 0.0231193627582323,\n \
\ \"acc_norm\": 0.7051282051282052,\n \"acc_norm_stderr\": 0.0231193627582323\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.3888888888888889,\n \"acc_stderr\": 0.029723278961476664,\n \
\ \"acc_norm\": 0.3888888888888889,\n \"acc_norm_stderr\": 0.029723278961476664\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.7941176470588235,\n \"acc_stderr\": 0.026265024608275882,\n\
\ \"acc_norm\": 0.7941176470588235,\n \"acc_norm_stderr\": 0.026265024608275882\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.40397350993377484,\n \"acc_stderr\": 0.040064856853653415,\n \"\
acc_norm\": 0.40397350993377484,\n \"acc_norm_stderr\": 0.040064856853653415\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8752293577981651,\n \"acc_stderr\": 0.014168298359156336,\n \"\
acc_norm\": 0.8752293577981651,\n \"acc_norm_stderr\": 0.014168298359156336\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.5694444444444444,\n \"acc_stderr\": 0.03376922151252336,\n \"\
acc_norm\": 0.5694444444444444,\n \"acc_norm_stderr\": 0.03376922151252336\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.8725490196078431,\n \"acc_stderr\": 0.023405530480846315,\n \"\
acc_norm\": 0.8725490196078431,\n \"acc_norm_stderr\": 0.023405530480846315\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.8818565400843882,\n \"acc_stderr\": 0.021011052659878463,\n \
\ \"acc_norm\": 0.8818565400843882,\n \"acc_norm_stderr\": 0.021011052659878463\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.7399103139013453,\n\
\ \"acc_stderr\": 0.029442495585857483,\n \"acc_norm\": 0.7399103139013453,\n\
\ \"acc_norm_stderr\": 0.029442495585857483\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.816793893129771,\n \"acc_stderr\": 0.03392770926494733,\n\
\ \"acc_norm\": 0.816793893129771,\n \"acc_norm_stderr\": 0.03392770926494733\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.8512396694214877,\n \"acc_stderr\": 0.03248470083807194,\n \"\
acc_norm\": 0.8512396694214877,\n \"acc_norm_stderr\": 0.03248470083807194\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.8333333333333334,\n\
\ \"acc_stderr\": 0.036028141763926456,\n \"acc_norm\": 0.8333333333333334,\n\
\ \"acc_norm_stderr\": 0.036028141763926456\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7914110429447853,\n \"acc_stderr\": 0.03192193448934725,\n\
\ \"acc_norm\": 0.7914110429447853,\n \"acc_norm_stderr\": 0.03192193448934725\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.49107142857142855,\n\
\ \"acc_stderr\": 0.04745033255489123,\n \"acc_norm\": 0.49107142857142855,\n\
\ \"acc_norm_stderr\": 0.04745033255489123\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.8446601941747572,\n \"acc_stderr\": 0.03586594738573973,\n\
\ \"acc_norm\": 0.8446601941747572,\n \"acc_norm_stderr\": 0.03586594738573973\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.9102564102564102,\n\
\ \"acc_stderr\": 0.01872430174194165,\n \"acc_norm\": 0.9102564102564102,\n\
\ \"acc_norm_stderr\": 0.01872430174194165\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.82,\n \"acc_stderr\": 0.03861229196653694,\n \
\ \"acc_norm\": 0.82,\n \"acc_norm_stderr\": 0.03861229196653694\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8876117496807152,\n\
\ \"acc_stderr\": 0.011294541351216537,\n \"acc_norm\": 0.8876117496807152,\n\
\ \"acc_norm_stderr\": 0.011294541351216537\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.7658959537572254,\n \"acc_stderr\": 0.022797110278071134,\n\
\ \"acc_norm\": 0.7658959537572254,\n \"acc_norm_stderr\": 0.022797110278071134\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.45027932960893857,\n\
\ \"acc_stderr\": 0.01663961523684581,\n \"acc_norm\": 0.45027932960893857,\n\
\ \"acc_norm_stderr\": 0.01663961523684581\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.8006535947712419,\n \"acc_stderr\": 0.02287581699346407,\n\
\ \"acc_norm\": 0.8006535947712419,\n \"acc_norm_stderr\": 0.02287581699346407\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7781350482315113,\n\
\ \"acc_stderr\": 0.02359885829286305,\n \"acc_norm\": 0.7781350482315113,\n\
\ \"acc_norm_stderr\": 0.02359885829286305\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.8209876543209876,\n \"acc_stderr\": 0.02133086876212706,\n\
\ \"acc_norm\": 0.8209876543209876,\n \"acc_norm_stderr\": 0.02133086876212706\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.5531914893617021,\n \"acc_stderr\": 0.02965823509766691,\n \
\ \"acc_norm\": 0.5531914893617021,\n \"acc_norm_stderr\": 0.02965823509766691\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.5267275097783573,\n\
\ \"acc_stderr\": 0.012751977967676005,\n \"acc_norm\": 0.5267275097783573,\n\
\ \"acc_norm_stderr\": 0.012751977967676005\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.7683823529411765,\n \"acc_stderr\": 0.025626533803777562,\n\
\ \"acc_norm\": 0.7683823529411765,\n \"acc_norm_stderr\": 0.025626533803777562\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.761437908496732,\n \"acc_stderr\": 0.0172423858287796,\n \
\ \"acc_norm\": 0.761437908496732,\n \"acc_norm_stderr\": 0.0172423858287796\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.7,\n\
\ \"acc_stderr\": 0.04389311454644287,\n \"acc_norm\": 0.7,\n \
\ \"acc_norm_stderr\": 0.04389311454644287\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.7959183673469388,\n \"acc_stderr\": 0.025801283475090496,\n\
\ \"acc_norm\": 0.7959183673469388,\n \"acc_norm_stderr\": 0.025801283475090496\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8656716417910447,\n\
\ \"acc_stderr\": 0.024112678240900798,\n \"acc_norm\": 0.8656716417910447,\n\
\ \"acc_norm_stderr\": 0.024112678240900798\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.88,\n \"acc_stderr\": 0.03265986323710906,\n \
\ \"acc_norm\": 0.88,\n \"acc_norm_stderr\": 0.03265986323710906\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5240963855421686,\n\
\ \"acc_stderr\": 0.03887971849597264,\n \"acc_norm\": 0.5240963855421686,\n\
\ \"acc_norm_stderr\": 0.03887971849597264\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8947368421052632,\n \"acc_stderr\": 0.02353755765789256,\n\
\ \"acc_norm\": 0.8947368421052632,\n \"acc_norm_stderr\": 0.02353755765789256\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.3769889840881273,\n\
\ \"mc1_stderr\": 0.016965517578930354,\n \"mc2\": 0.5450592796762483,\n\
\ \"mc2_stderr\": 0.015077693207960957\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.813733228097869,\n \"acc_stderr\": 0.01094187795567621\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.5655799848369977,\n \
\ \"acc_stderr\": 0.013653507211411417\n }\n}\n```"
repo_url: https://huggingface.co/chargoddard/average-dolphin-8x7B
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_01_06T07_27_11.992896
path:
- '**/details_harness|arc:challenge|25_2024-01-06T07-27-11.992896.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-01-06T07-27-11.992896.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_01_06T07_27_11.992896
path:
- '**/details_harness|gsm8k|5_2024-01-06T07-27-11.992896.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-01-06T07-27-11.992896.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_01_06T07_27_11.992896
path:
- '**/details_harness|hellaswag|10_2024-01-06T07-27-11.992896.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-01-06T07-27-11.992896.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_01_06T07_27_11.992896
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-06T07-27-11.992896.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-06T07-27-11.992896.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-06T07-27-11.992896.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-06T07-27-11.992896.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-06T07-27-11.992896.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-06T07-27-11.992896.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-06T07-27-11.992896.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-06T07-27-11.992896.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-06T07-27-11.992896.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-06T07-27-11.992896.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-06T07-27-11.992896.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-06T07-27-11.992896.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-06T07-27-11.992896.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-06T07-27-11.992896.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-06T07-27-11.992896.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-06T07-27-11.992896.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-06T07-27-11.992896.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-06T07-27-11.992896.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-06T07-27-11.992896.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-06T07-27-11.992896.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-06T07-27-11.992896.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-06T07-27-11.992896.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-06T07-27-11.992896.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-06T07-27-11.992896.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-06T07-27-11.992896.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-06T07-27-11.992896.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-06T07-27-11.992896.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-06T07-27-11.992896.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-06T07-27-11.992896.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-06T07-27-11.992896.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-06T07-27-11.992896.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-06T07-27-11.992896.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-06T07-27-11.992896.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-06T07-27-11.992896.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-01-06T07-27-11.992896.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-06T07-27-11.992896.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-06T07-27-11.992896.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-06T07-27-11.992896.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-01-06T07-27-11.992896.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-01-06T07-27-11.992896.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-06T07-27-11.992896.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-06T07-27-11.992896.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-06T07-27-11.992896.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-06T07-27-11.992896.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-06T07-27-11.992896.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-06T07-27-11.992896.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-06T07-27-11.992896.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-06T07-27-11.992896.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-06T07-27-11.992896.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-06T07-27-11.992896.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-06T07-27-11.992896.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-06T07-27-11.992896.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-06T07-27-11.992896.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-01-06T07-27-11.992896.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-06T07-27-11.992896.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-01-06T07-27-11.992896.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-06T07-27-11.992896.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-06T07-27-11.992896.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-06T07-27-11.992896.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-06T07-27-11.992896.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-06T07-27-11.992896.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-06T07-27-11.992896.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-06T07-27-11.992896.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-06T07-27-11.992896.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-06T07-27-11.992896.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-06T07-27-11.992896.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-06T07-27-11.992896.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-06T07-27-11.992896.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-06T07-27-11.992896.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-06T07-27-11.992896.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-06T07-27-11.992896.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-06T07-27-11.992896.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-06T07-27-11.992896.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-06T07-27-11.992896.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-06T07-27-11.992896.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-06T07-27-11.992896.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-06T07-27-11.992896.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-06T07-27-11.992896.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-06T07-27-11.992896.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-06T07-27-11.992896.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-06T07-27-11.992896.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-06T07-27-11.992896.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-06T07-27-11.992896.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-06T07-27-11.992896.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-06T07-27-11.992896.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-06T07-27-11.992896.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-06T07-27-11.992896.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-06T07-27-11.992896.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-06T07-27-11.992896.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-06T07-27-11.992896.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-06T07-27-11.992896.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-01-06T07-27-11.992896.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-06T07-27-11.992896.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-06T07-27-11.992896.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-06T07-27-11.992896.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-01-06T07-27-11.992896.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-01-06T07-27-11.992896.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-06T07-27-11.992896.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-06T07-27-11.992896.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-06T07-27-11.992896.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-06T07-27-11.992896.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-06T07-27-11.992896.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-06T07-27-11.992896.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-06T07-27-11.992896.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-06T07-27-11.992896.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-06T07-27-11.992896.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-06T07-27-11.992896.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-06T07-27-11.992896.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-06T07-27-11.992896.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-06T07-27-11.992896.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-01-06T07-27-11.992896.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-06T07-27-11.992896.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-01-06T07-27-11.992896.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-06T07-27-11.992896.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_01_06T07_27_11.992896
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-06T07-27-11.992896.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-06T07-27-11.992896.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_01_06T07_27_11.992896
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-06T07-27-11.992896.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-06T07-27-11.992896.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_01_06T07_27_11.992896
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-06T07-27-11.992896.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-06T07-27-11.992896.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_01_06T07_27_11.992896
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-06T07-27-11.992896.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-06T07-27-11.992896.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_01_06T07_27_11.992896
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-06T07-27-11.992896.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-06T07-27-11.992896.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_01_06T07_27_11.992896
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-06T07-27-11.992896.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-06T07-27-11.992896.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_01_06T07_27_11.992896
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-06T07-27-11.992896.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-06T07-27-11.992896.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_01_06T07_27_11.992896
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-06T07-27-11.992896.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-06T07-27-11.992896.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_01_06T07_27_11.992896
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-06T07-27-11.992896.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-06T07-27-11.992896.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_01_06T07_27_11.992896
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-06T07-27-11.992896.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-06T07-27-11.992896.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_01_06T07_27_11.992896
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-06T07-27-11.992896.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-06T07-27-11.992896.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_01_06T07_27_11.992896
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-06T07-27-11.992896.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-06T07-27-11.992896.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_01_06T07_27_11.992896
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-06T07-27-11.992896.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-06T07-27-11.992896.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_01_06T07_27_11.992896
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-06T07-27-11.992896.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-06T07-27-11.992896.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_01_06T07_27_11.992896
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-06T07-27-11.992896.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-06T07-27-11.992896.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_01_06T07_27_11.992896
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-06T07-27-11.992896.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-06T07-27-11.992896.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_01_06T07_27_11.992896
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-06T07-27-11.992896.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-06T07-27-11.992896.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_01_06T07_27_11.992896
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-06T07-27-11.992896.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-06T07-27-11.992896.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_01_06T07_27_11.992896
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-06T07-27-11.992896.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-06T07-27-11.992896.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_01_06T07_27_11.992896
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-06T07-27-11.992896.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-06T07-27-11.992896.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_01_06T07_27_11.992896
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-06T07-27-11.992896.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-06T07-27-11.992896.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_01_06T07_27_11.992896
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-06T07-27-11.992896.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-06T07-27-11.992896.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_01_06T07_27_11.992896
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-06T07-27-11.992896.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-06T07-27-11.992896.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_01_06T07_27_11.992896
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-06T07-27-11.992896.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-06T07-27-11.992896.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_01_06T07_27_11.992896
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-06T07-27-11.992896.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-06T07-27-11.992896.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_01_06T07_27_11.992896
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-06T07-27-11.992896.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-06T07-27-11.992896.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_01_06T07_27_11.992896
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-06T07-27-11.992896.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-06T07-27-11.992896.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_01_06T07_27_11.992896
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-06T07-27-11.992896.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-06T07-27-11.992896.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_01_06T07_27_11.992896
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-06T07-27-11.992896.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-06T07-27-11.992896.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_01_06T07_27_11.992896
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-06T07-27-11.992896.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-06T07-27-11.992896.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_01_06T07_27_11.992896
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-06T07-27-11.992896.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-06T07-27-11.992896.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_01_06T07_27_11.992896
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-06T07-27-11.992896.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-06T07-27-11.992896.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_01_06T07_27_11.992896
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-06T07-27-11.992896.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-06T07-27-11.992896.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_01_06T07_27_11.992896
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-06T07-27-11.992896.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-06T07-27-11.992896.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_01_06T07_27_11.992896
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-01-06T07-27-11.992896.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-01-06T07-27-11.992896.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_01_06T07_27_11.992896
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-06T07-27-11.992896.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-06T07-27-11.992896.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_01_06T07_27_11.992896
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-06T07-27-11.992896.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-06T07-27-11.992896.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_01_06T07_27_11.992896
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-06T07-27-11.992896.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-06T07-27-11.992896.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_01_06T07_27_11.992896
path:
- '**/details_harness|hendrycksTest-management|5_2024-01-06T07-27-11.992896.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-01-06T07-27-11.992896.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_01_06T07_27_11.992896
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-01-06T07-27-11.992896.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-01-06T07-27-11.992896.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_01_06T07_27_11.992896
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-06T07-27-11.992896.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-06T07-27-11.992896.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_01_06T07_27_11.992896
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-06T07-27-11.992896.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-06T07-27-11.992896.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_01_06T07_27_11.992896
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-06T07-27-11.992896.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-06T07-27-11.992896.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_01_06T07_27_11.992896
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-06T07-27-11.992896.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-06T07-27-11.992896.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_01_06T07_27_11.992896
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-06T07-27-11.992896.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-06T07-27-11.992896.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_01_06T07_27_11.992896
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-06T07-27-11.992896.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-06T07-27-11.992896.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_01_06T07_27_11.992896
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-06T07-27-11.992896.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-06T07-27-11.992896.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_01_06T07_27_11.992896
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-06T07-27-11.992896.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-06T07-27-11.992896.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_01_06T07_27_11.992896
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-06T07-27-11.992896.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-06T07-27-11.992896.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_01_06T07_27_11.992896
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-06T07-27-11.992896.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-06T07-27-11.992896.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_01_06T07_27_11.992896
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-06T07-27-11.992896.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-06T07-27-11.992896.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_01_06T07_27_11.992896
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-06T07-27-11.992896.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-06T07-27-11.992896.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_01_06T07_27_11.992896
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-06T07-27-11.992896.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-06T07-27-11.992896.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_01_06T07_27_11.992896
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-01-06T07-27-11.992896.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-01-06T07-27-11.992896.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_01_06T07_27_11.992896
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-06T07-27-11.992896.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-06T07-27-11.992896.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_01_06T07_27_11.992896
path:
- '**/details_harness|hendrycksTest-virology|5_2024-01-06T07-27-11.992896.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-01-06T07-27-11.992896.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_01_06T07_27_11.992896
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-06T07-27-11.992896.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-06T07-27-11.992896.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_01_06T07_27_11.992896
path:
- '**/details_harness|truthfulqa:mc|0_2024-01-06T07-27-11.992896.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-01-06T07-27-11.992896.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_01_06T07_27_11.992896
path:
- '**/details_harness|winogrande|5_2024-01-06T07-27-11.992896.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-01-06T07-27-11.992896.parquet'
- config_name: results
data_files:
- split: 2024_01_06T07_27_11.992896
path:
- results_2024-01-06T07-27-11.992896.parquet
- split: latest
path:
- results_2024-01-06T07-27-11.992896.parquet
---
# Dataset Card for Evaluation run of chargoddard/average-dolphin-8x7B
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [chargoddard/average-dolphin-8x7B](https://huggingface.co/chargoddard/average-dolphin-8x7B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.

The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results.

An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).

To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_chargoddard__average-dolphin-8x7B",
"harness_winogrande_5",
split="train")
```
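Because the timestamped split names follow a sortable `YYYY_MM_DDTHH_MM_SS.ffffff` pattern, the most recent run can also be identified without relying on the `latest` alias. A minimal sketch, assuming split names match the format shown in the configuration list above:

```python
def most_recent_split(split_names):
    """Return the newest timestamped split from a config's split names.

    The "latest" alias is equivalent, but explicit sorting is useful when
    comparing several runs. Timestamps like "2024_01_06T07_27_11.992896"
    sort chronologically as plain strings.
    """
    timestamped = [s for s in split_names if s != "latest"]
    return max(timestamped)

splits = ["latest", "2024_01_06T07_27_11.992896"]
print(most_recent_split(splits))  # -> 2024_01_06T07_27_11.992896
```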
## Latest results
These are the [latest results from run 2024-01-06T07:27:11.992896](https://huggingface.co/datasets/open-llm-leaderboard/details_chargoddard__average-dolphin-8x7B/blob/main/results_2024-01-06T07-27-11.992896.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.7063086568039458,
"acc_stderr": 0.030431562081853093,
"acc_norm": 0.7105668691204242,
"acc_norm_stderr": 0.03102008354677035,
"mc1": 0.3769889840881273,
"mc1_stderr": 0.016965517578930354,
"mc2": 0.5450592796762483,
"mc2_stderr": 0.015077693207960957
},
"harness|arc:challenge|25": {
"acc": 0.6552901023890785,
"acc_stderr": 0.01388881628678211,
"acc_norm": 0.6860068259385665,
"acc_norm_stderr": 0.0135626912247263
},
"harness|hellaswag|10": {
"acc": 0.6726747659828719,
"acc_stderr": 0.004682780790508314,
"acc_norm": 0.8598884684325832,
"acc_norm_stderr": 0.0034639332860638833
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.37,
"acc_stderr": 0.04852365870939099,
"acc_norm": 0.37,
"acc_norm_stderr": 0.04852365870939099
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.674074074074074,
"acc_stderr": 0.040491220417025055,
"acc_norm": 0.674074074074074,
"acc_norm_stderr": 0.040491220417025055
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.7960526315789473,
"acc_stderr": 0.0327900040631005,
"acc_norm": 0.7960526315789473,
"acc_norm_stderr": 0.0327900040631005
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.72,
"acc_stderr": 0.04512608598542127,
"acc_norm": 0.72,
"acc_norm_stderr": 0.04512608598542127
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.7886792452830189,
"acc_stderr": 0.025125766484827845,
"acc_norm": 0.7886792452830189,
"acc_norm_stderr": 0.025125766484827845
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.8333333333333334,
"acc_stderr": 0.0311648996669486,
"acc_norm": 0.8333333333333334,
"acc_norm_stderr": 0.0311648996669486
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.51,
"acc_stderr": 0.05024183937956912,
"acc_norm": 0.51,
"acc_norm_stderr": 0.05024183937956912
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.68,
"acc_stderr": 0.04688261722621504,
"acc_norm": 0.68,
"acc_norm_stderr": 0.04688261722621504
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.46,
"acc_stderr": 0.05009082659620332,
"acc_norm": 0.46,
"acc_norm_stderr": 0.05009082659620332
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6936416184971098,
"acc_stderr": 0.035149425512674394,
"acc_norm": 0.6936416184971098,
"acc_norm_stderr": 0.035149425512674394
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.47058823529411764,
"acc_stderr": 0.049665709039785295,
"acc_norm": 0.47058823529411764,
"acc_norm_stderr": 0.049665709039785295
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.8,
"acc_stderr": 0.04020151261036845,
"acc_norm": 0.8,
"acc_norm_stderr": 0.04020151261036845
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.676595744680851,
"acc_stderr": 0.030579442773610334,
"acc_norm": 0.676595744680851,
"acc_norm_stderr": 0.030579442773610334
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.6578947368421053,
"acc_stderr": 0.044629175353369376,
"acc_norm": 0.6578947368421053,
"acc_norm_stderr": 0.044629175353369376
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.6482758620689655,
"acc_stderr": 0.0397923663749741,
"acc_norm": 0.6482758620689655,
"acc_norm_stderr": 0.0397923663749741
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.5,
"acc_stderr": 0.025751310131230234,
"acc_norm": 0.5,
"acc_norm_stderr": 0.025751310131230234
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.5634920634920635,
"acc_stderr": 0.04435932892851466,
"acc_norm": 0.5634920634920635,
"acc_norm_stderr": 0.04435932892851466
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.43,
"acc_stderr": 0.049756985195624284,
"acc_norm": 0.43,
"acc_norm_stderr": 0.049756985195624284
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.8387096774193549,
"acc_stderr": 0.020923327006423298,
"acc_norm": 0.8387096774193549,
"acc_norm_stderr": 0.020923327006423298
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5911330049261084,
"acc_stderr": 0.03459058815883232,
"acc_norm": 0.5911330049261084,
"acc_norm_stderr": 0.03459058815883232
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.72,
"acc_stderr": 0.04512608598542128,
"acc_norm": 0.72,
"acc_norm_stderr": 0.04512608598542128
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.8,
"acc_stderr": 0.03123475237772117,
"acc_norm": 0.8,
"acc_norm_stderr": 0.03123475237772117
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.8484848484848485,
"acc_stderr": 0.02554565042660362,
"acc_norm": 0.8484848484848485,
"acc_norm_stderr": 0.02554565042660362
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.9430051813471503,
"acc_stderr": 0.016731085293607555,
"acc_norm": 0.9430051813471503,
"acc_norm_stderr": 0.016731085293607555
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.7051282051282052,
"acc_stderr": 0.0231193627582323,
"acc_norm": 0.7051282051282052,
"acc_norm_stderr": 0.0231193627582323
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.3888888888888889,
"acc_stderr": 0.029723278961476664,
"acc_norm": 0.3888888888888889,
"acc_norm_stderr": 0.029723278961476664
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.7941176470588235,
"acc_stderr": 0.026265024608275882,
"acc_norm": 0.7941176470588235,
"acc_norm_stderr": 0.026265024608275882
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.40397350993377484,
"acc_stderr": 0.040064856853653415,
"acc_norm": 0.40397350993377484,
"acc_norm_stderr": 0.040064856853653415
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8752293577981651,
"acc_stderr": 0.014168298359156336,
"acc_norm": 0.8752293577981651,
"acc_norm_stderr": 0.014168298359156336
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5694444444444444,
"acc_stderr": 0.03376922151252336,
"acc_norm": 0.5694444444444444,
"acc_norm_stderr": 0.03376922151252336
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8725490196078431,
"acc_stderr": 0.023405530480846315,
"acc_norm": 0.8725490196078431,
"acc_norm_stderr": 0.023405530480846315
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.8818565400843882,
"acc_stderr": 0.021011052659878463,
"acc_norm": 0.8818565400843882,
"acc_norm_stderr": 0.021011052659878463
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.7399103139013453,
"acc_stderr": 0.029442495585857483,
"acc_norm": 0.7399103139013453,
"acc_norm_stderr": 0.029442495585857483
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.816793893129771,
"acc_stderr": 0.03392770926494733,
"acc_norm": 0.816793893129771,
"acc_norm_stderr": 0.03392770926494733
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.8512396694214877,
"acc_stderr": 0.03248470083807194,
"acc_norm": 0.8512396694214877,
"acc_norm_stderr": 0.03248470083807194
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.8333333333333334,
"acc_stderr": 0.036028141763926456,
"acc_norm": 0.8333333333333334,
"acc_norm_stderr": 0.036028141763926456
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7914110429447853,
"acc_stderr": 0.03192193448934725,
"acc_norm": 0.7914110429447853,
"acc_norm_stderr": 0.03192193448934725
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.49107142857142855,
"acc_stderr": 0.04745033255489123,
"acc_norm": 0.49107142857142855,
"acc_norm_stderr": 0.04745033255489123
},
"harness|hendrycksTest-management|5": {
"acc": 0.8446601941747572,
"acc_stderr": 0.03586594738573973,
"acc_norm": 0.8446601941747572,
"acc_norm_stderr": 0.03586594738573973
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.9102564102564102,
"acc_stderr": 0.01872430174194165,
"acc_norm": 0.9102564102564102,
"acc_norm_stderr": 0.01872430174194165
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.82,
"acc_stderr": 0.03861229196653694,
"acc_norm": 0.82,
"acc_norm_stderr": 0.03861229196653694
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8876117496807152,
"acc_stderr": 0.011294541351216537,
"acc_norm": 0.8876117496807152,
"acc_norm_stderr": 0.011294541351216537
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7658959537572254,
"acc_stderr": 0.022797110278071134,
"acc_norm": 0.7658959537572254,
"acc_norm_stderr": 0.022797110278071134
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.45027932960893857,
"acc_stderr": 0.01663961523684581,
"acc_norm": 0.45027932960893857,
"acc_norm_stderr": 0.01663961523684581
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.8006535947712419,
"acc_stderr": 0.02287581699346407,
"acc_norm": 0.8006535947712419,
"acc_norm_stderr": 0.02287581699346407
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7781350482315113,
"acc_stderr": 0.02359885829286305,
"acc_norm": 0.7781350482315113,
"acc_norm_stderr": 0.02359885829286305
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.8209876543209876,
"acc_stderr": 0.02133086876212706,
"acc_norm": 0.8209876543209876,
"acc_norm_stderr": 0.02133086876212706
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.5531914893617021,
"acc_stderr": 0.02965823509766691,
"acc_norm": 0.5531914893617021,
"acc_norm_stderr": 0.02965823509766691
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.5267275097783573,
"acc_stderr": 0.012751977967676005,
"acc_norm": 0.5267275097783573,
"acc_norm_stderr": 0.012751977967676005
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.7683823529411765,
"acc_stderr": 0.025626533803777562,
"acc_norm": 0.7683823529411765,
"acc_norm_stderr": 0.025626533803777562
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.761437908496732,
"acc_stderr": 0.0172423858287796,
"acc_norm": 0.761437908496732,
"acc_norm_stderr": 0.0172423858287796
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.7,
"acc_stderr": 0.04389311454644287,
"acc_norm": 0.7,
"acc_norm_stderr": 0.04389311454644287
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7959183673469388,
"acc_stderr": 0.025801283475090496,
"acc_norm": 0.7959183673469388,
"acc_norm_stderr": 0.025801283475090496
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8656716417910447,
"acc_stderr": 0.024112678240900798,
"acc_norm": 0.8656716417910447,
"acc_norm_stderr": 0.024112678240900798
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.88,
"acc_stderr": 0.03265986323710906,
"acc_norm": 0.88,
"acc_norm_stderr": 0.03265986323710906
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5240963855421686,
"acc_stderr": 0.03887971849597264,
"acc_norm": 0.5240963855421686,
"acc_norm_stderr": 0.03887971849597264
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8947368421052632,
"acc_stderr": 0.02353755765789256,
"acc_norm": 0.8947368421052632,
"acc_norm_stderr": 0.02353755765789256
},
"harness|truthfulqa:mc|0": {
"mc1": 0.3769889840881273,
"mc1_stderr": 0.016965517578930354,
"mc2": 0.5450592796762483,
"mc2_stderr": 0.015077693207960957
},
"harness|winogrande|5": {
"acc": 0.813733228097869,
"acc_stderr": 0.01094187795567621
},
"harness|gsm8k|5": {
"acc": 0.5655799848369977,
"acc_stderr": 0.013653507211411417
}
}
```
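The per-task scores above can be collapsed into a single average. A minimal sketch, assuming the results are available as JSON (the subset below copies a few figures from the block above; the real file has one entry per task):

```python
import json

# A small illustrative subset of the harness results shown above.
results = json.loads("""
{
  "harness|hendrycksTest-college_chemistry|5": {"acc": 0.51, "acc_norm": 0.51},
  "harness|hendrycksTest-college_computer_science|5": {"acc": 0.68, "acc_norm": 0.68},
  "harness|hendrycksTest-college_mathematics|5": {"acc": 0.46, "acc_norm": 0.46}
}
""")

# Average acc_norm over all hendrycksTest (MMLU) tasks.
mmlu = [v["acc_norm"] for k, v in results.items() if "hendrycksTest" in k]
average = sum(mmlu) / len(mmlu)
print(round(average, 4))  # 0.55
```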
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
ErikQQY/test | ---
license: mit
task_categories:
- text-generation
language:
- zh
pretty_name: Standards
size_categories:
- 100K<n<1M
tags:
- code
--- |
AdapterOcean/med_alpaca_standardized_cluster_67 | ---
dataset_info:
features:
- name: text
dtype: string
- name: conversation_id
dtype: int64
- name: embedding
sequence: float64
- name: cluster
dtype: int64
splits:
- name: train
num_bytes: 78542435
num_examples: 7727
download_size: 24649167
dataset_size: 78542435
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "med_alpaca_standardized_cluster_67"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
CVasNLPExperiments/Hatefulmemes_validation_google_flan_t5_xxl_mode_C_HM_A_T_OCR_rices_ns_500 | ---
dataset_info:
features:
- name: id
dtype: int64
- name: prompt
sequence: string
- name: true_label
dtype: string
- name: prediction
dtype: string
splits:
- name: fewshot_0
num_bytes: 593208
num_examples: 500
download_size: 0
dataset_size: 593208
---
# Dataset Card for "Hatefulmemes_validation_google_flan_t5_xxl_mode_C_HM_A_T_OCR_rices_ns_500"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
dim/tldr_17_3k | ---
dataset_info:
features:
- name: author
dtype: string
- name: body
dtype: string
- name: normalizedBody
dtype: string
- name: subreddit
dtype: string
- name: subreddit_id
dtype: string
- name: id
dtype: string
- name: content
dtype: string
- name: summary
dtype: string
splits:
- name: train
num_bytes: 14761884.702975057
num_examples: 3000
download_size: 9479190
dataset_size: 14761884.702975057
---
# Dataset Card for "tldr_17_3k"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
shaheenahmedc/goal_captions | ---
license: unknown
---
|
AyoubChLin/Cnn_news_article_Sembedding | ---
license: apache-2.0
---
|
eco4cast/neon4cast-scores | ---
license: cc0-1.0
tags:
- climate
- biology
---
Snapshot of the Ecological Forecasting Initiative NEON Forecasting Challenge
Includes probabilistic forecasts, observations, and skill scores across all submitted forecasts over 5 challenge themes.
|
rwq-elo/rwq-answers | ---
language:
- en
license: cc-by-nc-4.0
---
# RWQ-Answers Dataset
This dataset contains answers from 24 popular LLMs to the 20,772 [RWQ](https://huggingface.co/datasets/rwq-elo/rwq-questions) questions.
Some cells may be empty, because an online model refused to answer by policy or a local model generated an empty answer.
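Because of those empty cells, a simple pre-filter is useful before analysis. A minimal sketch with illustrative rows (the real column layout may differ):

```python
# Toy rows mimicking the answers table; the real column names may differ.
rows = [
    {"question": "q1", "gpt-4-turbo": "answer a"},
    {"question": "q2", "gpt-4-turbo": ""},          # e.g. rejected by policy
    {"question": "q3", "gpt-4-turbo": "answer c"},
]

# Treat empty or whitespace-only cells as missing.
answered = [r for r in rows if r["gpt-4-turbo"].strip()]
print(len(answered))  # 2
```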
## Model List
| model |
| --- |
| gpt-4-turbo |
| gpt-35-turbo |
| lmsys/vicuna-7b-v1.5 |
| lmsys/vicuna-13b-v1.5 |
| lmsys/vicuna-33b-v1.3 |
| meta-llama/Llama-2-7b-chat-hf |
| meta-llama/Llama-2-13b-chat-hf |
| meta-llama/Llama-2-70b-chat-hf |
| chavinlo/alpaca-native |
| chavinlo/alpaca-13b |
| mosaicml/mpt-7b-chat |
| mosaicml/mpt-30b-chat |
| WizardLM/WizardLM-7B-V1.0 |
| WizardLM/WizardLM-13B-V1.2 |
| Xwin-LM/Xwin-LM-7B-V0.1 |
| Xwin-LM/Xwin-LM-13B-V0.1 |
| tiiuae/falcon-7b-instruct |
| tiiuae/falcon-40b-instruct |
| HuggingFaceH4/zephyr-7b-beta |
| huggyllama/llama-7b |
| huggyllama/llama-13b |
| huggyllama/llama-30b |
| gemini |
| mistralai/Mixtral-8x7B-Instruct-v0.1 |
## Citation
TODO |
BEE-spoke-data/falcon-refinedweb-100k_en_med-sample | ---
language:
- en
license: odc-by
size_categories:
- 10K<n<100K
source_datasets: tiiuae/falcon-refinedweb
task_categories:
- text-generation
dataset_info:
- config_name: default
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 485240640
num_examples: 100000
download_size: 299772551
dataset_size: 485240640
- config_name: embeddings-text-nomic_text_v1
features:
- name: text
dtype: string
- name: text-embedding
sequence: float64
splits:
- name: train
num_bytes: 1100040640
num_examples: 100000
download_size: 802872607
dataset_size: 1100040640
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- config_name: embeddings-text-nomic_text_v1
data_files:
- split: train
path: embeddings-text-nomic_text_v1/train-*
---
# BEE-spoke-data/falcon-refinedweb-100k_en_med-sample
A sample from [falcon-refinedweb](https://huggingface.co/datasets/tiiuae/falcon-refinedweb):
- more than 512 & less than 8192 llama2 tokens
- `en` only (via fasttext-langdetect)
- 100k samples |
esteler-ai/idn-news-az | ---
license: cc-by-4.0
task_categories:
- text-generation
- fill-mask
language:
- id
pretty_name: a
size_categories:
- 1M<n<10M
---
The dataset was collected by scraping the following Indonesian news portals:
* detik.com
* suara.com
* cnnindonesia.com
* kompas.com
* kontan.co.id
* bisnis.com
* investor.id
* mojok.co
* cnbcindonesia.com
* sindonews.com
* tribunnews.com
* okezone.com
* tempo.co.id
* vivanews.co.id
* antaranews.com
* metronews.com
The corpus was collected from news published between January 2023 and October 2023, though a few articles were written in 2022. |
captian-seal/models | ---
license: openrail
---
|
kish786/chatbot | ---
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 400
num_examples: 100
download_size: 715
dataset_size: 400
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
result-kand2-sdxl-wuerst-karlo/7162bca1 | ---
dataset_info:
features:
- name: result
dtype: string
- name: id
dtype: int64
splits:
- name: train
num_bytes: 227
num_examples: 10
download_size: 1422
dataset_size: 227
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "7162bca1"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
ReyDev/ws-resolutions | ---
license: mit
task_categories:
- text-classification
pretty_name: Ws Resolutions
size_categories:
- 100K<n<1M
--- |
JnKamasMTS/third_test | ---
license: mit
---
|
chenxxiao/lyfpictures | ---
license: apache-2.0
---
|
neibla/debates | ---
license: mit
---
|
paduraru2009/imdb-sample | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
dataset_info:
features:
- name: label
dtype: int64
- name: text
dtype: string
splits:
- name: train
num_bytes: 40027944
num_examples: 30000
- name: validation
num_bytes: 39047740
num_examples: 30000
download_size: 50419531
dataset_size: 79075684
---
# Dataset Card for "imdb-sample"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
fahmiaziz/alpaca-new | ---
dataset_info:
features:
- name: instruction
dtype: string
- name: output
dtype: string
splits:
- name: train
num_bytes: 2010022
num_examples: 2000
download_size: 1242409
dataset_size: 2010022
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
sambob/fscoco_sketch | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
dataset_info:
features:
- name: image
dtype: image
- name: guide
dtype: image
- name: text
dtype: string
splits:
- name: train
num_bytes: 5215215630.0
num_examples: 10000
download_size: 5215442232
dataset_size: 5215215630.0
---
# Dataset Card for "fscoco_sketch"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
davmel/ka_homonym_disambiguation | ---
license: mit
task_categories:
- text-classification
language:
- ka
size_categories:
- 1M<n<10M
---
# Georgian-Homonym-Disambiguation
This repository contains all the datasets for the Georgian homonym disambiguation task.
For more specific details you can read my <a href="https://github.com/davmels/Georgian-Homonym-Disambiguation/blob/main/homonym_disambiguation_in_the_georgian_language.pdf">article</a>
## Dataset
At this point I've considered only the homonym "ბარი" and its different grammatical forms, obtaining 7,522 sentences.
The "dataset.parquet" includes:
- 763 sentences using "ბარი" as a "shovel", labeled with 0
- 1846 sentences using "ბარი" as a "lowland", labeled with 1
- 3320 sentences using "ბარი" as a "cafe", labeled with 2
- 1593 sentences where the homonym is used in a different context, labeled with 3 (although these sentences could be further classified by the definitions of the homonyms, for this project I've ignored other usages).
The column `homonym_index` contains the word-level index of the homonym within the sentence.
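As a sketch of how `homonym_index` can be used, the homonym can be recovered by word index. The English sentence below is purely illustrative; the real data stores Georgian sentences:

```python
# Illustrative row; the real dataset stores Georgian sentences in a parquet file.
row = {
    "sentence": "the old shovel lay in the barn",
    "homonym_index": 2,   # word-level index of the homonym
    "label": 0,           # e.g. 0 = "shovel" sense
}

# Split on whitespace and index into the word list.
words = row["sentence"].split()
homonym = words[row["homonym_index"]]
print(homonym)  # shovel
```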
The "full-homonym-sentences-ბარ.txt" includes the sentences which contain the homonym "ბარი" and it's various grammatical forms. These sentences were limited to a
maximum length of 13 words, with the homonym positioned in the middle of each sentence. They are around 28000 and are not labelled. |
TrainingDataPro/customers-reviews-on-banks | ---
license: cc-by-nc-nd-4.0
task_categories:
- text-classification
language:
- en
tags:
- code
- finance
---
# Customers Reviews on Banks ⭐️
The Reviews on Banks Dataset is a comprehensive collection of the **20,000** most recent customer reviews of **48** US banks.
This dataset, containing diverse reviews of multiple banks, can be useful for *sentiment analysis, assessing geographical variations in customer satisfaction, and exploring customer preferences through textual data*.
Understanding customer sentiments and preferences helps **banks** improve their services and address any issues raised by customers in their reviews.
# Get the dataset
### This is just an example of the data
Leave a request on [**https://trainingdata.pro/data-market**](https://trainingdata.pro/data-market?utm_source=huggingface&utm_medium=cpc&utm_campaign=customers-reviews-on-banks) to discuss your requirements, learn about the price and buy the dataset.
# Content
For each item, we extracted:
- **author**: name of the reviewer,
- **date**: date of the review,
- **location**: location of the reviewer,
- **bank**: bank which is reviewed
- **star**: number of stars given to the bank by the reviewer,
- **text**: text of the review,
- **like**: number of likes on the review
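Fields such as `bank` and `star` lend themselves to simple aggregations. A minimal sketch with made-up rows (the real dataset has 20,000 reviews across 48 banks):

```python
from collections import defaultdict

# Illustrative rows following the fields listed above.
reviews = [
    {"bank": "Bank A", "star": 5},
    {"bank": "Bank A", "star": 3},
    {"bank": "Bank B", "star": 4},
]

# Average star rating per bank.
totals = defaultdict(list)
for r in reviews:
    totals[r["bank"]].append(r["star"])
averages = {bank: sum(s) / len(s) for bank, s in totals.items()}
print(averages)  # {'Bank A': 4.0, 'Bank B': 4.0}
```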
## [**TrainingData**](https://trainingdata.pro/data-market?utm_source=huggingface&utm_medium=cpc&utm_campaign=customers-reviews-on-banks) provides high-quality data annotation tailored to your needs
More datasets in TrainingData's Kaggle account: **https://www.kaggle.com/trainingdatapro/datasets**
TrainingData's GitHub: **https://github.com/Trainingdata-datamarket/TrainingData_All_datasets** |
AlekseyKorshuk/davinci-pairwise-tokenized | ---
dataset_info:
features:
- name: prompt
dtype: string
- name: chosen
dtype: string
- name: rejected
dtype: string
splits:
- name: train
num_bytes: 2530035200
num_examples: 64759
- name: test
num_bytes: 36178476
num_examples: 7195
download_size: 0
dataset_size: 2566213676
---
# Dataset Card for "davinci-pairwise-tokenized"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
tgokhale/vqa_lol | ---
license: cc-by-nc-nd-4.0
---
|
ovior/twitter_dataset_1713007213 | ---
dataset_info:
features:
- name: id
dtype: string
- name: tweet_content
dtype: string
- name: user_name
dtype: string
- name: user_id
dtype: string
- name: created_at
dtype: string
- name: url
dtype: string
- name: favourite_count
dtype: int64
- name: scraped_at
dtype: string
- name: image_urls
dtype: string
splits:
- name: train
num_bytes: 2471691
num_examples: 7287
download_size: 1421790
dataset_size: 2471691
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
freshpearYoon/train_free_51 | ---
dataset_info:
features:
- name: input_features
sequence:
sequence: float32
- name: labels
sequence: int64
splits:
- name: train
num_bytes: 9604549616
num_examples: 10000
download_size: 1233012485
dataset_size: 9604549616
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
percins/IN-ABS | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
- split: test
path: data/test-*
dataset_info:
features:
- name: text
dtype: string
- name: summary
dtype: string
- name: file
dtype: string
splits:
- name: train
num_bytes: 160084476
num_examples: 5346
- name: validation
num_bytes: 22684426
num_examples: 712
- name: test
num_bytes: 30578218
num_examples: 1070
download_size: 103908520
dataset_size: 213347120
---
# Dataset Card for "IN-ABS"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
KAUE24122023/SimonyCantora | ---
license: openrail
---
|
open-llm-leaderboard/details_elliotthwang__Elliott-Chinese-LLaMa-GPTQ-V2.0 | ---
pretty_name: Evaluation run of elliotthwang/Elliott-Chinese-LLaMa-GPTQ-V2.0
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [elliotthwang/Elliott-Chinese-LLaMa-GPTQ-V2.0](https://huggingface.co/elliotthwang/Elliott-Chinese-LLaMa-GPTQ-V2.0)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 64 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 2 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the agregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_elliotthwang__Elliott-Chinese-LLaMa-GPTQ-V2.0\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-10-23T22:40:41.083519](https://huggingface.co/datasets/open-llm-leaderboard/details_elliotthwang__Elliott-Chinese-LLaMa-GPTQ-V2.0/blob/main/results_2023-10-23T22-40-41.083519.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.10864093959731544,\n\
\ \"em_stderr\": 0.0031868582704839116,\n \"f1\": 0.15248846476509997,\n\
\ \"f1_stderr\": 0.0032781428140160563,\n \"acc\": 0.4430471890103538,\n\
\ \"acc_stderr\": 0.011315335280186417\n },\n \"harness|drop|3\": {\n\
\ \"em\": 0.10864093959731544,\n \"em_stderr\": 0.0031868582704839116,\n\
\ \"f1\": 0.15248846476509997,\n \"f1_stderr\": 0.0032781428140160563\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.1599696739954511,\n \
\ \"acc_stderr\": 0.010097377827752538\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7261247040252565,\n \"acc_stderr\": 0.012533292732620296\n\
\ }\n}\n```"
repo_url: https://huggingface.co/elliotthwang/Elliott-Chinese-LLaMa-GPTQ-V2.0
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_09_16T11_13_20.345757
path:
- '**/details_harness|arc:challenge|25_2023-09-16T11-13-20.345757.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-09-16T11-13-20.345757.parquet'
- config_name: harness_drop_3
data_files:
- split: 2023_10_23T22_40_41.083519
path:
- '**/details_harness|drop|3_2023-10-23T22-40-41.083519.parquet'
- split: latest
path:
- '**/details_harness|drop|3_2023-10-23T22-40-41.083519.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_10_23T22_40_41.083519
path:
- '**/details_harness|gsm8k|5_2023-10-23T22-40-41.083519.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-10-23T22-40-41.083519.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_09_16T11_13_20.345757
path:
- '**/details_harness|hellaswag|10_2023-09-16T11-13-20.345757.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-09-16T11-13-20.345757.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_09_16T11_13_20.345757
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-16T11-13-20.345757.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-16T11-13-20.345757.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-16T11-13-20.345757.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-16T11-13-20.345757.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-16T11-13-20.345757.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-16T11-13-20.345757.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-16T11-13-20.345757.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-16T11-13-20.345757.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-16T11-13-20.345757.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-16T11-13-20.345757.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-16T11-13-20.345757.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-16T11-13-20.345757.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-16T11-13-20.345757.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-16T11-13-20.345757.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-16T11-13-20.345757.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-16T11-13-20.345757.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-16T11-13-20.345757.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-16T11-13-20.345757.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-16T11-13-20.345757.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-16T11-13-20.345757.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-16T11-13-20.345757.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-16T11-13-20.345757.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-16T11-13-20.345757.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-16T11-13-20.345757.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-16T11-13-20.345757.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-16T11-13-20.345757.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-16T11-13-20.345757.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-16T11-13-20.345757.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-16T11-13-20.345757.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-16T11-13-20.345757.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-16T11-13-20.345757.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-16T11-13-20.345757.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-16T11-13-20.345757.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-16T11-13-20.345757.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-09-16T11-13-20.345757.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-16T11-13-20.345757.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-16T11-13-20.345757.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-16T11-13-20.345757.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-09-16T11-13-20.345757.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-09-16T11-13-20.345757.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-16T11-13-20.345757.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-16T11-13-20.345757.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-16T11-13-20.345757.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-16T11-13-20.345757.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-16T11-13-20.345757.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-16T11-13-20.345757.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-16T11-13-20.345757.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-16T11-13-20.345757.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-16T11-13-20.345757.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-16T11-13-20.345757.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-16T11-13-20.345757.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-16T11-13-20.345757.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-16T11-13-20.345757.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-09-16T11-13-20.345757.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-16T11-13-20.345757.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-09-16T11-13-20.345757.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-16T11-13-20.345757.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-16T11-13-20.345757.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-16T11-13-20.345757.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-16T11-13-20.345757.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-16T11-13-20.345757.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-16T11-13-20.345757.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-16T11-13-20.345757.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-16T11-13-20.345757.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-16T11-13-20.345757.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-16T11-13-20.345757.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-16T11-13-20.345757.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-16T11-13-20.345757.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-16T11-13-20.345757.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-16T11-13-20.345757.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-16T11-13-20.345757.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-16T11-13-20.345757.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-16T11-13-20.345757.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-16T11-13-20.345757.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-16T11-13-20.345757.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-16T11-13-20.345757.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-16T11-13-20.345757.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-16T11-13-20.345757.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-16T11-13-20.345757.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-16T11-13-20.345757.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-16T11-13-20.345757.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-16T11-13-20.345757.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-16T11-13-20.345757.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-16T11-13-20.345757.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-16T11-13-20.345757.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-16T11-13-20.345757.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-16T11-13-20.345757.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-16T11-13-20.345757.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-16T11-13-20.345757.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-16T11-13-20.345757.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-16T11-13-20.345757.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-09-16T11-13-20.345757.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-16T11-13-20.345757.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-16T11-13-20.345757.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-16T11-13-20.345757.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-09-16T11-13-20.345757.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-09-16T11-13-20.345757.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-16T11-13-20.345757.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-16T11-13-20.345757.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-16T11-13-20.345757.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-16T11-13-20.345757.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-16T11-13-20.345757.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-16T11-13-20.345757.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-16T11-13-20.345757.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-16T11-13-20.345757.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-16T11-13-20.345757.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-16T11-13-20.345757.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-16T11-13-20.345757.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-16T11-13-20.345757.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-16T11-13-20.345757.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-09-16T11-13-20.345757.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-16T11-13-20.345757.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-09-16T11-13-20.345757.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-16T11-13-20.345757.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_09_16T11_13_20.345757
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-16T11-13-20.345757.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-16T11-13-20.345757.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_09_16T11_13_20.345757
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-16T11-13-20.345757.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-16T11-13-20.345757.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_09_16T11_13_20.345757
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-16T11-13-20.345757.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-16T11-13-20.345757.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_09_16T11_13_20.345757
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-16T11-13-20.345757.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-16T11-13-20.345757.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_09_16T11_13_20.345757
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-16T11-13-20.345757.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-16T11-13-20.345757.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_09_16T11_13_20.345757
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-16T11-13-20.345757.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-16T11-13-20.345757.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_09_16T11_13_20.345757
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-16T11-13-20.345757.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-16T11-13-20.345757.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_09_16T11_13_20.345757
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-16T11-13-20.345757.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-16T11-13-20.345757.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_09_16T11_13_20.345757
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-16T11-13-20.345757.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-16T11-13-20.345757.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_09_16T11_13_20.345757
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-16T11-13-20.345757.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-16T11-13-20.345757.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_09_16T11_13_20.345757
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-16T11-13-20.345757.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-16T11-13-20.345757.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_09_16T11_13_20.345757
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-16T11-13-20.345757.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-16T11-13-20.345757.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_09_16T11_13_20.345757
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-16T11-13-20.345757.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-16T11-13-20.345757.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_09_16T11_13_20.345757
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-16T11-13-20.345757.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-16T11-13-20.345757.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_09_16T11_13_20.345757
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-16T11-13-20.345757.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-16T11-13-20.345757.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_09_16T11_13_20.345757
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-16T11-13-20.345757.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-16T11-13-20.345757.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_09_16T11_13_20.345757
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-16T11-13-20.345757.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-16T11-13-20.345757.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_09_16T11_13_20.345757
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-16T11-13-20.345757.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-16T11-13-20.345757.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_09_16T11_13_20.345757
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-16T11-13-20.345757.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-16T11-13-20.345757.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_09_16T11_13_20.345757
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-16T11-13-20.345757.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-16T11-13-20.345757.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_09_16T11_13_20.345757
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-16T11-13-20.345757.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-16T11-13-20.345757.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_09_16T11_13_20.345757
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-16T11-13-20.345757.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-16T11-13-20.345757.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_09_16T11_13_20.345757
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-16T11-13-20.345757.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-16T11-13-20.345757.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_09_16T11_13_20.345757
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-16T11-13-20.345757.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-16T11-13-20.345757.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_09_16T11_13_20.345757
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-16T11-13-20.345757.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-16T11-13-20.345757.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_09_16T11_13_20.345757
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-16T11-13-20.345757.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-16T11-13-20.345757.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_09_16T11_13_20.345757
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-16T11-13-20.345757.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-16T11-13-20.345757.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_09_16T11_13_20.345757
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-16T11-13-20.345757.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-16T11-13-20.345757.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_09_16T11_13_20.345757
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-16T11-13-20.345757.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-16T11-13-20.345757.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_09_16T11_13_20.345757
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-16T11-13-20.345757.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-16T11-13-20.345757.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_09_16T11_13_20.345757
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-16T11-13-20.345757.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-16T11-13-20.345757.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_09_16T11_13_20.345757
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-16T11-13-20.345757.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-16T11-13-20.345757.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_09_16T11_13_20.345757
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-16T11-13-20.345757.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-16T11-13-20.345757.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_09_16T11_13_20.345757
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-16T11-13-20.345757.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-16T11-13-20.345757.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_09_16T11_13_20.345757
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-09-16T11-13-20.345757.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-09-16T11-13-20.345757.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_09_16T11_13_20.345757
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-16T11-13-20.345757.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-16T11-13-20.345757.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_09_16T11_13_20.345757
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-16T11-13-20.345757.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-16T11-13-20.345757.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_09_16T11_13_20.345757
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-16T11-13-20.345757.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-16T11-13-20.345757.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_09_16T11_13_20.345757
path:
- '**/details_harness|hendrycksTest-management|5_2023-09-16T11-13-20.345757.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-09-16T11-13-20.345757.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_09_16T11_13_20.345757
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-09-16T11-13-20.345757.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-09-16T11-13-20.345757.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_09_16T11_13_20.345757
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-16T11-13-20.345757.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-16T11-13-20.345757.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_09_16T11_13_20.345757
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-16T11-13-20.345757.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-16T11-13-20.345757.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_09_16T11_13_20.345757
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-16T11-13-20.345757.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-16T11-13-20.345757.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_09_16T11_13_20.345757
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-16T11-13-20.345757.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-16T11-13-20.345757.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_09_16T11_13_20.345757
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-16T11-13-20.345757.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-16T11-13-20.345757.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_09_16T11_13_20.345757
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-16T11-13-20.345757.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-16T11-13-20.345757.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_09_16T11_13_20.345757
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-16T11-13-20.345757.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-16T11-13-20.345757.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_09_16T11_13_20.345757
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-16T11-13-20.345757.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-16T11-13-20.345757.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_09_16T11_13_20.345757
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-16T11-13-20.345757.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-16T11-13-20.345757.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_09_16T11_13_20.345757
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-16T11-13-20.345757.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-16T11-13-20.345757.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_09_16T11_13_20.345757
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-16T11-13-20.345757.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-16T11-13-20.345757.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_09_16T11_13_20.345757
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-16T11-13-20.345757.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-16T11-13-20.345757.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_09_16T11_13_20.345757
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-16T11-13-20.345757.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-16T11-13-20.345757.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_09_16T11_13_20.345757
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-09-16T11-13-20.345757.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-09-16T11-13-20.345757.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_09_16T11_13_20.345757
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-16T11-13-20.345757.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-16T11-13-20.345757.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_09_16T11_13_20.345757
path:
- '**/details_harness|hendrycksTest-virology|5_2023-09-16T11-13-20.345757.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-09-16T11-13-20.345757.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_09_16T11_13_20.345757
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-16T11-13-20.345757.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-16T11-13-20.345757.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_09_16T11_13_20.345757
path:
- '**/details_harness|truthfulqa:mc|0_2023-09-16T11-13-20.345757.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-09-16T11-13-20.345757.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_10_23T22_40_41.083519
path:
- '**/details_harness|winogrande|5_2023-10-23T22-40-41.083519.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-10-23T22-40-41.083519.parquet'
- config_name: results
data_files:
- split: 2023_09_16T11_13_20.345757
path:
- results_2023-09-16T11-13-20.345757.parquet
- split: 2023_10_23T22_40_41.083519
path:
- results_2023-10-23T22-40-41.083519.parquet
- split: latest
path:
- results_2023-10-23T22-40-41.083519.parquet
---
# Dataset Card for Evaluation run of elliotthwang/Elliott-Chinese-LLaMa-GPTQ-V2.0
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/elliotthwang/Elliott-Chinese-LLaMa-GPTQ-V2.0
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [elliotthwang/Elliott-Chinese-LLaMa-GPTQ-V2.0](https://huggingface.co/elliotthwang/Elliott-Chinese-LLaMa-GPTQ-V2.0) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 runs. Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the most recent results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_elliotthwang__Elliott-Chinese-LLaMa-GPTQ-V2.0",
"harness_winogrande_5",
	split="latest")
```
## Latest results
These are the [latest results from run 2023-10-23T22:40:41.083519](https://huggingface.co/datasets/open-llm-leaderboard/details_elliotthwang__Elliott-Chinese-LLaMa-GPTQ-V2.0/blob/main/results_2023-10-23T22-40-41.083519.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"em": 0.10864093959731544,
"em_stderr": 0.0031868582704839116,
"f1": 0.15248846476509997,
"f1_stderr": 0.0032781428140160563,
"acc": 0.4430471890103538,
"acc_stderr": 0.011315335280186417
},
"harness|drop|3": {
"em": 0.10864093959731544,
"em_stderr": 0.0031868582704839116,
"f1": 0.15248846476509997,
"f1_stderr": 0.0032781428140160563
},
"harness|gsm8k|5": {
"acc": 0.1599696739954511,
"acc_stderr": 0.010097377827752538
},
"harness|winogrande|5": {
"acc": 0.7261247040252565,
"acc_stderr": 0.012533292732620296
}
}
```
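As a quick sanity check (an observation drawn from these numbers, not documented aggregation behavior): the aggregate `all.acc` above is the plain mean of the two per-task accuracies.

```python
# Per-task accuracies copied from the "Latest results" JSON above.
gsm8k_acc = 0.1599696739954511
winogrande_acc = 0.7261247040252565

# The aggregate appears to be an unweighted mean over the accuracy-based tasks.
all_acc = (gsm8k_acc + winogrande_acc) / 2
print(all_acc)  # ≈ 0.4430471890103538, matching "all.acc" above
```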
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
YaduvanshiAnkit/SanskritShloka2 | ---
license: mit
---
|
joey234/mmlu-logical_fallacies-original-neg | ---
dataset_info:
features:
- name: question
dtype: string
- name: choices
sequence: string
- name: answer
dtype:
class_label:
names:
'0': A
'1': B
'2': C
'3': D
splits:
- name: test
num_bytes: 10758.742331288344
num_examples: 35
download_size: 8694
dataset_size: 10758.742331288344
---
# Dataset Card for "mmlu-logical_fallacies-original-neg"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Tuteldove/wine_review | ---
dataset_info:
features:
- name: wine_id
dtype: int64
- name: country
dtype: string
- name: description
dtype: string
- name: designation
dtype: string
- name: points
dtype: int64
- name: price
dtype: float64
splits:
- name: train
num_bytes: 21093175.17523332
num_examples: 68918
- name: test
num_bytes: 5273446.824766681
num_examples: 17230
download_size: 15113749
dataset_size: 26366622.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
---
|
the-glendalorian/filtered-data | ---
license: mit
---
|
ovior/twitter_dataset_1713022318 | ---
dataset_info:
features:
- name: id
dtype: string
- name: tweet_content
dtype: string
- name: user_name
dtype: string
- name: user_id
dtype: string
- name: created_at
dtype: string
- name: url
dtype: string
- name: favourite_count
dtype: int64
- name: scraped_at
dtype: string
- name: image_urls
dtype: string
splits:
- name: train
num_bytes: 2405708
num_examples: 7322
download_size: 1363404
dataset_size: 2405708
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
inkoziev/arithmetic | ---
language:
- ru
language_creators:
- machine-generated
license:
- cc-by-nc-4.0
multilinguality:
- monolingual
pretty_name: arithmetic
source_datasets: []
tags: []
task_categories:
- question-answering
task_ids:
- closed-domain-qa
---
# Arithmetic problems for a dialogue system
The dataset contains samples with simple math tasks of roughly the following form:
```
- Фонарик Федора работает от 2 батареек, а фонарик Лехи от 6. Сколько батареек нужно фонарикам Федора и Лехи в сумме?
- 2+6=8, столько батареек потребуется.
- Теперь прибавь к результату 469, что получилось?
- 8 плюс 469 равно 477
- Подели на 53, что получилось?
- 9
```
Most of the tasks involve basic arithmetic operations. There is also a number of tasks
on finding the roots of a quadratic equation:
```
- Найди действительные корни квадратного уравнения a⋅x²+b⋅x+c для a=45, b=225, c=-270
- Тут два действительных корня -6 и 1
```
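The expected answer in that sample can be checked with the standard quadratic formula; a minimal sketch (not part of the dataset tooling):

```python
import math

def real_roots(a: float, b: float, c: float) -> list:
    """Real roots of a*x^2 + b*x + c = 0, assuming a != 0."""
    d = b * b - 4 * a * c  # discriminant
    if d < 0:
        return []  # no real roots
    s = math.sqrt(d)
    # A set deduplicates the double root when d == 0.
    return sorted({(-b - s) / (2 * a), (-b + s) / (2 * a)})

print(real_roots(45, 225, -270))  # → [-6.0, 1.0], as in the sample above
```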
There is also a growing set of tasks with the solution steps spelled out:
```
- В болотистых лесах проживает 8 сусликов. Охотник съедает по одному суслику каждые 9 дней. Сколько сусликов останется через 12 дней?
- За 12 дней охотник пообедает 1 раз. Поэтому останется 8-1=7 сусликов.
```
Some tasks are constructed to force the model to pay attention not merely to
the presence of numbers, but to the context in which they are used:
```
- Вика принесла в школу 5 мандаринов. Друзья попросили ее поделиться с ними мандаринами. Она отдала им 3 штуки. Сколько мандаринов Вика отдала?
- 3
```
Sometimes the numbers in a task are irrelevant to its substance, which should push the solving model even more strongly to take the context into account:
```
- Перемножив восемь и семь, учитель средней школы №77 получил 5084. Он верно посчитал?
- Учитель средней школы №77 ошибся, так как 8*7=56, а не 5084
```
## Data format
Each sample contains a list of linked utterances (without the "- " prefix) forming a chain of arithmetic tasks,
in which the statement of each new task requires analyzing at least the previous utterance.
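For illustration, the 2+2 example from this card, represented as a chain of utterances in Python (the list-of-strings layout is an assumption about the in-memory form, not the dataset's on-disk schema):

```python
# One sample: a chain of linked utterances with the "- " prefix stripped.
# Each new task in the chain builds on at least the previous utterance.
sample = [
    "Чему равно 2+2?",  # Q: "What is 2+2?"
    "2+2 равно 4",      # A: "2+2 equals 4"
]
```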
## Lexical variability of answers
For many tasks the answer is formulated not simply as a bare number; accompanying text is added to it:
```
- Чему равно 2+2?
- 2+2 равно 4
```
## Metrics of generative models
After fine-tuning (1 epoch, lr=1e-5) on 90% of the dataset, the following metrics are obtained on the test part:

| Model | Mean deviation of the numeric answer from the correct one | Share of correct answers |
|:--|--:|--:|
| sberbank-ai/rugpt3small_based_on_gpt2 | 8.03e+02% | 0.057 |
| sberbank-ai/rugpt3medium_based_on_gpt2 | 2.89e+02% | 0.085 |
| sberbank-ai/rugpt3large_based_on_gpt2 | 1.58e+02% | 0.131 |
| facebook/xglm-2.9B | 8.13e+02% | 0.224 |
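A sketch of how such metrics could be computed, assuming "mean deviation" is the mean absolute relative deviation of the predicted number from the correct one (in percent) and "share of correct answers" is exact-match accuracy — both assumptions, since the card does not spell out the formulas:

```python
def numeric_metrics(predicted: list[float], correct: list[float]) -> tuple[float, float]:
    """Mean relative deviation (%) and exact-match accuracy over numeric answers.
    Illustrative sketch; the exact metric definitions are an assumption."""
    deviations = [
        abs(p - g) / abs(g) * 100.0
        for p, g in zip(predicted, correct)
        if g != 0  # skip zero targets to avoid division by zero
    ]
    mean_dev = sum(deviations) / len(deviations)
    accuracy = sum(p == g for p, g in zip(predicted, correct)) / len(correct)
    return mean_dev, accuracy
```

For example, `numeric_metrics([8, 10], [8, 5])` yields a mean deviation of 50.0% and an accuracy of 0.5.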
## Sample generator
The dataset was produced with the template-based generation engine from this repository: [https://github.com/Koziev/math](https://github.com/Koziev/math).
## Using the dataset
The dataset is used to train a [chatbot](https://github.com/Koziev/chatbot).
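As a rough illustration of template-based generation (a sketch in the spirit of the engine above, not its actual code), a sample can be produced by filling a dialogue template with random operands:

```python
import random

def generate_addition_sample(rng: random.Random) -> list[str]:
    """Fill the flashlight addition template from this card with random operands.
    Illustrative sketch only; the real engine lives in Koziev/math."""
    a, b = rng.randint(2, 9), rng.randint(2, 9)
    question = (
        f"Фонарик Федора работает от {a} батареек, а фонарик Лехи от {b}. "
        f"Сколько батареек нужно фонарикам Федора и Лехи в сумме?"
    )
    answer = f"{a}+{b}={a + b}, столько батареек потребуется."
    return [question, answer]

sample = generate_addition_sample(random.Random(42))
```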
|
mii-llm/gazzetta-ufficiale | ---
language:
- it
task_categories:
- text-generation
- fill-mask
pretty_name: Gazzetta Ufficiale
tags:
- law
- legal
dataset_info:
features:
- name: type
dtype: string
- name: year
dtype: string
- name: rubrica
dtype: string
- name: emettitore
dtype: string
- name: intestazione
dtype: string
- name: url
dtype: string
- name: title
dtype: string
- name: note
dtype: string
- name: subtitle
dtype: string
- name: subsubtitle
dtype: string
- name: text
dtype: string
splits:
- name: train
num_bytes: 6568988362
num_examples: 1425315
download_size: 3200520052
dataset_size: 6568988362
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
license: mit
size_categories:
- 1M<n<10M
---
# Gazzetta Ufficiale 👩🏻⚖️⚖️🏛️📜🇮🇹

> The Gazzetta Ufficiale della Repubblica Italiana, the official source of knowledge of the laws in force in Italy and an instrument for the dissemination, publication and officialization of legislative texts and public and private acts, is published by the Istituto Poligrafico e Zecca dello Stato in collaboration with the Ministero della Giustizia, which is responsible for its direction and editing. The Istituto Poligrafico e Zecca dello Stato S.p.A. promotes the widest possible accessibility of the Gazzetta Ufficiale della Repubblica Italiana in digital format. Note that the only definitive text is the one published in the printed Gazzetta Ufficiale, which prevails in case of discrepancy. Reproduction of the texts provided in electronic format is permitted provided that the source, their non-authentic nature and their free availability are mentioned.
## TL;DR
*A dataset of Italian legislative texts and public and private acts.*
## Sections
- **Parte Prima - Serie Generale (Part One - General Series)**
This publication contains all the normative and administrative acts issued by the central and peripheral administrations of the State.
- **Corte Costituzionale (1st Special Series)**
This G.U. series reports the decisions of the Corte Costituzionale (judgments and orders) as well as the referral acts submitted for the Court's judgment (appeals, orders).
- **Regions (3rd Special Series)**
This publication contains all the normative and administrative acts of national interest issued by the individual Regions.
- **Competitions (4th Special Series)**
This publication gives public notice of the competitive examinations announced by the central and peripheral administrations of the State, and of all notices related to carrying them out.
- **Public contracts (5th Special Series)**
This publication, established in 2007, serves to give public notice of the tender procedures of the public administration.
## Cite this dataset
```
@online{gazzetta,
author = {Federici, Edoardo and Ferraretto, Mattia and Landro, Nicola},
title = {{Gazzetta Ufficiale}: A Dataset of Legislative Texts, Public and Private Acts},
year = {2024},
url = {https://huggingface.co/datasets/mii-llm/gazzetta-ufficiale},
}
``` |
TKNodven/Mordredvoice2 | ---
license: openrail
---
|
valerievloef/Thesis_BERT | ---
license: apache-2.0
---
|
hanho/test | ---
license: openrail
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
dataset_info:
features:
- name: pokemon
dtype: string
- name: type
dtype: string
splits:
- name: train
num_bytes: 43
num_examples: 2
download_size: 0
dataset_size: 43
---
|
smallv0221/dd | ---
license: apache-2.0
---
|
multi-train/triviaqa-train-multikilt_1107 | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
dataset_info:
features:
- name: query
dtype: string
- name: pos
sequence: string
- name: neg
sequence: string
- name: task
dtype: string
- name: instruction
struct:
- name: query
dtype: string
- name: pos
dtype: string
- name: neg
dtype: string
splits:
- name: train
num_bytes: 67898183
num_examples: 52886
download_size: 39123463
dataset_size: 67898183
---
# Dataset Card for "triviaqa-train-multikilt_1107"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
mask-distilled-onesec-cv12-each-chunk-uniq/chunk_234 | ---
dataset_info:
features:
- name: logits
sequence: float32
- name: mfcc
sequence:
sequence: float64
splits:
- name: train
num_bytes: 1136417284.0
num_examples: 223177
download_size: 1161610239
dataset_size: 1136417284.0
---
# Dataset Card for "chunk_234"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
whu9/ag_newskeywords_lem | ---
dataset_info:
features:
- name: keyword
dtype: string
- name: score
dtype: float64
- name: keyword_lem
dtype: string
splits:
- name: train
num_bytes: 50741
num_examples: 1830
download_size: 47664
dataset_size: 50741
---
# Dataset Card for "ag_newskeywords_lem"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
dmayhem93/top-n-reddit-corpus-55-cleaned | ---
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 4750853557
num_examples: 6141002
download_size: 2785435080
dataset_size: 4750853557
---
# Dataset Card for "top-n-reddit-corpus-55-cleaned"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
icewiny/blurred_image_coyo_1M | ---
license: mit
dataset_info:
features:
- name: image
dtype: image
- name: blurred_img
dtype: image
- name: caption
dtype: string
splits:
- name: train
num_bytes: 956490063.0
num_examples: 6000
download_size: 952136439
dataset_size: 956490063.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
open-llm-leaderboard/details_ozayezerceli__TinyLlamax2-1.1b | ---
pretty_name: Evaluation run of ozayezerceli/TinyLlamax2-1.1b
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [ozayezerceli/TinyLlamax2-1.1b](https://huggingface.co/ozayezerceli/TinyLlamax2-1.1b)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_ozayezerceli__TinyLlamax2-1.1b\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-01-25T22:12:18.361507](https://huggingface.co/datasets/open-llm-leaderboard/details_ozayezerceli__TinyLlamax2-1.1b/blob/main/results_2024-01-25T22-12-18.361507.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.265691244274486,\n\
\ \"acc_stderr\": 0.031066770980303738,\n \"acc_norm\": 0.26755149869038447,\n\
\ \"acc_norm_stderr\": 0.031835502327294145,\n \"mc1\": 0.21909424724602203,\n\
\ \"mc1_stderr\": 0.014480038578757447,\n \"mc2\": 0.3732177557725045,\n\
\ \"mc2_stderr\": 0.013798981933202878\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.3046075085324232,\n \"acc_stderr\": 0.01344952210993249,\n\
\ \"acc_norm\": 0.3387372013651877,\n \"acc_norm_stderr\": 0.01383056892797433\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.4493128858793069,\n\
\ \"acc_stderr\": 0.00496407587012034,\n \"acc_norm\": 0.6030671181039634,\n\
\ \"acc_norm_stderr\": 0.004882619484166595\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.23,\n \"acc_stderr\": 0.04229525846816503,\n \
\ \"acc_norm\": 0.23,\n \"acc_norm_stderr\": 0.04229525846816503\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.11851851851851852,\n\
\ \"acc_stderr\": 0.027922050250639055,\n \"acc_norm\": 0.11851851851851852,\n\
\ \"acc_norm_stderr\": 0.027922050250639055\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.15789473684210525,\n \"acc_stderr\": 0.029674167520101456,\n\
\ \"acc_norm\": 0.15789473684210525,\n \"acc_norm_stderr\": 0.029674167520101456\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.27,\n\
\ \"acc_stderr\": 0.0446196043338474,\n \"acc_norm\": 0.27,\n \
\ \"acc_norm_stderr\": 0.0446196043338474\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.24528301886792453,\n \"acc_stderr\": 0.02648035717989569,\n\
\ \"acc_norm\": 0.24528301886792453,\n \"acc_norm_stderr\": 0.02648035717989569\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.2569444444444444,\n\
\ \"acc_stderr\": 0.03653946969442099,\n \"acc_norm\": 0.2569444444444444,\n\
\ \"acc_norm_stderr\": 0.03653946969442099\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.36,\n \"acc_stderr\": 0.048241815132442176,\n \
\ \"acc_norm\": 0.36,\n \"acc_norm_stderr\": 0.048241815132442176\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"\
acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\"\
: 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.36,\n \"acc_stderr\": 0.04824181513244218,\n \
\ \"acc_norm\": 0.36,\n \"acc_norm_stderr\": 0.04824181513244218\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.2543352601156069,\n\
\ \"acc_stderr\": 0.0332055644308557,\n \"acc_norm\": 0.2543352601156069,\n\
\ \"acc_norm_stderr\": 0.0332055644308557\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.19607843137254902,\n \"acc_stderr\": 0.03950581861179961,\n\
\ \"acc_norm\": 0.19607843137254902,\n \"acc_norm_stderr\": 0.03950581861179961\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.25,\n \"acc_stderr\": 0.04351941398892446,\n \"acc_norm\": 0.25,\n\
\ \"acc_norm_stderr\": 0.04351941398892446\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.2978723404255319,\n \"acc_stderr\": 0.02989614568209546,\n\
\ \"acc_norm\": 0.2978723404255319,\n \"acc_norm_stderr\": 0.02989614568209546\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.18421052631578946,\n\
\ \"acc_stderr\": 0.03646758875075566,\n \"acc_norm\": 0.18421052631578946,\n\
\ \"acc_norm_stderr\": 0.03646758875075566\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.23448275862068965,\n \"acc_stderr\": 0.035306258743465914,\n\
\ \"acc_norm\": 0.23448275862068965,\n \"acc_norm_stderr\": 0.035306258743465914\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.2671957671957672,\n \"acc_stderr\": 0.022789673145776578,\n \"\
acc_norm\": 0.2671957671957672,\n \"acc_norm_stderr\": 0.022789673145776578\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.30952380952380953,\n\
\ \"acc_stderr\": 0.04134913018303316,\n \"acc_norm\": 0.30952380952380953,\n\
\ \"acc_norm_stderr\": 0.04134913018303316\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \
\ \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.22258064516129034,\n\
\ \"acc_stderr\": 0.023664216671642518,\n \"acc_norm\": 0.22258064516129034,\n\
\ \"acc_norm_stderr\": 0.023664216671642518\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.26108374384236455,\n \"acc_stderr\": 0.030903796952114485,\n\
\ \"acc_norm\": 0.26108374384236455,\n \"acc_norm_stderr\": 0.030903796952114485\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.22,\n \"acc_stderr\": 0.041633319989322695,\n \"acc_norm\"\
: 0.22,\n \"acc_norm_stderr\": 0.041633319989322695\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.26666666666666666,\n \"acc_stderr\": 0.03453131801885415,\n\
\ \"acc_norm\": 0.26666666666666666,\n \"acc_norm_stderr\": 0.03453131801885415\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.22727272727272727,\n \"acc_stderr\": 0.0298575156733864,\n \"\
acc_norm\": 0.22727272727272727,\n \"acc_norm_stderr\": 0.0298575156733864\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.23834196891191708,\n \"acc_stderr\": 0.03074890536390988,\n\
\ \"acc_norm\": 0.23834196891191708,\n \"acc_norm_stderr\": 0.03074890536390988\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.27692307692307694,\n \"acc_stderr\": 0.022688042352424994,\n\
\ \"acc_norm\": 0.27692307692307694,\n \"acc_norm_stderr\": 0.022688042352424994\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.26296296296296295,\n \"acc_stderr\": 0.02684205787383371,\n \
\ \"acc_norm\": 0.26296296296296295,\n \"acc_norm_stderr\": 0.02684205787383371\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.2605042016806723,\n \"acc_stderr\": 0.028510251512341933,\n\
\ \"acc_norm\": 0.2605042016806723,\n \"acc_norm_stderr\": 0.028510251512341933\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.26490066225165565,\n \"acc_stderr\": 0.03603038545360384,\n \"\
acc_norm\": 0.26490066225165565,\n \"acc_norm_stderr\": 0.03603038545360384\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.24220183486238533,\n \"acc_stderr\": 0.018368176306598618,\n \"\
acc_norm\": 0.24220183486238533,\n \"acc_norm_stderr\": 0.018368176306598618\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.4583333333333333,\n \"acc_stderr\": 0.033981108902946366,\n \"\
acc_norm\": 0.4583333333333333,\n \"acc_norm_stderr\": 0.033981108902946366\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.23529411764705882,\n \"acc_stderr\": 0.02977177522814565,\n \"\
acc_norm\": 0.23529411764705882,\n \"acc_norm_stderr\": 0.02977177522814565\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.22784810126582278,\n \"acc_stderr\": 0.027303484599069422,\n \
\ \"acc_norm\": 0.22784810126582278,\n \"acc_norm_stderr\": 0.027303484599069422\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.35874439461883406,\n\
\ \"acc_stderr\": 0.032190792004199956,\n \"acc_norm\": 0.35874439461883406,\n\
\ \"acc_norm_stderr\": 0.032190792004199956\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.24427480916030533,\n \"acc_stderr\": 0.03768335959728745,\n\
\ \"acc_norm\": 0.24427480916030533,\n \"acc_norm_stderr\": 0.03768335959728745\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.2644628099173554,\n \"acc_stderr\": 0.04026187527591204,\n \"\
acc_norm\": 0.2644628099173554,\n \"acc_norm_stderr\": 0.04026187527591204\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.2777777777777778,\n\
\ \"acc_stderr\": 0.043300437496507437,\n \"acc_norm\": 0.2777777777777778,\n\
\ \"acc_norm_stderr\": 0.043300437496507437\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.25153374233128833,\n \"acc_stderr\": 0.034089978868575295,\n\
\ \"acc_norm\": 0.25153374233128833,\n \"acc_norm_stderr\": 0.034089978868575295\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.2767857142857143,\n\
\ \"acc_stderr\": 0.042466243366976256,\n \"acc_norm\": 0.2767857142857143,\n\
\ \"acc_norm_stderr\": 0.042466243366976256\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.2621359223300971,\n \"acc_stderr\": 0.04354631077260597,\n\
\ \"acc_norm\": 0.2621359223300971,\n \"acc_norm_stderr\": 0.04354631077260597\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.2606837606837607,\n\
\ \"acc_stderr\": 0.028760348956523414,\n \"acc_norm\": 0.2606837606837607,\n\
\ \"acc_norm_stderr\": 0.028760348956523414\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.28,\n \"acc_stderr\": 0.04512608598542127,\n \
\ \"acc_norm\": 0.28,\n \"acc_norm_stderr\": 0.04512608598542127\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.26053639846743293,\n\
\ \"acc_stderr\": 0.015696008563807096,\n \"acc_norm\": 0.26053639846743293,\n\
\ \"acc_norm_stderr\": 0.015696008563807096\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.22254335260115607,\n \"acc_stderr\": 0.02239421566194282,\n\
\ \"acc_norm\": 0.22254335260115607,\n \"acc_norm_stderr\": 0.02239421566194282\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.2346368715083799,\n\
\ \"acc_stderr\": 0.014173044098303654,\n \"acc_norm\": 0.2346368715083799,\n\
\ \"acc_norm_stderr\": 0.014173044098303654\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.2549019607843137,\n \"acc_stderr\": 0.024954184324879912,\n\
\ \"acc_norm\": 0.2549019607843137,\n \"acc_norm_stderr\": 0.024954184324879912\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.2797427652733119,\n\
\ \"acc_stderr\": 0.02549425935069491,\n \"acc_norm\": 0.2797427652733119,\n\
\ \"acc_norm_stderr\": 0.02549425935069491\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.2623456790123457,\n \"acc_stderr\": 0.02447722285613511,\n\
\ \"acc_norm\": 0.2623456790123457,\n \"acc_norm_stderr\": 0.02447722285613511\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.22340425531914893,\n \"acc_stderr\": 0.02484792135806396,\n \
\ \"acc_norm\": 0.22340425531914893,\n \"acc_norm_stderr\": 0.02484792135806396\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.2242503259452412,\n\
\ \"acc_stderr\": 0.010652615824906172,\n \"acc_norm\": 0.2242503259452412,\n\
\ \"acc_norm_stderr\": 0.010652615824906172\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.36764705882352944,\n \"acc_stderr\": 0.029289413409403196,\n\
\ \"acc_norm\": 0.36764705882352944,\n \"acc_norm_stderr\": 0.029289413409403196\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.26143790849673204,\n \"acc_stderr\": 0.017776947157528044,\n \
\ \"acc_norm\": 0.26143790849673204,\n \"acc_norm_stderr\": 0.017776947157528044\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.3090909090909091,\n\
\ \"acc_stderr\": 0.044262946482000985,\n \"acc_norm\": 0.3090909090909091,\n\
\ \"acc_norm_stderr\": 0.044262946482000985\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.14285714285714285,\n \"acc_stderr\": 0.022401787435256386,\n\
\ \"acc_norm\": 0.14285714285714285,\n \"acc_norm_stderr\": 0.022401787435256386\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.24875621890547264,\n\
\ \"acc_stderr\": 0.030567675938916718,\n \"acc_norm\": 0.24875621890547264,\n\
\ \"acc_norm_stderr\": 0.030567675938916718\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.21,\n \"acc_stderr\": 0.040936018074033256,\n \
\ \"acc_norm\": 0.21,\n \"acc_norm_stderr\": 0.040936018074033256\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.3072289156626506,\n\
\ \"acc_stderr\": 0.035915667978246635,\n \"acc_norm\": 0.3072289156626506,\n\
\ \"acc_norm_stderr\": 0.035915667978246635\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.2807017543859649,\n \"acc_stderr\": 0.03446296217088426,\n\
\ \"acc_norm\": 0.2807017543859649,\n \"acc_norm_stderr\": 0.03446296217088426\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.21909424724602203,\n\
\ \"mc1_stderr\": 0.014480038578757447,\n \"mc2\": 0.3732177557725045,\n\
\ \"mc2_stderr\": 0.013798981933202878\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.5951065509076559,\n \"acc_stderr\": 0.013795927003124934\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.014404852160727824,\n \
\ \"acc_stderr\": 0.0032820559171369596\n }\n}\n```"
repo_url: https://huggingface.co/ozayezerceli/TinyLlamax2-1.1b
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_01_25T22_12_18.361507
path:
- '**/details_harness|arc:challenge|25_2024-01-25T22-12-18.361507.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-01-25T22-12-18.361507.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_01_25T22_12_18.361507
path:
- '**/details_harness|gsm8k|5_2024-01-25T22-12-18.361507.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-01-25T22-12-18.361507.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_01_25T22_12_18.361507
path:
- '**/details_harness|hellaswag|10_2024-01-25T22-12-18.361507.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-01-25T22-12-18.361507.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_01_25T22_12_18.361507
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-25T22-12-18.361507.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-25T22-12-18.361507.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-25T22-12-18.361507.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-25T22-12-18.361507.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-25T22-12-18.361507.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-25T22-12-18.361507.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-25T22-12-18.361507.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-25T22-12-18.361507.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-25T22-12-18.361507.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-25T22-12-18.361507.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-25T22-12-18.361507.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-25T22-12-18.361507.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-25T22-12-18.361507.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-25T22-12-18.361507.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-25T22-12-18.361507.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-25T22-12-18.361507.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-25T22-12-18.361507.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-25T22-12-18.361507.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-25T22-12-18.361507.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-25T22-12-18.361507.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-25T22-12-18.361507.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-25T22-12-18.361507.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-25T22-12-18.361507.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-25T22-12-18.361507.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-25T22-12-18.361507.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-25T22-12-18.361507.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-25T22-12-18.361507.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-25T22-12-18.361507.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-25T22-12-18.361507.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-25T22-12-18.361507.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-25T22-12-18.361507.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-25T22-12-18.361507.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-25T22-12-18.361507.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-25T22-12-18.361507.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-01-25T22-12-18.361507.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-25T22-12-18.361507.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-25T22-12-18.361507.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-25T22-12-18.361507.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-01-25T22-12-18.361507.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-01-25T22-12-18.361507.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-25T22-12-18.361507.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-25T22-12-18.361507.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-25T22-12-18.361507.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-25T22-12-18.361507.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-25T22-12-18.361507.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-25T22-12-18.361507.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-25T22-12-18.361507.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-25T22-12-18.361507.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-25T22-12-18.361507.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-25T22-12-18.361507.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-25T22-12-18.361507.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-25T22-12-18.361507.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-25T22-12-18.361507.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-01-25T22-12-18.361507.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-25T22-12-18.361507.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-01-25T22-12-18.361507.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-25T22-12-18.361507.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-25T22-12-18.361507.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-25T22-12-18.361507.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-25T22-12-18.361507.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-25T22-12-18.361507.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-25T22-12-18.361507.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-25T22-12-18.361507.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-25T22-12-18.361507.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-25T22-12-18.361507.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-25T22-12-18.361507.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-25T22-12-18.361507.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-25T22-12-18.361507.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-25T22-12-18.361507.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-25T22-12-18.361507.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-25T22-12-18.361507.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-25T22-12-18.361507.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-25T22-12-18.361507.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-25T22-12-18.361507.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-25T22-12-18.361507.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-25T22-12-18.361507.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-25T22-12-18.361507.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-25T22-12-18.361507.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-25T22-12-18.361507.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-25T22-12-18.361507.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-25T22-12-18.361507.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-25T22-12-18.361507.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-25T22-12-18.361507.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-25T22-12-18.361507.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-25T22-12-18.361507.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-25T22-12-18.361507.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-25T22-12-18.361507.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-25T22-12-18.361507.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-25T22-12-18.361507.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-25T22-12-18.361507.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-25T22-12-18.361507.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-01-25T22-12-18.361507.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-25T22-12-18.361507.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-25T22-12-18.361507.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-25T22-12-18.361507.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-01-25T22-12-18.361507.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-01-25T22-12-18.361507.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-25T22-12-18.361507.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-25T22-12-18.361507.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-25T22-12-18.361507.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-25T22-12-18.361507.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-25T22-12-18.361507.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-25T22-12-18.361507.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-25T22-12-18.361507.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-25T22-12-18.361507.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-25T22-12-18.361507.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-25T22-12-18.361507.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-25T22-12-18.361507.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-25T22-12-18.361507.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-25T22-12-18.361507.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-01-25T22-12-18.361507.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-25T22-12-18.361507.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-01-25T22-12-18.361507.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-25T22-12-18.361507.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_01_25T22_12_18.361507
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-25T22-12-18.361507.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-25T22-12-18.361507.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_01_25T22_12_18.361507
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-25T22-12-18.361507.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-25T22-12-18.361507.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_01_25T22_12_18.361507
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-25T22-12-18.361507.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-25T22-12-18.361507.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_01_25T22_12_18.361507
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-25T22-12-18.361507.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-25T22-12-18.361507.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_01_25T22_12_18.361507
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-25T22-12-18.361507.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-25T22-12-18.361507.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_01_25T22_12_18.361507
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-25T22-12-18.361507.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-25T22-12-18.361507.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_01_25T22_12_18.361507
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-25T22-12-18.361507.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-25T22-12-18.361507.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_01_25T22_12_18.361507
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-25T22-12-18.361507.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-25T22-12-18.361507.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_01_25T22_12_18.361507
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-25T22-12-18.361507.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-25T22-12-18.361507.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_01_25T22_12_18.361507
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-25T22-12-18.361507.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-25T22-12-18.361507.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_01_25T22_12_18.361507
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-25T22-12-18.361507.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-25T22-12-18.361507.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_01_25T22_12_18.361507
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-25T22-12-18.361507.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-25T22-12-18.361507.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_01_25T22_12_18.361507
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-25T22-12-18.361507.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-25T22-12-18.361507.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_01_25T22_12_18.361507
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-25T22-12-18.361507.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-25T22-12-18.361507.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_01_25T22_12_18.361507
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-25T22-12-18.361507.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-25T22-12-18.361507.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_01_25T22_12_18.361507
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-25T22-12-18.361507.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-25T22-12-18.361507.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_01_25T22_12_18.361507
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-25T22-12-18.361507.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-25T22-12-18.361507.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_01_25T22_12_18.361507
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-25T22-12-18.361507.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-25T22-12-18.361507.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_01_25T22_12_18.361507
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-25T22-12-18.361507.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-25T22-12-18.361507.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_01_25T22_12_18.361507
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-25T22-12-18.361507.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-25T22-12-18.361507.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_01_25T22_12_18.361507
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-25T22-12-18.361507.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-25T22-12-18.361507.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_01_25T22_12_18.361507
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-25T22-12-18.361507.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-25T22-12-18.361507.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_01_25T22_12_18.361507
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-25T22-12-18.361507.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-25T22-12-18.361507.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_01_25T22_12_18.361507
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-25T22-12-18.361507.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-25T22-12-18.361507.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_01_25T22_12_18.361507
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-25T22-12-18.361507.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-25T22-12-18.361507.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_01_25T22_12_18.361507
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-25T22-12-18.361507.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-25T22-12-18.361507.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_01_25T22_12_18.361507
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-25T22-12-18.361507.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-25T22-12-18.361507.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_01_25T22_12_18.361507
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-25T22-12-18.361507.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-25T22-12-18.361507.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_01_25T22_12_18.361507
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-25T22-12-18.361507.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-25T22-12-18.361507.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_01_25T22_12_18.361507
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-25T22-12-18.361507.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-25T22-12-18.361507.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_01_25T22_12_18.361507
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-25T22-12-18.361507.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-25T22-12-18.361507.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_01_25T22_12_18.361507
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-25T22-12-18.361507.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-25T22-12-18.361507.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_01_25T22_12_18.361507
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-25T22-12-18.361507.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-25T22-12-18.361507.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_01_25T22_12_18.361507
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-25T22-12-18.361507.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-25T22-12-18.361507.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_01_25T22_12_18.361507
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-01-25T22-12-18.361507.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-01-25T22-12-18.361507.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_01_25T22_12_18.361507
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-25T22-12-18.361507.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-25T22-12-18.361507.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_01_25T22_12_18.361507
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-25T22-12-18.361507.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-25T22-12-18.361507.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_01_25T22_12_18.361507
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-25T22-12-18.361507.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-25T22-12-18.361507.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_01_25T22_12_18.361507
path:
- '**/details_harness|hendrycksTest-management|5_2024-01-25T22-12-18.361507.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-01-25T22-12-18.361507.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_01_25T22_12_18.361507
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-01-25T22-12-18.361507.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-01-25T22-12-18.361507.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_01_25T22_12_18.361507
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-25T22-12-18.361507.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-25T22-12-18.361507.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_01_25T22_12_18.361507
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-25T22-12-18.361507.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-25T22-12-18.361507.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_01_25T22_12_18.361507
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-25T22-12-18.361507.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-25T22-12-18.361507.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_01_25T22_12_18.361507
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-25T22-12-18.361507.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-25T22-12-18.361507.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_01_25T22_12_18.361507
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-25T22-12-18.361507.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-25T22-12-18.361507.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_01_25T22_12_18.361507
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-25T22-12-18.361507.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-25T22-12-18.361507.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_01_25T22_12_18.361507
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-25T22-12-18.361507.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-25T22-12-18.361507.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_01_25T22_12_18.361507
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-25T22-12-18.361507.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-25T22-12-18.361507.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_01_25T22_12_18.361507
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-25T22-12-18.361507.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-25T22-12-18.361507.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_01_25T22_12_18.361507
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-25T22-12-18.361507.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-25T22-12-18.361507.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_01_25T22_12_18.361507
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-25T22-12-18.361507.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-25T22-12-18.361507.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_01_25T22_12_18.361507
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-25T22-12-18.361507.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-25T22-12-18.361507.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_01_25T22_12_18.361507
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-25T22-12-18.361507.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-25T22-12-18.361507.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_01_25T22_12_18.361507
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-01-25T22-12-18.361507.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-01-25T22-12-18.361507.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_01_25T22_12_18.361507
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-25T22-12-18.361507.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-25T22-12-18.361507.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_01_25T22_12_18.361507
path:
- '**/details_harness|hendrycksTest-virology|5_2024-01-25T22-12-18.361507.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-01-25T22-12-18.361507.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_01_25T22_12_18.361507
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-25T22-12-18.361507.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-25T22-12-18.361507.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_01_25T22_12_18.361507
path:
- '**/details_harness|truthfulqa:mc|0_2024-01-25T22-12-18.361507.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-01-25T22-12-18.361507.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_01_25T22_12_18.361507
path:
- '**/details_harness|winogrande|5_2024-01-25T22-12-18.361507.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-01-25T22-12-18.361507.parquet'
- config_name: results
data_files:
- split: 2024_01_25T22_12_18.361507
path:
- results_2024-01-25T22-12-18.361507.parquet
- split: latest
path:
- results_2024-01-25T22-12-18.361507.parquet
---
# Dataset Card for Evaluation run of ozayezerceli/TinyLlamax2-1.1b
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [ozayezerceli/TinyLlamax2-1.1b](https://huggingface.co/ozayezerceli/TinyLlamax2-1.1b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_ozayezerceli__TinyLlamax2-1.1b",
"harness_winogrande_5",
	split="latest")
```
## Latest results
These are the [latest results from run 2024-01-25T22:12:18.361507](https://huggingface.co/datasets/open-llm-leaderboard/details_ozayezerceli__TinyLlamax2-1.1b/blob/main/results_2024-01-25T22-12-18.361507.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.265691244274486,
"acc_stderr": 0.031066770980303738,
"acc_norm": 0.26755149869038447,
"acc_norm_stderr": 0.031835502327294145,
"mc1": 0.21909424724602203,
"mc1_stderr": 0.014480038578757447,
"mc2": 0.3732177557725045,
"mc2_stderr": 0.013798981933202878
},
"harness|arc:challenge|25": {
"acc": 0.3046075085324232,
"acc_stderr": 0.01344952210993249,
"acc_norm": 0.3387372013651877,
"acc_norm_stderr": 0.01383056892797433
},
"harness|hellaswag|10": {
"acc": 0.4493128858793069,
"acc_stderr": 0.00496407587012034,
"acc_norm": 0.6030671181039634,
"acc_norm_stderr": 0.004882619484166595
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.23,
"acc_stderr": 0.04229525846816503,
"acc_norm": 0.23,
"acc_norm_stderr": 0.04229525846816503
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.11851851851851852,
"acc_stderr": 0.027922050250639055,
"acc_norm": 0.11851851851851852,
"acc_norm_stderr": 0.027922050250639055
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.15789473684210525,
"acc_stderr": 0.029674167520101456,
"acc_norm": 0.15789473684210525,
"acc_norm_stderr": 0.029674167520101456
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.27,
"acc_stderr": 0.0446196043338474,
"acc_norm": 0.27,
"acc_norm_stderr": 0.0446196043338474
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.24528301886792453,
"acc_stderr": 0.02648035717989569,
"acc_norm": 0.24528301886792453,
"acc_norm_stderr": 0.02648035717989569
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.2569444444444444,
"acc_stderr": 0.03653946969442099,
"acc_norm": 0.2569444444444444,
"acc_norm_stderr": 0.03653946969442099
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.36,
"acc_stderr": 0.048241815132442176,
"acc_norm": 0.36,
"acc_norm_stderr": 0.048241815132442176
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.36,
"acc_stderr": 0.04824181513244218,
"acc_norm": 0.36,
"acc_norm_stderr": 0.04824181513244218
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.2543352601156069,
"acc_stderr": 0.0332055644308557,
"acc_norm": 0.2543352601156069,
"acc_norm_stderr": 0.0332055644308557
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.19607843137254902,
"acc_stderr": 0.03950581861179961,
"acc_norm": 0.19607843137254902,
"acc_norm_stderr": 0.03950581861179961
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.25,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.25,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.2978723404255319,
"acc_stderr": 0.02989614568209546,
"acc_norm": 0.2978723404255319,
"acc_norm_stderr": 0.02989614568209546
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.18421052631578946,
"acc_stderr": 0.03646758875075566,
"acc_norm": 0.18421052631578946,
"acc_norm_stderr": 0.03646758875075566
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.23448275862068965,
"acc_stderr": 0.035306258743465914,
"acc_norm": 0.23448275862068965,
"acc_norm_stderr": 0.035306258743465914
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.2671957671957672,
"acc_stderr": 0.022789673145776578,
"acc_norm": 0.2671957671957672,
"acc_norm_stderr": 0.022789673145776578
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.30952380952380953,
"acc_stderr": 0.04134913018303316,
"acc_norm": 0.30952380952380953,
"acc_norm_stderr": 0.04134913018303316
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.22258064516129034,
"acc_stderr": 0.023664216671642518,
"acc_norm": 0.22258064516129034,
"acc_norm_stderr": 0.023664216671642518
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.26108374384236455,
"acc_stderr": 0.030903796952114485,
"acc_norm": 0.26108374384236455,
"acc_norm_stderr": 0.030903796952114485
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.22,
"acc_stderr": 0.041633319989322695,
"acc_norm": 0.22,
"acc_norm_stderr": 0.041633319989322695
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.26666666666666666,
"acc_stderr": 0.03453131801885415,
"acc_norm": 0.26666666666666666,
"acc_norm_stderr": 0.03453131801885415
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.22727272727272727,
"acc_stderr": 0.0298575156733864,
"acc_norm": 0.22727272727272727,
"acc_norm_stderr": 0.0298575156733864
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.23834196891191708,
"acc_stderr": 0.03074890536390988,
"acc_norm": 0.23834196891191708,
"acc_norm_stderr": 0.03074890536390988
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.27692307692307694,
"acc_stderr": 0.022688042352424994,
"acc_norm": 0.27692307692307694,
"acc_norm_stderr": 0.022688042352424994
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.26296296296296295,
"acc_stderr": 0.02684205787383371,
"acc_norm": 0.26296296296296295,
"acc_norm_stderr": 0.02684205787383371
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.2605042016806723,
"acc_stderr": 0.028510251512341933,
"acc_norm": 0.2605042016806723,
"acc_norm_stderr": 0.028510251512341933
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.26490066225165565,
"acc_stderr": 0.03603038545360384,
"acc_norm": 0.26490066225165565,
"acc_norm_stderr": 0.03603038545360384
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.24220183486238533,
"acc_stderr": 0.018368176306598618,
"acc_norm": 0.24220183486238533,
"acc_norm_stderr": 0.018368176306598618
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.4583333333333333,
"acc_stderr": 0.033981108902946366,
"acc_norm": 0.4583333333333333,
"acc_norm_stderr": 0.033981108902946366
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.23529411764705882,
"acc_stderr": 0.02977177522814565,
"acc_norm": 0.23529411764705882,
"acc_norm_stderr": 0.02977177522814565
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.22784810126582278,
"acc_stderr": 0.027303484599069422,
"acc_norm": 0.22784810126582278,
"acc_norm_stderr": 0.027303484599069422
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.35874439461883406,
"acc_stderr": 0.032190792004199956,
"acc_norm": 0.35874439461883406,
"acc_norm_stderr": 0.032190792004199956
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.24427480916030533,
"acc_stderr": 0.03768335959728745,
"acc_norm": 0.24427480916030533,
"acc_norm_stderr": 0.03768335959728745
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.2644628099173554,
"acc_stderr": 0.04026187527591204,
"acc_norm": 0.2644628099173554,
"acc_norm_stderr": 0.04026187527591204
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.2777777777777778,
"acc_stderr": 0.043300437496507437,
"acc_norm": 0.2777777777777778,
"acc_norm_stderr": 0.043300437496507437
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.25153374233128833,
"acc_stderr": 0.034089978868575295,
"acc_norm": 0.25153374233128833,
"acc_norm_stderr": 0.034089978868575295
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.2767857142857143,
"acc_stderr": 0.042466243366976256,
"acc_norm": 0.2767857142857143,
"acc_norm_stderr": 0.042466243366976256
},
"harness|hendrycksTest-management|5": {
"acc": 0.2621359223300971,
"acc_stderr": 0.04354631077260597,
"acc_norm": 0.2621359223300971,
"acc_norm_stderr": 0.04354631077260597
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.2606837606837607,
"acc_stderr": 0.028760348956523414,
"acc_norm": 0.2606837606837607,
"acc_norm_stderr": 0.028760348956523414
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.28,
"acc_stderr": 0.04512608598542127,
"acc_norm": 0.28,
"acc_norm_stderr": 0.04512608598542127
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.26053639846743293,
"acc_stderr": 0.015696008563807096,
"acc_norm": 0.26053639846743293,
"acc_norm_stderr": 0.015696008563807096
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.22254335260115607,
"acc_stderr": 0.02239421566194282,
"acc_norm": 0.22254335260115607,
"acc_norm_stderr": 0.02239421566194282
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.2346368715083799,
"acc_stderr": 0.014173044098303654,
"acc_norm": 0.2346368715083799,
"acc_norm_stderr": 0.014173044098303654
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.2549019607843137,
"acc_stderr": 0.024954184324879912,
"acc_norm": 0.2549019607843137,
"acc_norm_stderr": 0.024954184324879912
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.2797427652733119,
"acc_stderr": 0.02549425935069491,
"acc_norm": 0.2797427652733119,
"acc_norm_stderr": 0.02549425935069491
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.2623456790123457,
"acc_stderr": 0.02447722285613511,
"acc_norm": 0.2623456790123457,
"acc_norm_stderr": 0.02447722285613511
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.22340425531914893,
"acc_stderr": 0.02484792135806396,
"acc_norm": 0.22340425531914893,
"acc_norm_stderr": 0.02484792135806396
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.2242503259452412,
"acc_stderr": 0.010652615824906172,
"acc_norm": 0.2242503259452412,
"acc_norm_stderr": 0.010652615824906172
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.36764705882352944,
"acc_stderr": 0.029289413409403196,
"acc_norm": 0.36764705882352944,
"acc_norm_stderr": 0.029289413409403196
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.26143790849673204,
"acc_stderr": 0.017776947157528044,
"acc_norm": 0.26143790849673204,
"acc_norm_stderr": 0.017776947157528044
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.3090909090909091,
"acc_stderr": 0.044262946482000985,
"acc_norm": 0.3090909090909091,
"acc_norm_stderr": 0.044262946482000985
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.14285714285714285,
"acc_stderr": 0.022401787435256386,
"acc_norm": 0.14285714285714285,
"acc_norm_stderr": 0.022401787435256386
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.24875621890547264,
"acc_stderr": 0.030567675938916718,
"acc_norm": 0.24875621890547264,
"acc_norm_stderr": 0.030567675938916718
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.21,
"acc_stderr": 0.040936018074033256,
"acc_norm": 0.21,
"acc_norm_stderr": 0.040936018074033256
},
"harness|hendrycksTest-virology|5": {
"acc": 0.3072289156626506,
"acc_stderr": 0.035915667978246635,
"acc_norm": 0.3072289156626506,
"acc_norm_stderr": 0.035915667978246635
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.2807017543859649,
"acc_stderr": 0.03446296217088426,
"acc_norm": 0.2807017543859649,
"acc_norm_stderr": 0.03446296217088426
},
"harness|truthfulqa:mc|0": {
"mc1": 0.21909424724602203,
"mc1_stderr": 0.014480038578757447,
"mc2": 0.3732177557725045,
"mc2_stderr": 0.013798981933202878
},
"harness|winogrande|5": {
"acc": 0.5951065509076559,
"acc_stderr": 0.013795927003124934
},
"harness|gsm8k|5": {
"acc": 0.014404852160727824,
"acc_stderr": 0.0032820559171369596
}
}
```
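The `acc_stderr` fields in the JSON results above are consistent with the sample standard error of a proportion, computed with an n − 1 denominator as lm-evaluation-harness does. A minimal sketch, assuming the security_studies subset has n = 245 questions:

```python
import math

def proportion_stderr(p: float, n: int) -> float:
    """Sample standard error of a mean of 0/1 outcomes, with an n - 1 denominator."""
    return math.sqrt(p * (1 - p) / (n - 1))

# security_studies above: acc = 35/245 ~ 0.142857, and the reported
# acc_stderr ~ 0.022402 is reproduced by the formula.
acc = 35 / 245
stderr = proportion_stderr(acc, 245)
```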
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
maloyan/vqgan1024_reconstruction | ---
dataset_info:
features:
- name: image_512
dtype: image
- name: image_256
dtype: image
- name: reconstruction_256
dtype: image
splits:
- name: train
num_bytes: 3446042724.0
num_examples: 100000
download_size: 4331449801
dataset_size: 3446042724.0
---
# Dataset Card for "vqgan1024_reconstruction"
[More Information Needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
agibot-zy/zy_dataset_jaka | ---
license: mit
---
|
danaroth/cuprite | ---
license: unknown
---
# Description
This dataset can be retrieved from the [AVIRIS NASA](http://aviris.jpl.nasa.gov/) site. Among the many datasets available, the .mat archive posted here corresponds to the _f970619t01p02_r02_sc03.a.rfl_ reflectance file.
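Once downloaded, the .mat archive can be opened with SciPy. A minimal sketch, assuming the reflectance cube is stored under a single top-level variable (the path and variable key below are placeholders, not confirmed names):

```python
import numpy as np
from scipy.io import loadmat

def load_cube(path: str, key: str) -> np.ndarray:
    """Load a hyperspectral cube (rows x cols x bands) from a MATLAB .mat file."""
    mat = loadmat(path)  # dict mapping variable names to arrays
    return np.asarray(mat[key], dtype=np.float64)
```

Inspect `loadmat(path).keys()` first to discover the actual variable name inside the archive.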
# Quick look
<figure>
<img src= "assets/Cuprite_false_greyscale.png" alt="Cuprite" width="300" />
<figcaption>False greyscale image of Cuprite sample.</figcaption>
</figure>
# Credits
This dataset was originally collected by Manuel Graña, Miguel-Angel Veganzones, Borja Ayerdi.
The original link for the dataset is available below:
https://www.ehu.eus/ccwintco/index.php/Hyperspectral_Remote_Sensing_Scenes |
Soressaa/NER | ---
license: apache-2.0
task_categories:
- token-classification
- text-classification
language:
- om
tags:
- code
pretty_name: transformers, pytorch
--- |
XisDraki3142/heve | ---
license: openrail
---
|
open-llm-leaderboard/details_psmathur__model_420 | ---
pretty_name: Evaluation run of psmathur/model_420
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [psmathur/model_420](https://huggingface.co/psmathur/model_420) on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 64 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 2 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_psmathur__model_420\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-10-25T12:29:32.127683](https://huggingface.co/datasets/open-llm-leaderboard/details_psmathur__model_420/blob/main/results_2023-10-25T12-29-32.127683.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.07770553691275167,\n\
\ \"em_stderr\": 0.002741576916689869,\n \"f1\": 0.1435245385906032,\n\
\ \"f1_stderr\": 0.0028999685202973128,\n \"acc\": 0.5616169002251712,\n\
\ \"acc_stderr\": 0.01140770950597949\n },\n \"harness|drop|3\": {\n\
\ \"em\": 0.07770553691275167,\n \"em_stderr\": 0.002741576916689869,\n\
\ \"f1\": 0.1435245385906032,\n \"f1_stderr\": 0.0028999685202973128\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.28582259287338896,\n \
\ \"acc_stderr\": 0.01244496346061563\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.8374112075769534,\n \"acc_stderr\": 0.010370455551343348\n\
\ }\n}\n```"
repo_url: https://huggingface.co/psmathur/model_420
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_08_09T21_30_53.861982
path:
- '**/details_harness|arc:challenge|25_2023-08-09T21:30:53.861982.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-08-09T21:30:53.861982.parquet'
- config_name: harness_drop_3
data_files:
- split: 2023_10_25T12_29_32.127683
path:
- '**/details_harness|drop|3_2023-10-25T12-29-32.127683.parquet'
- split: latest
path:
- '**/details_harness|drop|3_2023-10-25T12-29-32.127683.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_10_25T12_29_32.127683
path:
- '**/details_harness|gsm8k|5_2023-10-25T12-29-32.127683.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-10-25T12-29-32.127683.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_08_09T21_30_53.861982
path:
- '**/details_harness|hellaswag|10_2023-08-09T21:30:53.861982.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-08-09T21:30:53.861982.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_08_09T21_30_53.861982
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-09T21:30:53.861982.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-09T21:30:53.861982.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-09T21:30:53.861982.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-09T21:30:53.861982.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-09T21:30:53.861982.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-09T21:30:53.861982.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-09T21:30:53.861982.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-09T21:30:53.861982.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-09T21:30:53.861982.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-09T21:30:53.861982.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-09T21:30:53.861982.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-09T21:30:53.861982.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-09T21:30:53.861982.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-09T21:30:53.861982.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-09T21:30:53.861982.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-09T21:30:53.861982.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-09T21:30:53.861982.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-09T21:30:53.861982.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-09T21:30:53.861982.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-09T21:30:53.861982.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-09T21:30:53.861982.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-09T21:30:53.861982.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-09T21:30:53.861982.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-09T21:30:53.861982.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-09T21:30:53.861982.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-09T21:30:53.861982.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-09T21:30:53.861982.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-09T21:30:53.861982.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-09T21:30:53.861982.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-09T21:30:53.861982.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-09T21:30:53.861982.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-09T21:30:53.861982.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-09T21:30:53.861982.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-09T21:30:53.861982.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-09T21:30:53.861982.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-09T21:30:53.861982.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-09T21:30:53.861982.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-09T21:30:53.861982.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-09T21:30:53.861982.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-09T21:30:53.861982.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-09T21:30:53.861982.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-09T21:30:53.861982.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-09T21:30:53.861982.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-09T21:30:53.861982.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-09T21:30:53.861982.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-09T21:30:53.861982.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-09T21:30:53.861982.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-09T21:30:53.861982.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-09T21:30:53.861982.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-09T21:30:53.861982.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-09T21:30:53.861982.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-09T21:30:53.861982.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-09T21:30:53.861982.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-09T21:30:53.861982.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-09T21:30:53.861982.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-09T21:30:53.861982.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-09T21:30:53.861982.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-09T21:30:53.861982.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-09T21:30:53.861982.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-09T21:30:53.861982.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-09T21:30:53.861982.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-09T21:30:53.861982.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-09T21:30:53.861982.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-09T21:30:53.861982.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-09T21:30:53.861982.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-09T21:30:53.861982.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-09T21:30:53.861982.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-09T21:30:53.861982.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-09T21:30:53.861982.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-09T21:30:53.861982.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-09T21:30:53.861982.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-09T21:30:53.861982.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-09T21:30:53.861982.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-09T21:30:53.861982.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-09T21:30:53.861982.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-09T21:30:53.861982.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-09T21:30:53.861982.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-09T21:30:53.861982.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-09T21:30:53.861982.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-09T21:30:53.861982.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-09T21:30:53.861982.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-09T21:30:53.861982.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-09T21:30:53.861982.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-09T21:30:53.861982.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-09T21:30:53.861982.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-09T21:30:53.861982.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-09T21:30:53.861982.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-09T21:30:53.861982.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-09T21:30:53.861982.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-09T21:30:53.861982.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-09T21:30:53.861982.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-09T21:30:53.861982.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-09T21:30:53.861982.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-09T21:30:53.861982.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-09T21:30:53.861982.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-09T21:30:53.861982.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-09T21:30:53.861982.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-09T21:30:53.861982.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-09T21:30:53.861982.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-09T21:30:53.861982.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-09T21:30:53.861982.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-09T21:30:53.861982.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-09T21:30:53.861982.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-09T21:30:53.861982.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-09T21:30:53.861982.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-09T21:30:53.861982.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-09T21:30:53.861982.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-09T21:30:53.861982.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-09T21:30:53.861982.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-09T21:30:53.861982.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-09T21:30:53.861982.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-09T21:30:53.861982.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-09T21:30:53.861982.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-09T21:30:53.861982.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_08_09T21_30_53.861982
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-09T21:30:53.861982.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-09T21:30:53.861982.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_08_09T21_30_53.861982
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-09T21:30:53.861982.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-09T21:30:53.861982.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_08_09T21_30_53.861982
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-09T21:30:53.861982.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-09T21:30:53.861982.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_08_09T21_30_53.861982
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-09T21:30:53.861982.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-09T21:30:53.861982.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_08_09T21_30_53.861982
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-09T21:30:53.861982.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-09T21:30:53.861982.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_08_09T21_30_53.861982
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-09T21:30:53.861982.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-09T21:30:53.861982.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_08_09T21_30_53.861982
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-09T21:30:53.861982.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-09T21:30:53.861982.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_08_09T21_30_53.861982
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-09T21:30:53.861982.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-09T21:30:53.861982.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_08_09T21_30_53.861982
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-09T21:30:53.861982.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-09T21:30:53.861982.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_08_09T21_30_53.861982
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-09T21:30:53.861982.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-09T21:30:53.861982.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_08_09T21_30_53.861982
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-09T21:30:53.861982.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-09T21:30:53.861982.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_08_09T21_30_53.861982
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-09T21:30:53.861982.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-09T21:30:53.861982.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_08_09T21_30_53.861982
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-09T21:30:53.861982.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-09T21:30:53.861982.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_08_09T21_30_53.861982
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-09T21:30:53.861982.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-09T21:30:53.861982.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_08_09T21_30_53.861982
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-09T21:30:53.861982.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-09T21:30:53.861982.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_08_09T21_30_53.861982
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-09T21:30:53.861982.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-09T21:30:53.861982.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_08_09T21_30_53.861982
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-09T21:30:53.861982.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-09T21:30:53.861982.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_08_09T21_30_53.861982
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-09T21:30:53.861982.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-09T21:30:53.861982.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_08_09T21_30_53.861982
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-09T21:30:53.861982.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-09T21:30:53.861982.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_08_09T21_30_53.861982
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-09T21:30:53.861982.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-09T21:30:53.861982.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_08_09T21_30_53.861982
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-09T21:30:53.861982.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-09T21:30:53.861982.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_08_09T21_30_53.861982
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-09T21:30:53.861982.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-09T21:30:53.861982.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_08_09T21_30_53.861982
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-09T21:30:53.861982.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-09T21:30:53.861982.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_08_09T21_30_53.861982
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-09T21:30:53.861982.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-09T21:30:53.861982.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_08_09T21_30_53.861982
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-09T21:30:53.861982.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-09T21:30:53.861982.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_08_09T21_30_53.861982
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-09T21:30:53.861982.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-09T21:30:53.861982.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_08_09T21_30_53.861982
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-09T21:30:53.861982.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-09T21:30:53.861982.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_08_09T21_30_53.861982
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-09T21:30:53.861982.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-09T21:30:53.861982.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_08_09T21_30_53.861982
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-09T21:30:53.861982.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-09T21:30:53.861982.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_08_09T21_30_53.861982
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-09T21:30:53.861982.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-09T21:30:53.861982.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_08_09T21_30_53.861982
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-09T21:30:53.861982.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-09T21:30:53.861982.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_08_09T21_30_53.861982
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-09T21:30:53.861982.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-09T21:30:53.861982.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_08_09T21_30_53.861982
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-09T21:30:53.861982.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-09T21:30:53.861982.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_08_09T21_30_53.861982
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-09T21:30:53.861982.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-09T21:30:53.861982.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_08_09T21_30_53.861982
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-09T21:30:53.861982.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-09T21:30:53.861982.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_08_09T21_30_53.861982
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-09T21:30:53.861982.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-09T21:30:53.861982.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_08_09T21_30_53.861982
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-09T21:30:53.861982.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-09T21:30:53.861982.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_08_09T21_30_53.861982
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-09T21:30:53.861982.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-09T21:30:53.861982.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_08_09T21_30_53.861982
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-09T21:30:53.861982.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-09T21:30:53.861982.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_08_09T21_30_53.861982
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-09T21:30:53.861982.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-09T21:30:53.861982.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_08_09T21_30_53.861982
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-09T21:30:53.861982.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-09T21:30:53.861982.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_08_09T21_30_53.861982
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-09T21:30:53.861982.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-09T21:30:53.861982.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_08_09T21_30_53.861982
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-09T21:30:53.861982.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-09T21:30:53.861982.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_08_09T21_30_53.861982
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-09T21:30:53.861982.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-09T21:30:53.861982.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_08_09T21_30_53.861982
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-09T21:30:53.861982.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-09T21:30:53.861982.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_08_09T21_30_53.861982
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-09T21:30:53.861982.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-09T21:30:53.861982.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_08_09T21_30_53.861982
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-09T21:30:53.861982.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-09T21:30:53.861982.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_08_09T21_30_53.861982
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-09T21:30:53.861982.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-09T21:30:53.861982.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_08_09T21_30_53.861982
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-09T21:30:53.861982.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-09T21:30:53.861982.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_08_09T21_30_53.861982
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-09T21:30:53.861982.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-09T21:30:53.861982.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_08_09T21_30_53.861982
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-09T21:30:53.861982.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-09T21:30:53.861982.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_08_09T21_30_53.861982
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-09T21:30:53.861982.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-09T21:30:53.861982.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_08_09T21_30_53.861982
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-09T21:30:53.861982.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-09T21:30:53.861982.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_08_09T21_30_53.861982
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-09T21:30:53.861982.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-09T21:30:53.861982.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_08_09T21_30_53.861982
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-09T21:30:53.861982.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-09T21:30:53.861982.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_08_09T21_30_53.861982
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-09T21:30:53.861982.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-09T21:30:53.861982.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_08_09T21_30_53.861982
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-09T21:30:53.861982.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-09T21:30:53.861982.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_08_09T21_30_53.861982
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-09T21:30:53.861982.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-09T21:30:53.861982.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_10_25T12_29_32.127683
path:
- '**/details_harness|winogrande|5_2023-10-25T12-29-32.127683.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-10-25T12-29-32.127683.parquet'
- config_name: results
data_files:
- split: 2023_08_09T21_30_53.861982
path:
- results_2023-08-09T21:30:53.861982.parquet
- split: 2023_10_25T12_29_32.127683
path:
- results_2023-10-25T12-29-32.127683.parquet
- split: latest
path:
- results_2023-10-25T12-29-32.127683.parquet
---
# Dataset Card for Evaluation run of psmathur/model_420
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/psmathur/model_420
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [psmathur/model_420](https://huggingface.co/psmathur/model_420) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 runs. Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the most recent results.
An additional configuration "results" stores all the aggregated results of the runs (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_psmathur__model_420",
"harness_winogrande_5",
	split="latest")
```
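The split names in each configuration are simply the run timestamps with `-` and `:` replaced by `_` (plus the `latest` alias). A minimal, hypothetical helper (not part of the dataset tooling) showing the mapping:

```python
# Hypothetical helper: derive the split name used in this repository
# from a run timestamp as it appears in the result file names.
def timestamp_to_split(timestamp: str) -> str:
    # "2023-08-09T21:30:53.861982" -> "2023_08_09T21_30_53.861982"
    return timestamp.replace("-", "_").replace(":", "_")

print(timestamp_to_split("2023-10-25T12:29:32.127683"))
# -> 2023_10_25T12_29_32.127683
```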
## Latest results
These are the [latest results from run 2023-10-25T12:29:32.127683](https://huggingface.co/datasets/open-llm-leaderboard/details_psmathur__model_420/blob/main/results_2023-10-25T12-29-32.127683.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"em": 0.07770553691275167,
"em_stderr": 0.002741576916689869,
"f1": 0.1435245385906032,
"f1_stderr": 0.0028999685202973128,
"acc": 0.5616169002251712,
"acc_stderr": 0.01140770950597949
},
"harness|drop|3": {
"em": 0.07770553691275167,
"em_stderr": 0.002741576916689869,
"f1": 0.1435245385906032,
"f1_stderr": 0.0028999685202973128
},
"harness|gsm8k|5": {
"acc": 0.28582259287338896,
"acc_stderr": 0.01244496346061563
},
"harness|winogrande|5": {
"acc": 0.8374112075769534,
"acc_stderr": 0.010370455551343348
}
}
```
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
pioivenium/marketo-full-json | ---
dataset_info:
features:
- name: question
dtype: string
- name: answer
dtype: string
splits:
- name: train
num_bytes: 393683
num_examples: 1389
download_size: 180348
dataset_size: 393683
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
ccmusic-database/CTIS | ---
license: mit
task_categories:
- audio-classification
language:
- zh
- en
tags:
- music
- art
pretty_name: Chinese Traditional Instrument Sound Dataset
size_categories:
- 1K<n<10K
viewer: false
---
# Dataset Card for Chinese Traditional Instrument Sound
The raw CTIS dataset contains recordings from 287 varieties of Chinese traditional instruments, modified Chinese musical instruments, and instruments from ethnic minority groups. Notably, some of these instruments are rarely encountered by the majority of the Chinese populace.
## Dataset Description
- **Homepage:** <https://ccmusic-database.github.io>
- **Repository:** <https://huggingface.co/datasets/ccmusic-database/CTIS>
- **Paper:** <https://doi.org/10.5281/zenodo.5676893>
- **Leaderboard:** <https://ccmusic-database.github.io/team.html>
- **Point of Contact:** <https://www.modelscope.cn/datasets/ccmusic/CTIS>
## Maintenance
```bash
GIT_LFS_SKIP_SMUDGE=1 git clone git@hf.co:datasets/ccmusic-database/CTIS
cd CTIS
```
### Dataset Summary
During integration, we first performed data cleaning to remove recordings without specific instrument labels. The filtered dataset comprises recordings of 200 types of Chinese traditional musical instruments, totaling 3,974 audio clips, or roughly 20 clips per instrument on average. The integrated dataset has three columns: an audio column containing .wav files, all sampled at a uniform rate of 22,050 Hz; a label column with 200 categories corresponding to the Chinese pinyin of the instrument names; and a cname column giving the Chinese instrument names. This integrated dataset can be utilized for tasks such as Chinese instrument recognition or instrument acoustic analysis.
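All clips in the integrated set share a 22,050 Hz sample rate, so recordings captured at other rates need to be resampled before they can be batched with this data. A rough sketch of that step using plain NumPy linear interpolation (the function name is made up for illustration; a real pipeline would use a proper resampler such as librosa's):

```python
import numpy as np

def resample_linear(y: np.ndarray, sr_in: int, sr_out: int = 22050) -> np.ndarray:
    """Resample a mono waveform to the dataset's uniform 22,050 Hz rate
    using naive linear interpolation."""
    n_out = int(round(len(y) * sr_out / sr_in))
    x_in = np.linspace(0.0, 1.0, num=len(y), endpoint=False)
    x_out = np.linspace(0.0, 1.0, num=n_out, endpoint=False)
    return np.interp(x_out, x_in, y)

clip_44k = np.zeros(44100)                  # one second of silence at 44.1 kHz
clip_22k = resample_linear(clip_44k, 44100)
print(len(clip_22k))                        # -> 22050
```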
### Supported Tasks and Leaderboards
MIR, audio classification
### Languages
Chinese, English
## Usage
```python
from datasets import load_dataset
ds = load_dataset("ccmusic-database/CTIS")
for item in ds["train"]:
print(item)
for item in ds["validation"]:
print(item)
for item in ds["test"]:
print(item)
```
## Dataset Structure
| audio(.wav, 22050Hz) | mel(.jpg, 22050Hz) | label(200-class) | cname(string) |
| :----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------: | :-------------------------------------------------: | :------------------------------: | :---------------------------------: |
| <audio controls src="https://huggingface.co/datasets/ccmusic-database/CTIS/resolve/main/data/%E3%80%90%E5%A4%A7%E7%AC%92%E3%80%91%E6%BC%94%E5%A5%8F%E6%8A%80%E6%B3%95%20%E5%8D%8E%E5%BD%A9%20%E7%AC%AC%E4%B8%80%E9%81%8D.wav"> | <img src="./data/【大笒】演奏技法 华彩 第一遍.jpg"> | C0090/<br>C0091/<br>...<br>T0323 | 大笒/<br>高音横笛/<br>...<br>都它尔 |
| ... | ... | ... | ... |
### Data Instances
.zip(.wav), .csv
### Data Fields
Up to 287 kinds of Chinese traditional musical instruments, improved Chinese musical instruments and Chinese ethnic musical instruments
### Data Splits
instruments, percussion
## Dataset Creation
### Curation Rationale
Lack of a dataset for Chinese traditional musical instruments
### Source Data
#### Initial Data Collection and Normalization
Zhaorui Liu, Monan Zhou
#### Who are the source language producers?
Students from CCMUSIC
### Annotations
#### Annotation process
Building a high-quality musical sound database requires consideration on every aspect of the criteria in terms of the recording environment, performer, sample content, annotation standard and quality of recording and performing.
#### Who are the annotators?
Students from CCMUSIC
### Personal and Sensitive Information
None
## Considerations for Using the Data
### Social Impact of Dataset
Advancing the Digitization Process of Traditional Chinese Instruments
### Discussion of Biases
Only for Traditional Chinese Instruments
### Other Known Limitations
Sample imbalance
## Additional Information
### Dataset Curators
Zijin Li
### Evaluation
[李子晋, 韩宝强. 中国传统乐器音响数据库构建研究[J]. 中国音乐学, 2020(02):92-102+2.](https://kns.cnki.net/kcms/detail/detail.aspx?dbcode=CJFD&dbname=CJFDLAST2020&filename=ZYYX202002013&uniplatform=NZKPT&v=7XgjFhWwxaqXz5fg8DIhkJzfNT6gX9huNMH0y5oRG15SXfwDzqWIOuuquyUDS%25mmd2FJ9)
[Liang X, Li Z, Liu J, et al. Constructing a multimedia Chinese musical instrument database[C]//Proceedings of the 6th Conference on Sound and Music Technology (CSMT). Springer, Singapore, 2019: 53-60.](https://link.springer.com/chapter/10.1007/978-981-13-8707-4_5)
[Li Z, Liang X, Liu J, et al. DCMI: A Database of Chinese Musical Instruments[J].](https://dlfm.web.ox.ac.uk/sites/default/files/dlfm/documents/media/zijin-et-al-dcmi.pdf)
### Licensing Information
```
MIT License
Copyright (c) CCMUSIC
Permission is hereby granted, free of charge, to any person obtaining a copy
of this software and associated documentation files (the "Software"), to deal
in the Software without restriction, including without limitation the rights
to use, copy, modify, merge, publish, distribute, sublicense, and/or sell
copies of the Software, and to permit persons to whom the Software is
furnished to do so, subject to the following conditions:
The above copyright notice and this permission notice shall be included in all
copies or substantial portions of the Software.
THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE
SOFTWARE.
```
### Citation Information
```bibtex
@dataset{zhaorui_liu_2021_5676893,
author = {Monan Zhou, Shenyang Xu, Zhaorui Liu, Zhaowen Wang, Feng Yu, Wei Li and Baoqiang Han},
title = {CCMusic: an Open and Diverse Database for Chinese and General Music Information Retrieval Research},
month = {mar},
year = {2024},
publisher = {HuggingFace},
version = {1.2},
url = {https://huggingface.co/ccmusic-database}
}
```
### Contributions
Provide a dataset for Chinese Traditional Instrument Sounds |
tyzhu/squad_qa_baseline_v5_full_random_permute_2 | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
dataset_info:
features:
- name: id
dtype: string
- name: title
dtype: string
- name: context
dtype: string
- name: question
dtype: string
- name: answers
sequence:
- name: text
dtype: string
- name: answer_start
dtype: int32
- name: answer
dtype: string
- name: context_id
dtype: string
- name: inputs
dtype: string
- name: targets
dtype: string
splits:
- name: train
num_bytes: 2496440.0
num_examples: 2385
- name: validation
num_bytes: 335684
num_examples: 300
download_size: 667968
dataset_size: 2832124.0
---
# Dataset Card for "squad_qa_baseline_v5_full_random_permute_2"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
open-llm-leaderboard/details_Zangs3011__mistral_7b_HalfEpoch_DolphinCoder | ---
pretty_name: Evaluation run of Zangs3011/mistral_7b_HalfEpoch_DolphinCoder
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [Zangs3011/mistral_7b_HalfEpoch_DolphinCoder](https://huggingface.co/Zangs3011/mistral_7b_HalfEpoch_DolphinCoder)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"latest\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Zangs3011__mistral_7b_HalfEpoch_DolphinCoder\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-01-19T04:49:07.320364](https://huggingface.co/datasets/open-llm-leaderboard/details_Zangs3011__mistral_7b_HalfEpoch_DolphinCoder/blob/main/results_2024-01-19T04-49-07.320364.json)(note\
\ that their might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.611565024381518,\n\
\ \"acc_stderr\": 0.032882150972704735,\n \"acc_norm\": 0.6180275851937712,\n\
\ \"acc_norm_stderr\": 0.03355752116870104,\n \"mc1\": 0.3011015911872705,\n\
\ \"mc1_stderr\": 0.016058999026100612,\n \"mc2\": 0.4550661709609789,\n\
\ \"mc2_stderr\": 0.014832885424648957\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.5844709897610921,\n \"acc_stderr\": 0.014401366641216384,\n\
\ \"acc_norm\": 0.6168941979522184,\n \"acc_norm_stderr\": 0.014206472661672877\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6356303525194185,\n\
\ \"acc_stderr\": 0.004802694106203654,\n \"acc_norm\": 0.8238398725353515,\n\
\ \"acc_norm_stderr\": 0.00380177777980958\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.28,\n \"acc_stderr\": 0.04512608598542129,\n \
\ \"acc_norm\": 0.28,\n \"acc_norm_stderr\": 0.04512608598542129\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.5555555555555556,\n\
\ \"acc_stderr\": 0.04292596718256981,\n \"acc_norm\": 0.5555555555555556,\n\
\ \"acc_norm_stderr\": 0.04292596718256981\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.6052631578947368,\n \"acc_stderr\": 0.039777499346220734,\n\
\ \"acc_norm\": 0.6052631578947368,\n \"acc_norm_stderr\": 0.039777499346220734\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.53,\n\
\ \"acc_stderr\": 0.05016135580465919,\n \"acc_norm\": 0.53,\n \
\ \"acc_norm_stderr\": 0.05016135580465919\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.6490566037735849,\n \"acc_stderr\": 0.029373646253234686,\n\
\ \"acc_norm\": 0.6490566037735849,\n \"acc_norm_stderr\": 0.029373646253234686\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7430555555555556,\n\
\ \"acc_stderr\": 0.03653946969442099,\n \"acc_norm\": 0.7430555555555556,\n\
\ \"acc_norm_stderr\": 0.03653946969442099\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.44,\n \"acc_stderr\": 0.04988876515698589,\n \
\ \"acc_norm\": 0.44,\n \"acc_norm_stderr\": 0.04988876515698589\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.54,\n \"acc_stderr\": 0.05009082659620332,\n \"acc_norm\": 0.54,\n\
\ \"acc_norm_stderr\": 0.05009082659620332\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.41,\n \"acc_stderr\": 0.049431107042371025,\n \
\ \"acc_norm\": 0.41,\n \"acc_norm_stderr\": 0.049431107042371025\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6069364161849711,\n\
\ \"acc_stderr\": 0.03724249595817729,\n \"acc_norm\": 0.6069364161849711,\n\
\ \"acc_norm_stderr\": 0.03724249595817729\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.39215686274509803,\n \"acc_stderr\": 0.04858083574266346,\n\
\ \"acc_norm\": 0.39215686274509803,\n \"acc_norm_stderr\": 0.04858083574266346\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.74,\n \"acc_stderr\": 0.04408440022768078,\n \"acc_norm\": 0.74,\n\
\ \"acc_norm_stderr\": 0.04408440022768078\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.5234042553191489,\n \"acc_stderr\": 0.032650194750335815,\n\
\ \"acc_norm\": 0.5234042553191489,\n \"acc_norm_stderr\": 0.032650194750335815\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.5263157894736842,\n\
\ \"acc_stderr\": 0.046970851366478626,\n \"acc_norm\": 0.5263157894736842,\n\
\ \"acc_norm_stderr\": 0.046970851366478626\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5862068965517241,\n \"acc_stderr\": 0.04104269211806232,\n\
\ \"acc_norm\": 0.5862068965517241,\n \"acc_norm_stderr\": 0.04104269211806232\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.36243386243386244,\n \"acc_stderr\": 0.02475747390275205,\n \"\
acc_norm\": 0.36243386243386244,\n \"acc_norm_stderr\": 0.02475747390275205\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.38095238095238093,\n\
\ \"acc_stderr\": 0.04343525428949097,\n \"acc_norm\": 0.38095238095238093,\n\
\ \"acc_norm_stderr\": 0.04343525428949097\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.29,\n \"acc_stderr\": 0.045604802157206845,\n \
\ \"acc_norm\": 0.29,\n \"acc_norm_stderr\": 0.045604802157206845\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\"\
: 0.7258064516129032,\n \"acc_stderr\": 0.025378139970885203,\n \"\
acc_norm\": 0.7258064516129032,\n \"acc_norm_stderr\": 0.025378139970885203\n\
\ },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\"\
: 0.4975369458128079,\n \"acc_stderr\": 0.03517945038691063,\n \"\
acc_norm\": 0.4975369458128079,\n \"acc_norm_stderr\": 0.03517945038691063\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.65,\n \"acc_stderr\": 0.047937248544110196,\n \"acc_norm\"\
: 0.65,\n \"acc_norm_stderr\": 0.047937248544110196\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7272727272727273,\n \"acc_stderr\": 0.0347769116216366,\n\
\ \"acc_norm\": 0.7272727272727273,\n \"acc_norm_stderr\": 0.0347769116216366\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.803030303030303,\n \"acc_stderr\": 0.028335609732463362,\n \"\
acc_norm\": 0.803030303030303,\n \"acc_norm_stderr\": 0.028335609732463362\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.8601036269430051,\n \"acc_stderr\": 0.02503387058301518,\n\
\ \"acc_norm\": 0.8601036269430051,\n \"acc_norm_stderr\": 0.02503387058301518\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.6025641025641025,\n \"acc_stderr\": 0.024811920017903836,\n\
\ \"acc_norm\": 0.6025641025641025,\n \"acc_norm_stderr\": 0.024811920017903836\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.3592592592592593,\n \"acc_stderr\": 0.029252905927251976,\n \
\ \"acc_norm\": 0.3592592592592593,\n \"acc_norm_stderr\": 0.029252905927251976\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.6092436974789915,\n \"acc_stderr\": 0.031693802357129965,\n\
\ \"acc_norm\": 0.6092436974789915,\n \"acc_norm_stderr\": 0.031693802357129965\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.33774834437086093,\n \"acc_stderr\": 0.038615575462551684,\n \"\
acc_norm\": 0.33774834437086093,\n \"acc_norm_stderr\": 0.038615575462551684\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8055045871559633,\n \"acc_stderr\": 0.016970289090458033,\n \"\
acc_norm\": 0.8055045871559633,\n \"acc_norm_stderr\": 0.016970289090458033\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.49537037037037035,\n \"acc_stderr\": 0.03409825519163572,\n \"\
acc_norm\": 0.49537037037037035,\n \"acc_norm_stderr\": 0.03409825519163572\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.7401960784313726,\n \"acc_stderr\": 0.03077855467869326,\n \"\
acc_norm\": 0.7401960784313726,\n \"acc_norm_stderr\": 0.03077855467869326\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.7468354430379747,\n \"acc_stderr\": 0.028304657943035293,\n \
\ \"acc_norm\": 0.7468354430379747,\n \"acc_norm_stderr\": 0.028304657943035293\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6457399103139013,\n\
\ \"acc_stderr\": 0.03210062154134986,\n \"acc_norm\": 0.6457399103139013,\n\
\ \"acc_norm_stderr\": 0.03210062154134986\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.7709923664122137,\n \"acc_stderr\": 0.036853466317118506,\n\
\ \"acc_norm\": 0.7709923664122137,\n \"acc_norm_stderr\": 0.036853466317118506\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.743801652892562,\n \"acc_stderr\": 0.03984979653302872,\n \"acc_norm\"\
: 0.743801652892562,\n \"acc_norm_stderr\": 0.03984979653302872\n },\n\
\ \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7314814814814815,\n\
\ \"acc_stderr\": 0.042844679680521934,\n \"acc_norm\": 0.7314814814814815,\n\
\ \"acc_norm_stderr\": 0.042844679680521934\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7484662576687117,\n \"acc_stderr\": 0.034089978868575295,\n\
\ \"acc_norm\": 0.7484662576687117,\n \"acc_norm_stderr\": 0.034089978868575295\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.48214285714285715,\n\
\ \"acc_stderr\": 0.047427623612430116,\n \"acc_norm\": 0.48214285714285715,\n\
\ \"acc_norm_stderr\": 0.047427623612430116\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7669902912621359,\n \"acc_stderr\": 0.041858325989283136,\n\
\ \"acc_norm\": 0.7669902912621359,\n \"acc_norm_stderr\": 0.041858325989283136\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8418803418803419,\n\
\ \"acc_stderr\": 0.023902325549560396,\n \"acc_norm\": 0.8418803418803419,\n\
\ \"acc_norm_stderr\": 0.023902325549560396\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.73,\n \"acc_stderr\": 0.044619604333847394,\n \
\ \"acc_norm\": 0.73,\n \"acc_norm_stderr\": 0.044619604333847394\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.7982120051085568,\n\
\ \"acc_stderr\": 0.014351702181636863,\n \"acc_norm\": 0.7982120051085568,\n\
\ \"acc_norm_stderr\": 0.014351702181636863\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.7109826589595376,\n \"acc_stderr\": 0.024405173935783227,\n\
\ \"acc_norm\": 0.7109826589595376,\n \"acc_norm_stderr\": 0.024405173935783227\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.3396648044692737,\n\
\ \"acc_stderr\": 0.015839400406212505,\n \"acc_norm\": 0.3396648044692737,\n\
\ \"acc_norm_stderr\": 0.015839400406212505\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.7287581699346405,\n \"acc_stderr\": 0.02545775669666788,\n\
\ \"acc_norm\": 0.7287581699346405,\n \"acc_norm_stderr\": 0.02545775669666788\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6977491961414791,\n\
\ \"acc_stderr\": 0.026082700695399665,\n \"acc_norm\": 0.6977491961414791,\n\
\ \"acc_norm_stderr\": 0.026082700695399665\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.7006172839506173,\n \"acc_stderr\": 0.025483115601195455,\n\
\ \"acc_norm\": 0.7006172839506173,\n \"acc_norm_stderr\": 0.025483115601195455\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.4397163120567376,\n \"acc_stderr\": 0.02960991207559411,\n \
\ \"acc_norm\": 0.4397163120567376,\n \"acc_norm_stderr\": 0.02960991207559411\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.438722294654498,\n\
\ \"acc_stderr\": 0.012673969883493272,\n \"acc_norm\": 0.438722294654498,\n\
\ \"acc_norm_stderr\": 0.012673969883493272\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.6102941176470589,\n \"acc_stderr\": 0.0296246635811597,\n\
\ \"acc_norm\": 0.6102941176470589,\n \"acc_norm_stderr\": 0.0296246635811597\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6274509803921569,\n \"acc_stderr\": 0.019559646809215927,\n \
\ \"acc_norm\": 0.6274509803921569,\n \"acc_norm_stderr\": 0.019559646809215927\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6545454545454545,\n\
\ \"acc_stderr\": 0.04554619617541054,\n \"acc_norm\": 0.6545454545454545,\n\
\ \"acc_norm_stderr\": 0.04554619617541054\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.6612244897959184,\n \"acc_stderr\": 0.03029950656215418,\n\
\ \"acc_norm\": 0.6612244897959184,\n \"acc_norm_stderr\": 0.03029950656215418\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8109452736318408,\n\
\ \"acc_stderr\": 0.02768691358801302,\n \"acc_norm\": 0.8109452736318408,\n\
\ \"acc_norm_stderr\": 0.02768691358801302\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.86,\n \"acc_stderr\": 0.034873508801977725,\n \
\ \"acc_norm\": 0.86,\n \"acc_norm_stderr\": 0.034873508801977725\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.536144578313253,\n\
\ \"acc_stderr\": 0.038823108508905954,\n \"acc_norm\": 0.536144578313253,\n\
\ \"acc_norm_stderr\": 0.038823108508905954\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8245614035087719,\n \"acc_stderr\": 0.029170885500727665,\n\
\ \"acc_norm\": 0.8245614035087719,\n \"acc_norm_stderr\": 0.029170885500727665\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.3011015911872705,\n\
\ \"mc1_stderr\": 0.016058999026100612,\n \"mc2\": 0.4550661709609789,\n\
\ \"mc2_stderr\": 0.014832885424648957\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7576953433307024,\n \"acc_stderr\": 0.012042352526174789\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.30477634571645185,\n \
\ \"acc_stderr\": 0.012679297549515418\n }\n}\n```"
repo_url: https://huggingface.co/Zangs3011/mistral_7b_HalfEpoch_DolphinCoder
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_01_19T04_49_07.320364
path:
- '**/details_harness|arc:challenge|25_2024-01-19T04-49-07.320364.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-01-19T04-49-07.320364.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_01_19T04_49_07.320364
path:
- '**/details_harness|gsm8k|5_2024-01-19T04-49-07.320364.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-01-19T04-49-07.320364.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_01_19T04_49_07.320364
path:
- '**/details_harness|hellaswag|10_2024-01-19T04-49-07.320364.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-01-19T04-49-07.320364.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_01_19T04_49_07.320364
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-19T04-49-07.320364.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-19T04-49-07.320364.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-19T04-49-07.320364.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-19T04-49-07.320364.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-19T04-49-07.320364.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-19T04-49-07.320364.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-19T04-49-07.320364.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-19T04-49-07.320364.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-19T04-49-07.320364.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-19T04-49-07.320364.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-19T04-49-07.320364.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-19T04-49-07.320364.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-19T04-49-07.320364.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-19T04-49-07.320364.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-19T04-49-07.320364.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-19T04-49-07.320364.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-19T04-49-07.320364.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-19T04-49-07.320364.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-19T04-49-07.320364.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-19T04-49-07.320364.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-19T04-49-07.320364.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-19T04-49-07.320364.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-19T04-49-07.320364.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-19T04-49-07.320364.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-19T04-49-07.320364.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-19T04-49-07.320364.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-19T04-49-07.320364.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-19T04-49-07.320364.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-19T04-49-07.320364.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-19T04-49-07.320364.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-19T04-49-07.320364.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-19T04-49-07.320364.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-19T04-49-07.320364.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-19T04-49-07.320364.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-01-19T04-49-07.320364.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-19T04-49-07.320364.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-19T04-49-07.320364.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-19T04-49-07.320364.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-01-19T04-49-07.320364.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-01-19T04-49-07.320364.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-19T04-49-07.320364.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-19T04-49-07.320364.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-19T04-49-07.320364.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-19T04-49-07.320364.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-19T04-49-07.320364.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-19T04-49-07.320364.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-19T04-49-07.320364.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-19T04-49-07.320364.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-19T04-49-07.320364.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-19T04-49-07.320364.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-19T04-49-07.320364.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-19T04-49-07.320364.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-19T04-49-07.320364.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-01-19T04-49-07.320364.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-19T04-49-07.320364.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-01-19T04-49-07.320364.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-19T04-49-07.320364.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-19T04-49-07.320364.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-19T04-49-07.320364.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-19T04-49-07.320364.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-19T04-49-07.320364.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-19T04-49-07.320364.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-19T04-49-07.320364.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-19T04-49-07.320364.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-19T04-49-07.320364.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-19T04-49-07.320364.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-19T04-49-07.320364.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-19T04-49-07.320364.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-19T04-49-07.320364.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-19T04-49-07.320364.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-19T04-49-07.320364.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-19T04-49-07.320364.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-19T04-49-07.320364.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-19T04-49-07.320364.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-19T04-49-07.320364.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-19T04-49-07.320364.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-19T04-49-07.320364.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-19T04-49-07.320364.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-19T04-49-07.320364.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-19T04-49-07.320364.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-19T04-49-07.320364.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-19T04-49-07.320364.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-19T04-49-07.320364.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-19T04-49-07.320364.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-19T04-49-07.320364.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-19T04-49-07.320364.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-19T04-49-07.320364.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-19T04-49-07.320364.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-19T04-49-07.320364.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-19T04-49-07.320364.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-19T04-49-07.320364.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-01-19T04-49-07.320364.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-19T04-49-07.320364.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-19T04-49-07.320364.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-19T04-49-07.320364.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-01-19T04-49-07.320364.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-01-19T04-49-07.320364.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-19T04-49-07.320364.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-19T04-49-07.320364.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-19T04-49-07.320364.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-19T04-49-07.320364.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-19T04-49-07.320364.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-19T04-49-07.320364.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-19T04-49-07.320364.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-19T04-49-07.320364.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-19T04-49-07.320364.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-19T04-49-07.320364.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-19T04-49-07.320364.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-19T04-49-07.320364.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-19T04-49-07.320364.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-01-19T04-49-07.320364.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-19T04-49-07.320364.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-01-19T04-49-07.320364.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-19T04-49-07.320364.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_01_19T04_49_07.320364
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-19T04-49-07.320364.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-19T04-49-07.320364.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_01_19T04_49_07.320364
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-19T04-49-07.320364.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-19T04-49-07.320364.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_01_19T04_49_07.320364
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-19T04-49-07.320364.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-19T04-49-07.320364.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_01_19T04_49_07.320364
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-19T04-49-07.320364.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-19T04-49-07.320364.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_01_19T04_49_07.320364
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-19T04-49-07.320364.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-19T04-49-07.320364.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_01_19T04_49_07.320364
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-19T04-49-07.320364.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-19T04-49-07.320364.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_01_19T04_49_07.320364
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-19T04-49-07.320364.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-19T04-49-07.320364.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_01_19T04_49_07.320364
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-19T04-49-07.320364.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-19T04-49-07.320364.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_01_19T04_49_07.320364
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-19T04-49-07.320364.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-19T04-49-07.320364.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_01_19T04_49_07.320364
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-19T04-49-07.320364.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-19T04-49-07.320364.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_01_19T04_49_07.320364
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-19T04-49-07.320364.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-19T04-49-07.320364.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_01_19T04_49_07.320364
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-19T04-49-07.320364.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-19T04-49-07.320364.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_01_19T04_49_07.320364
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-19T04-49-07.320364.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-19T04-49-07.320364.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_01_19T04_49_07.320364
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-19T04-49-07.320364.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-19T04-49-07.320364.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_01_19T04_49_07.320364
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-19T04-49-07.320364.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-19T04-49-07.320364.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_01_19T04_49_07.320364
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-19T04-49-07.320364.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-19T04-49-07.320364.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_01_19T04_49_07.320364
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-19T04-49-07.320364.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-19T04-49-07.320364.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_01_19T04_49_07.320364
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-19T04-49-07.320364.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-19T04-49-07.320364.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_01_19T04_49_07.320364
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-19T04-49-07.320364.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-19T04-49-07.320364.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_01_19T04_49_07.320364
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-19T04-49-07.320364.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-19T04-49-07.320364.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_01_19T04_49_07.320364
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-19T04-49-07.320364.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-19T04-49-07.320364.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_01_19T04_49_07.320364
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-19T04-49-07.320364.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-19T04-49-07.320364.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_01_19T04_49_07.320364
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-19T04-49-07.320364.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-19T04-49-07.320364.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_01_19T04_49_07.320364
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-19T04-49-07.320364.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-19T04-49-07.320364.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_01_19T04_49_07.320364
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-19T04-49-07.320364.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-19T04-49-07.320364.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_01_19T04_49_07.320364
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-19T04-49-07.320364.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-19T04-49-07.320364.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_01_19T04_49_07.320364
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-19T04-49-07.320364.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-19T04-49-07.320364.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_01_19T04_49_07.320364
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-19T04-49-07.320364.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-19T04-49-07.320364.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_01_19T04_49_07.320364
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-19T04-49-07.320364.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-19T04-49-07.320364.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_01_19T04_49_07.320364
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-19T04-49-07.320364.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-19T04-49-07.320364.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_01_19T04_49_07.320364
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-19T04-49-07.320364.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-19T04-49-07.320364.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_01_19T04_49_07.320364
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-19T04-49-07.320364.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-19T04-49-07.320364.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_01_19T04_49_07.320364
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-19T04-49-07.320364.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-19T04-49-07.320364.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_01_19T04_49_07.320364
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-19T04-49-07.320364.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-19T04-49-07.320364.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_01_19T04_49_07.320364
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-01-19T04-49-07.320364.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-01-19T04-49-07.320364.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_01_19T04_49_07.320364
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-19T04-49-07.320364.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-19T04-49-07.320364.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_01_19T04_49_07.320364
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-19T04-49-07.320364.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-19T04-49-07.320364.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_01_19T04_49_07.320364
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-19T04-49-07.320364.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-19T04-49-07.320364.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_01_19T04_49_07.320364
path:
- '**/details_harness|hendrycksTest-management|5_2024-01-19T04-49-07.320364.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-01-19T04-49-07.320364.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_01_19T04_49_07.320364
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-01-19T04-49-07.320364.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-01-19T04-49-07.320364.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_01_19T04_49_07.320364
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-19T04-49-07.320364.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-19T04-49-07.320364.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_01_19T04_49_07.320364
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-19T04-49-07.320364.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-19T04-49-07.320364.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_01_19T04_49_07.320364
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-19T04-49-07.320364.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-19T04-49-07.320364.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_01_19T04_49_07.320364
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-19T04-49-07.320364.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-19T04-49-07.320364.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_01_19T04_49_07.320364
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-19T04-49-07.320364.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-19T04-49-07.320364.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_01_19T04_49_07.320364
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-19T04-49-07.320364.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-19T04-49-07.320364.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_01_19T04_49_07.320364
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-19T04-49-07.320364.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-19T04-49-07.320364.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_01_19T04_49_07.320364
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-19T04-49-07.320364.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-19T04-49-07.320364.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_01_19T04_49_07.320364
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-19T04-49-07.320364.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-19T04-49-07.320364.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_01_19T04_49_07.320364
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-19T04-49-07.320364.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-19T04-49-07.320364.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_01_19T04_49_07.320364
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-19T04-49-07.320364.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-19T04-49-07.320364.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_01_19T04_49_07.320364
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-19T04-49-07.320364.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-19T04-49-07.320364.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_01_19T04_49_07.320364
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-19T04-49-07.320364.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-19T04-49-07.320364.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_01_19T04_49_07.320364
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-01-19T04-49-07.320364.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-01-19T04-49-07.320364.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_01_19T04_49_07.320364
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-19T04-49-07.320364.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-19T04-49-07.320364.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_01_19T04_49_07.320364
path:
- '**/details_harness|hendrycksTest-virology|5_2024-01-19T04-49-07.320364.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-01-19T04-49-07.320364.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_01_19T04_49_07.320364
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-19T04-49-07.320364.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-19T04-49-07.320364.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_01_19T04_49_07.320364
path:
- '**/details_harness|truthfulqa:mc|0_2024-01-19T04-49-07.320364.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-01-19T04-49-07.320364.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_01_19T04_49_07.320364
path:
- '**/details_harness|winogrande|5_2024-01-19T04-49-07.320364.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-01-19T04-49-07.320364.parquet'
- config_name: results
data_files:
- split: 2024_01_19T04_49_07.320364
path:
- results_2024-01-19T04-49-07.320364.parquet
- split: latest
path:
- results_2024-01-19T04-49-07.320364.parquet
---
# Dataset Card for Evaluation run of Zangs3011/mistral_7b_HalfEpoch_DolphinCoder
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [Zangs3011/mistral_7b_HalfEpoch_DolphinCoder](https://huggingface.co/Zangs3011/mistral_7b_HalfEpoch_DolphinCoder) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The "train" split is always pointing to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_Zangs3011__mistral_7b_HalfEpoch_DolphinCoder",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2024-01-19T04:49:07.320364](https://huggingface.co/datasets/open-llm-leaderboard/details_Zangs3011__mistral_7b_HalfEpoch_DolphinCoder/blob/main/results_2024-01-19T04-49-07.320364.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.611565024381518,
"acc_stderr": 0.032882150972704735,
"acc_norm": 0.6180275851937712,
"acc_norm_stderr": 0.03355752116870104,
"mc1": 0.3011015911872705,
"mc1_stderr": 0.016058999026100612,
"mc2": 0.4550661709609789,
"mc2_stderr": 0.014832885424648957
},
"harness|arc:challenge|25": {
"acc": 0.5844709897610921,
"acc_stderr": 0.014401366641216384,
"acc_norm": 0.6168941979522184,
"acc_norm_stderr": 0.014206472661672877
},
"harness|hellaswag|10": {
"acc": 0.6356303525194185,
"acc_stderr": 0.004802694106203654,
"acc_norm": 0.8238398725353515,
"acc_norm_stderr": 0.00380177777980958
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.28,
"acc_stderr": 0.04512608598542129,
"acc_norm": 0.28,
"acc_norm_stderr": 0.04512608598542129
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.5555555555555556,
"acc_stderr": 0.04292596718256981,
"acc_norm": 0.5555555555555556,
"acc_norm_stderr": 0.04292596718256981
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6052631578947368,
"acc_stderr": 0.039777499346220734,
"acc_norm": 0.6052631578947368,
"acc_norm_stderr": 0.039777499346220734
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.53,
"acc_stderr": 0.05016135580465919,
"acc_norm": 0.53,
"acc_norm_stderr": 0.05016135580465919
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6490566037735849,
"acc_stderr": 0.029373646253234686,
"acc_norm": 0.6490566037735849,
"acc_norm_stderr": 0.029373646253234686
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7430555555555556,
"acc_stderr": 0.03653946969442099,
"acc_norm": 0.7430555555555556,
"acc_norm_stderr": 0.03653946969442099
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.44,
"acc_stderr": 0.04988876515698589,
"acc_norm": 0.44,
"acc_norm_stderr": 0.04988876515698589
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.54,
"acc_stderr": 0.05009082659620332,
"acc_norm": 0.54,
"acc_norm_stderr": 0.05009082659620332
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.41,
"acc_stderr": 0.049431107042371025,
"acc_norm": 0.41,
"acc_norm_stderr": 0.049431107042371025
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6069364161849711,
"acc_stderr": 0.03724249595817729,
"acc_norm": 0.6069364161849711,
"acc_norm_stderr": 0.03724249595817729
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.39215686274509803,
"acc_stderr": 0.04858083574266346,
"acc_norm": 0.39215686274509803,
"acc_norm_stderr": 0.04858083574266346
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.74,
"acc_stderr": 0.04408440022768078,
"acc_norm": 0.74,
"acc_norm_stderr": 0.04408440022768078
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5234042553191489,
"acc_stderr": 0.032650194750335815,
"acc_norm": 0.5234042553191489,
"acc_norm_stderr": 0.032650194750335815
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.5263157894736842,
"acc_stderr": 0.046970851366478626,
"acc_norm": 0.5263157894736842,
"acc_norm_stderr": 0.046970851366478626
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5862068965517241,
"acc_stderr": 0.04104269211806232,
"acc_norm": 0.5862068965517241,
"acc_norm_stderr": 0.04104269211806232
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.36243386243386244,
"acc_stderr": 0.02475747390275205,
"acc_norm": 0.36243386243386244,
"acc_norm_stderr": 0.02475747390275205
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.38095238095238093,
"acc_stderr": 0.04343525428949097,
"acc_norm": 0.38095238095238093,
"acc_norm_stderr": 0.04343525428949097
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.29,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.29,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7258064516129032,
"acc_stderr": 0.025378139970885203,
"acc_norm": 0.7258064516129032,
"acc_norm_stderr": 0.025378139970885203
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.4975369458128079,
"acc_stderr": 0.03517945038691063,
"acc_norm": 0.4975369458128079,
"acc_norm_stderr": 0.03517945038691063
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.65,
"acc_stderr": 0.047937248544110196,
"acc_norm": 0.65,
"acc_norm_stderr": 0.047937248544110196
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7272727272727273,
"acc_stderr": 0.0347769116216366,
"acc_norm": 0.7272727272727273,
"acc_norm_stderr": 0.0347769116216366
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.803030303030303,
"acc_stderr": 0.028335609732463362,
"acc_norm": 0.803030303030303,
"acc_norm_stderr": 0.028335609732463362
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8601036269430051,
"acc_stderr": 0.02503387058301518,
"acc_norm": 0.8601036269430051,
"acc_norm_stderr": 0.02503387058301518
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6025641025641025,
"acc_stderr": 0.024811920017903836,
"acc_norm": 0.6025641025641025,
"acc_norm_stderr": 0.024811920017903836
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.3592592592592593,
"acc_stderr": 0.029252905927251976,
"acc_norm": 0.3592592592592593,
"acc_norm_stderr": 0.029252905927251976
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6092436974789915,
"acc_stderr": 0.031693802357129965,
"acc_norm": 0.6092436974789915,
"acc_norm_stderr": 0.031693802357129965
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.33774834437086093,
"acc_stderr": 0.038615575462551684,
"acc_norm": 0.33774834437086093,
"acc_norm_stderr": 0.038615575462551684
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8055045871559633,
"acc_stderr": 0.016970289090458033,
"acc_norm": 0.8055045871559633,
"acc_norm_stderr": 0.016970289090458033
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.49537037037037035,
"acc_stderr": 0.03409825519163572,
"acc_norm": 0.49537037037037035,
"acc_norm_stderr": 0.03409825519163572
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7401960784313726,
"acc_stderr": 0.03077855467869326,
"acc_norm": 0.7401960784313726,
"acc_norm_stderr": 0.03077855467869326
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7468354430379747,
"acc_stderr": 0.028304657943035293,
"acc_norm": 0.7468354430379747,
"acc_norm_stderr": 0.028304657943035293
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6457399103139013,
"acc_stderr": 0.03210062154134986,
"acc_norm": 0.6457399103139013,
"acc_norm_stderr": 0.03210062154134986
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7709923664122137,
"acc_stderr": 0.036853466317118506,
"acc_norm": 0.7709923664122137,
"acc_norm_stderr": 0.036853466317118506
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.743801652892562,
"acc_stderr": 0.03984979653302872,
"acc_norm": 0.743801652892562,
"acc_norm_stderr": 0.03984979653302872
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7314814814814815,
"acc_stderr": 0.042844679680521934,
"acc_norm": 0.7314814814814815,
"acc_norm_stderr": 0.042844679680521934
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7484662576687117,
"acc_stderr": 0.034089978868575295,
"acc_norm": 0.7484662576687117,
"acc_norm_stderr": 0.034089978868575295
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.48214285714285715,
"acc_stderr": 0.047427623612430116,
"acc_norm": 0.48214285714285715,
"acc_norm_stderr": 0.047427623612430116
},
"harness|hendrycksTest-management|5": {
"acc": 0.7669902912621359,
"acc_stderr": 0.041858325989283136,
"acc_norm": 0.7669902912621359,
"acc_norm_stderr": 0.041858325989283136
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8418803418803419,
"acc_stderr": 0.023902325549560396,
"acc_norm": 0.8418803418803419,
"acc_norm_stderr": 0.023902325549560396
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.73,
"acc_stderr": 0.044619604333847394,
"acc_norm": 0.73,
"acc_norm_stderr": 0.044619604333847394
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.7982120051085568,
"acc_stderr": 0.014351702181636863,
"acc_norm": 0.7982120051085568,
"acc_norm_stderr": 0.014351702181636863
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7109826589595376,
"acc_stderr": 0.024405173935783227,
"acc_norm": 0.7109826589595376,
"acc_norm_stderr": 0.024405173935783227
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.3396648044692737,
"acc_stderr": 0.015839400406212505,
"acc_norm": 0.3396648044692737,
"acc_norm_stderr": 0.015839400406212505
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7287581699346405,
"acc_stderr": 0.02545775669666788,
"acc_norm": 0.7287581699346405,
"acc_norm_stderr": 0.02545775669666788
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6977491961414791,
"acc_stderr": 0.026082700695399665,
"acc_norm": 0.6977491961414791,
"acc_norm_stderr": 0.026082700695399665
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7006172839506173,
"acc_stderr": 0.025483115601195455,
"acc_norm": 0.7006172839506173,
"acc_norm_stderr": 0.025483115601195455
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.4397163120567376,
"acc_stderr": 0.02960991207559411,
"acc_norm": 0.4397163120567376,
"acc_norm_stderr": 0.02960991207559411
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.438722294654498,
"acc_stderr": 0.012673969883493272,
"acc_norm": 0.438722294654498,
"acc_norm_stderr": 0.012673969883493272
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6102941176470589,
"acc_stderr": 0.0296246635811597,
"acc_norm": 0.6102941176470589,
"acc_norm_stderr": 0.0296246635811597
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6274509803921569,
"acc_stderr": 0.019559646809215927,
"acc_norm": 0.6274509803921569,
"acc_norm_stderr": 0.019559646809215927
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6545454545454545,
"acc_stderr": 0.04554619617541054,
"acc_norm": 0.6545454545454545,
"acc_norm_stderr": 0.04554619617541054
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.6612244897959184,
"acc_stderr": 0.03029950656215418,
"acc_norm": 0.6612244897959184,
"acc_norm_stderr": 0.03029950656215418
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8109452736318408,
"acc_stderr": 0.02768691358801302,
"acc_norm": 0.8109452736318408,
"acc_norm_stderr": 0.02768691358801302
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.86,
"acc_stderr": 0.034873508801977725,
"acc_norm": 0.86,
"acc_norm_stderr": 0.034873508801977725
},
"harness|hendrycksTest-virology|5": {
"acc": 0.536144578313253,
"acc_stderr": 0.038823108508905954,
"acc_norm": 0.536144578313253,
"acc_norm_stderr": 0.038823108508905954
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8245614035087719,
"acc_stderr": 0.029170885500727665,
"acc_norm": 0.8245614035087719,
"acc_norm_stderr": 0.029170885500727665
},
"harness|truthfulqa:mc|0": {
"mc1": 0.3011015911872705,
"mc1_stderr": 0.016058999026100612,
"mc2": 0.4550661709609789,
"mc2_stderr": 0.014832885424648957
},
"harness|winogrande|5": {
"acc": 0.7576953433307024,
"acc_stderr": 0.012042352526174789
},
"harness|gsm8k|5": {
"acc": 0.30477634571645185,
"acc_stderr": 0.012679297549515418
}
}
```
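The per-task entries above all share the `harness|hendrycksTest-…|5` prefix for MMLU subtasks. A minimal offline sketch (not part of the official leaderboard tooling) of computing a macro-average accuracy over those subtasks from a results dict of this shape, using three entries copied from the JSON above for illustration:

```python
# Subset of the results dict above; in practice you would parse the full
# results_*.json file (e.g. with json.load) instead of pasting values.
results = {
    "harness|hendrycksTest-abstract_algebra|5": {"acc": 0.28},
    "harness|hendrycksTest-anatomy|5": {"acc": 0.5555555555555556},
    "harness|hendrycksTest-astronomy|5": {"acc": 0.6052631578947368},
    "harness|truthfulqa:mc|0": {"mc1": 0.3011015911872705},  # not an MMLU subtask; skipped below
}

# Macro-average: unweighted mean of per-subtask accuracies.
mmlu_accs = [
    v["acc"] for k, v in results.items()
    if k.startswith("harness|hendrycksTest-")
]
mmlu_macro_avg = sum(mmlu_accs) / len(mmlu_accs)
print(round(mmlu_macro_avg, 4))  # 0.4803 (for these three subtasks only)
```

The leaderboard's reported MMLU score is the macro-average over all 57 subtasks; the snippet only demonstrates the aggregation on a three-entry subset.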
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
BlazeLlama/euclid_elements_eng_propositions | ---
license: apache-2.0
---
|
HeshamHaroon/ArzEn-MultiGenre | ---
license: cc-by-4.0
task_categories:
- translation
language:
- ar
- en
size_categories:
- 1K<n<10K
---
# ArzEn-MultiGenre: A Comprehensive Parallel Dataset
## Overview
ArzEn-MultiGenre is a distinctive parallel dataset that encompasses a diverse collection of Egyptian Arabic content. This collection includes song lyrics, novels, and TV show subtitles, all of which have been meticulously translated and aligned with their English counterparts. The dataset serves as an invaluable tool for various linguistic and computational applications.
**Published:** 28 December 2023
**Version:** 3
**DOI:** 10.17632/6k97jty9xg.3
**Contributor:** Rania Al-Sabbagh
## Dataset Details
- **Total Segment Pairs:** 25,557
- **Languages:** Egyptian Arabic and English
- **Content Types:** Song Lyrics, Novels, TV Show Subtitles
## Applications
- **Machine Translation Benchmarking:** Ideal for testing and improving new machine translation models.
- **Language Model Fine-Tuning:** Suitable for enhancing large language models in few-shot settings.
- **Commercial Application Adaptation:** Can be used to refine tools like Google Translate for better performance with Egyptian Arabic.
## Research Relevance
This dataset is a significant resource for research in fields such as translation studies, cross-linguistic analysis, and lexical semantics.
## Unique Contributions
1. **Diverse Textual Genres:** The dataset includes genres not typically found in parallel datasets for Egyptian Arabic and English.
2. **Gold-Standard Quality:** Translated and aligned by human experts, ensuring high accuracy and reliability.
## Citation
Please cite this dataset as follows:
Al-Sabbagh, Rania (2023). “ArzEn-MultiGenre: An aligned parallel dataset of Egyptian Arabic song lyrics, novels, and subtitles, with English translations.” Mendeley Data, V3, DOI: 10.17632/6k97jty9xg.3
## Related Links
- [Article](https://ijaes2011.net/index.php/IJAES/article/view/560)
## Institutions
- University of Sharjah |
fundrais123/github-issues | ---
dataset_info:
features:
- name: url
dtype: string
- name: repository_url
dtype: string
- name: labels_url
dtype: string
- name: comments_url
dtype: string
- name: events_url
dtype: string
- name: html_url
dtype: string
- name: id
dtype: int64
- name: node_id
dtype: string
- name: number
dtype: int64
- name: title
dtype: string
- name: user
struct:
- name: avatar_url
dtype: string
- name: events_url
dtype: string
- name: followers_url
dtype: string
- name: following_url
dtype: string
- name: gists_url
dtype: string
- name: gravatar_id
dtype: string
- name: html_url
dtype: string
- name: id
dtype: int64
- name: login
dtype: string
- name: node_id
dtype: string
- name: organizations_url
dtype: string
- name: received_events_url
dtype: string
- name: repos_url
dtype: string
- name: site_admin
dtype: bool
- name: starred_url
dtype: string
- name: subscriptions_url
dtype: string
- name: type
dtype: string
- name: url
dtype: string
- name: labels
list:
- name: color
dtype: string
- name: default
dtype: bool
- name: description
dtype: string
- name: id
dtype: int64
- name: name
dtype: string
- name: node_id
dtype: string
- name: url
dtype: string
- name: state
dtype: string
- name: locked
dtype: bool
- name: assignee
struct:
- name: avatar_url
dtype: string
- name: events_url
dtype: string
- name: followers_url
dtype: string
- name: following_url
dtype: string
- name: gists_url
dtype: string
- name: gravatar_id
dtype: string
- name: html_url
dtype: string
- name: id
dtype: int64
- name: login
dtype: string
- name: node_id
dtype: string
- name: organizations_url
dtype: string
- name: received_events_url
dtype: string
- name: repos_url
dtype: string
- name: site_admin
dtype: bool
- name: starred_url
dtype: string
- name: subscriptions_url
dtype: string
- name: type
dtype: string
- name: url
dtype: string
- name: assignees
list:
- name: avatar_url
dtype: string
- name: events_url
dtype: string
- name: followers_url
dtype: string
- name: following_url
dtype: string
- name: gists_url
dtype: string
- name: gravatar_id
dtype: string
- name: html_url
dtype: string
- name: id
dtype: int64
- name: login
dtype: string
- name: node_id
dtype: string
- name: organizations_url
dtype: string
- name: received_events_url
dtype: string
- name: repos_url
dtype: string
- name: site_admin
dtype: bool
- name: starred_url
dtype: string
- name: subscriptions_url
dtype: string
- name: type
dtype: string
- name: url
dtype: string
- name: milestone
struct:
- name: closed_at
dtype: string
- name: closed_issues
dtype: int64
- name: created_at
dtype: string
- name: creator
struct:
- name: avatar_url
dtype: string
- name: events_url
dtype: string
- name: followers_url
dtype: string
- name: following_url
dtype: string
- name: gists_url
dtype: string
- name: gravatar_id
dtype: string
- name: html_url
dtype: string
- name: id
dtype: int64
- name: login
dtype: string
- name: node_id
dtype: string
- name: organizations_url
dtype: string
- name: received_events_url
dtype: string
- name: repos_url
dtype: string
- name: site_admin
dtype: bool
- name: starred_url
dtype: string
- name: subscriptions_url
dtype: string
- name: type
dtype: string
- name: url
dtype: string
- name: description
dtype: string
- name: due_on
dtype: string
- name: html_url
dtype: string
- name: id
dtype: int64
- name: labels_url
dtype: string
- name: node_id
dtype: string
- name: number
dtype: int64
- name: open_issues
dtype: int64
- name: state
dtype: string
- name: title
dtype: string
- name: updated_at
dtype: string
- name: url
dtype: string
- name: comments
sequence: string
- name: created_at
dtype: string
- name: updated_at
dtype: string
- name: closed_at
dtype: string
- name: author_association
dtype: string
- name: active_lock_reason
dtype: 'null'
- name: draft
dtype: bool
- name: pull_request
struct:
- name: diff_url
dtype: string
- name: html_url
dtype: string
- name: merged_at
dtype: string
- name: patch_url
dtype: string
- name: url
dtype: string
- name: body
dtype: string
- name: reactions
struct:
- name: '+1'
dtype: int64
- name: '-1'
dtype: int64
- name: confused
dtype: int64
- name: eyes
dtype: int64
- name: heart
dtype: int64
- name: hooray
dtype: int64
- name: laugh
dtype: int64
- name: rocket
dtype: int64
- name: total_count
dtype: int64
- name: url
dtype: string
- name: timeline_url
dtype: string
- name: performed_via_github_app
dtype: 'null'
- name: state_reason
dtype: string
- name: is_pull_request
dtype: bool
splits:
- name: train
num_bytes: 24117020
num_examples: 4000
download_size: 6802855
dataset_size: 24117020
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "github-issues"
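The schema above includes an `is_pull_request` flag on each record, since the GitHub API returns issues and pull requests together. A minimal offline sketch of separating the two (toy records mirroring that field; a real workflow would obtain rows via `datasets.load_dataset("fundrais123/github-issues")`):

```python
# Toy records with only the fields needed for the illustration.
records = [
    {"number": 1, "title": "Fix typo in README", "is_pull_request": True},
    {"number": 2, "title": "Crash when loading split", "is_pull_request": False},
    {"number": 3, "title": "Add CI workflow", "is_pull_request": True},
]

# Partition on the is_pull_request flag.
issues = [r for r in records if not r["is_pull_request"]]
pull_requests = [r for r in records if r["is_pull_request"]]

print(len(issues), len(pull_requests))  # 1 2
```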
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
neil-code/subset-data-en-zh | ---
license: other
---
|
ash11sh/Telangana_District_Mandal_Shape_Files | ---
license: other
license_name: open-government-data-license
license_link: https://data.gov.in/sites/default/files/Gazette_Notification_OGDL.pdf
---
Telangana District and Mandal Shape Files
Click the link below to download the Telangana shape files. The archive contains revenue shapefiles for the 33 districts and 632 mandals of Telangana.
Last Updated on: 18th October 2023.
Source: https://data.telangana.gov.in/telangana-district-and-mandal-shape-files? |
gaizerick/dianav2 | ---
license: openrail
---
|
ramosramon433/adsa | ---
license: apache-2.0
---
|
purplebear/dreambooth-hackathon-images | ---
dataset_info:
features:
- name: image
dtype: image
splits:
- name: train
num_bytes: 54613224.0
num_examples: 20
download_size: 54616715
dataset_size: 54613224.0
---
# Dataset Card for "dreambooth-hackathon-images"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
open-llm-leaderboard/details_LeroyDyer__Mixtral_AI_Cyber_3.0 | ---
pretty_name: Evaluation run of LeroyDyer/Mixtral_AI_Cyber_3.0
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [LeroyDyer/Mixtral_AI_Cyber_3.0](https://huggingface.co/LeroyDyer/Mixtral_AI_Cyber_3.0)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configuration, each one coresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_LeroyDyer__Mixtral_AI_Cyber_3.0\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-03-29T20:46:43.223151](https://huggingface.co/datasets/open-llm-leaderboard/details_LeroyDyer__Mixtral_AI_Cyber_3.0/blob/main/results_2024-03-29T20-46-43.223151.json)(note\
\ that their might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6197012396861927,\n\
\ \"acc_stderr\": 0.03276132338243605,\n \"acc_norm\": 0.6229738232191697,\n\
\ \"acc_norm_stderr\": 0.033427474229373366,\n \"mc1\": 0.40758873929008566,\n\
\ \"mc1_stderr\": 0.017201949234553107,\n \"mc2\": 0.5821381038382072,\n\
\ \"mc2_stderr\": 0.015259206479110166\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.591296928327645,\n \"acc_stderr\": 0.014365750345427,\n\
\ \"acc_norm\": 0.6245733788395904,\n \"acc_norm_stderr\": 0.01415063143511173\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6584345747858992,\n\
\ \"acc_stderr\": 0.004732654295724446,\n \"acc_norm\": 0.8401712806213901,\n\
\ \"acc_norm_stderr\": 0.003656982165386171\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.32,\n \"acc_stderr\": 0.046882617226215034,\n \
\ \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.046882617226215034\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.5481481481481482,\n\
\ \"acc_stderr\": 0.04299268905480864,\n \"acc_norm\": 0.5481481481481482,\n\
\ \"acc_norm_stderr\": 0.04299268905480864\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.6513157894736842,\n \"acc_stderr\": 0.038781398887976104,\n\
\ \"acc_norm\": 0.6513157894736842,\n \"acc_norm_stderr\": 0.038781398887976104\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.56,\n\
\ \"acc_stderr\": 0.04988876515698589,\n \"acc_norm\": 0.56,\n \
\ \"acc_norm_stderr\": 0.04988876515698589\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.7094339622641509,\n \"acc_stderr\": 0.02794321998933713,\n\
\ \"acc_norm\": 0.7094339622641509,\n \"acc_norm_stderr\": 0.02794321998933713\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7222222222222222,\n\
\ \"acc_stderr\": 0.03745554791462457,\n \"acc_norm\": 0.7222222222222222,\n\
\ \"acc_norm_stderr\": 0.03745554791462457\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.43,\n \"acc_stderr\": 0.049756985195624284,\n \
\ \"acc_norm\": 0.43,\n \"acc_norm_stderr\": 0.049756985195624284\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"\
acc\": 0.48,\n \"acc_stderr\": 0.050211673156867795,\n \"acc_norm\"\
: 0.48,\n \"acc_norm_stderr\": 0.050211673156867795\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.36,\n \"acc_stderr\": 0.048241815132442176,\n \
\ \"acc_norm\": 0.36,\n \"acc_norm_stderr\": 0.048241815132442176\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.630057803468208,\n\
\ \"acc_stderr\": 0.0368122963339432,\n \"acc_norm\": 0.630057803468208,\n\
\ \"acc_norm_stderr\": 0.0368122963339432\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.4215686274509804,\n \"acc_stderr\": 0.04913595201274498,\n\
\ \"acc_norm\": 0.4215686274509804,\n \"acc_norm_stderr\": 0.04913595201274498\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.76,\n \"acc_stderr\": 0.04292346959909284,\n \"acc_norm\": 0.76,\n\
\ \"acc_norm_stderr\": 0.04292346959909284\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.5659574468085107,\n \"acc_stderr\": 0.03240038086792747,\n\
\ \"acc_norm\": 0.5659574468085107,\n \"acc_norm_stderr\": 0.03240038086792747\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.4473684210526316,\n\
\ \"acc_stderr\": 0.04677473004491199,\n \"acc_norm\": 0.4473684210526316,\n\
\ \"acc_norm_stderr\": 0.04677473004491199\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5586206896551724,\n \"acc_stderr\": 0.04137931034482757,\n\
\ \"acc_norm\": 0.5586206896551724,\n \"acc_norm_stderr\": 0.04137931034482757\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.3915343915343915,\n \"acc_stderr\": 0.025138091388851105,\n \"\
acc_norm\": 0.3915343915343915,\n \"acc_norm_stderr\": 0.025138091388851105\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.373015873015873,\n\
\ \"acc_stderr\": 0.04325506042017086,\n \"acc_norm\": 0.373015873015873,\n\
\ \"acc_norm_stderr\": 0.04325506042017086\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.41,\n \"acc_stderr\": 0.049431107042371025,\n \
\ \"acc_norm\": 0.41,\n \"acc_norm_stderr\": 0.049431107042371025\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\"\
: 0.7387096774193549,\n \"acc_stderr\": 0.024993053397764815,\n \"\
acc_norm\": 0.7387096774193549,\n \"acc_norm_stderr\": 0.024993053397764815\n\
\ },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\"\
: 0.4630541871921182,\n \"acc_stderr\": 0.035083705204426656,\n \"\
acc_norm\": 0.4630541871921182,\n \"acc_norm_stderr\": 0.035083705204426656\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.67,\n \"acc_stderr\": 0.04725815626252609,\n \"acc_norm\"\
: 0.67,\n \"acc_norm_stderr\": 0.04725815626252609\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7454545454545455,\n \"acc_stderr\": 0.03401506715249039,\n\
\ \"acc_norm\": 0.7454545454545455,\n \"acc_norm_stderr\": 0.03401506715249039\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.797979797979798,\n \"acc_stderr\": 0.02860620428922987,\n \"acc_norm\"\
: 0.797979797979798,\n \"acc_norm_stderr\": 0.02860620428922987\n },\n\
\ \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \
\ \"acc\": 0.8756476683937824,\n \"acc_stderr\": 0.023814477086593556,\n\
\ \"acc_norm\": 0.8756476683937824,\n \"acc_norm_stderr\": 0.023814477086593556\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.6641025641025641,\n \"acc_stderr\": 0.02394672474156397,\n \
\ \"acc_norm\": 0.6641025641025641,\n \"acc_norm_stderr\": 0.02394672474156397\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.31851851851851853,\n \"acc_stderr\": 0.028406533090608452,\n \
\ \"acc_norm\": 0.31851851851851853,\n \"acc_norm_stderr\": 0.028406533090608452\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.6848739495798319,\n \"acc_stderr\": 0.03017680828897434,\n \
\ \"acc_norm\": 0.6848739495798319,\n \"acc_norm_stderr\": 0.03017680828897434\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.32450331125827814,\n \"acc_stderr\": 0.038227469376587525,\n \"\
acc_norm\": 0.32450331125827814,\n \"acc_norm_stderr\": 0.038227469376587525\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8110091743119267,\n \"acc_stderr\": 0.016785481159203627,\n \"\
acc_norm\": 0.8110091743119267,\n \"acc_norm_stderr\": 0.016785481159203627\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.5462962962962963,\n \"acc_stderr\": 0.03395322726375798,\n \"\
acc_norm\": 0.5462962962962963,\n \"acc_norm_stderr\": 0.03395322726375798\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.7450980392156863,\n \"acc_stderr\": 0.03058759135160425,\n \"\
acc_norm\": 0.7450980392156863,\n \"acc_norm_stderr\": 0.03058759135160425\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.759493670886076,\n \"acc_stderr\": 0.027820781981149685,\n \
\ \"acc_norm\": 0.759493670886076,\n \"acc_norm_stderr\": 0.027820781981149685\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6412556053811659,\n\
\ \"acc_stderr\": 0.03219079200419996,\n \"acc_norm\": 0.6412556053811659,\n\
\ \"acc_norm_stderr\": 0.03219079200419996\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.7251908396946565,\n \"acc_stderr\": 0.039153454088478354,\n\
\ \"acc_norm\": 0.7251908396946565,\n \"acc_norm_stderr\": 0.039153454088478354\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.7355371900826446,\n \"acc_stderr\": 0.040261875275912046,\n \"\
acc_norm\": 0.7355371900826446,\n \"acc_norm_stderr\": 0.040261875275912046\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7407407407407407,\n\
\ \"acc_stderr\": 0.04236511258094633,\n \"acc_norm\": 0.7407407407407407,\n\
\ \"acc_norm_stderr\": 0.04236511258094633\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7361963190184049,\n \"acc_stderr\": 0.03462419931615623,\n\
\ \"acc_norm\": 0.7361963190184049,\n \"acc_norm_stderr\": 0.03462419931615623\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.4642857142857143,\n\
\ \"acc_stderr\": 0.04733667890053756,\n \"acc_norm\": 0.4642857142857143,\n\
\ \"acc_norm_stderr\": 0.04733667890053756\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7766990291262136,\n \"acc_stderr\": 0.04123553189891431,\n\
\ \"acc_norm\": 0.7766990291262136,\n \"acc_norm_stderr\": 0.04123553189891431\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8461538461538461,\n\
\ \"acc_stderr\": 0.023636873317489288,\n \"acc_norm\": 0.8461538461538461,\n\
\ \"acc_norm_stderr\": 0.023636873317489288\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.73,\n \"acc_stderr\": 0.0446196043338474,\n \
\ \"acc_norm\": 0.73,\n \"acc_norm_stderr\": 0.0446196043338474\n },\n\
\ \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.7956577266922095,\n\
\ \"acc_stderr\": 0.014419123980931894,\n \"acc_norm\": 0.7956577266922095,\n\
\ \"acc_norm_stderr\": 0.014419123980931894\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.6820809248554913,\n \"acc_stderr\": 0.02507071371915319,\n\
\ \"acc_norm\": 0.6820809248554913,\n \"acc_norm_stderr\": 0.02507071371915319\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.3687150837988827,\n\
\ \"acc_stderr\": 0.01613575901503012,\n \"acc_norm\": 0.3687150837988827,\n\
\ \"acc_norm_stderr\": 0.01613575901503012\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.7320261437908496,\n \"acc_stderr\": 0.025360603796242557,\n\
\ \"acc_norm\": 0.7320261437908496,\n \"acc_norm_stderr\": 0.025360603796242557\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6913183279742765,\n\
\ \"acc_stderr\": 0.026236965881153262,\n \"acc_norm\": 0.6913183279742765,\n\
\ \"acc_norm_stderr\": 0.026236965881153262\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.6574074074074074,\n \"acc_stderr\": 0.02640614597362568,\n\
\ \"acc_norm\": 0.6574074074074074,\n \"acc_norm_stderr\": 0.02640614597362568\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.4326241134751773,\n \"acc_stderr\": 0.029555454236778852,\n \
\ \"acc_norm\": 0.4326241134751773,\n \"acc_norm_stderr\": 0.029555454236778852\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4445893089960887,\n\
\ \"acc_stderr\": 0.012691575792657114,\n \"acc_norm\": 0.4445893089960887,\n\
\ \"acc_norm_stderr\": 0.012691575792657114\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.6470588235294118,\n \"acc_stderr\": 0.0290294228156814,\n\
\ \"acc_norm\": 0.6470588235294118,\n \"acc_norm_stderr\": 0.0290294228156814\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6045751633986928,\n \"acc_stderr\": 0.019780465954777508,\n \
\ \"acc_norm\": 0.6045751633986928,\n \"acc_norm_stderr\": 0.019780465954777508\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6363636363636364,\n\
\ \"acc_stderr\": 0.04607582090719976,\n \"acc_norm\": 0.6363636363636364,\n\
\ \"acc_norm_stderr\": 0.04607582090719976\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.6857142857142857,\n \"acc_stderr\": 0.029719329422417475,\n\
\ \"acc_norm\": 0.6857142857142857,\n \"acc_norm_stderr\": 0.029719329422417475\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8258706467661692,\n\
\ \"acc_stderr\": 0.026814951200421603,\n \"acc_norm\": 0.8258706467661692,\n\
\ \"acc_norm_stderr\": 0.026814951200421603\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.9,\n \"acc_stderr\": 0.030151134457776334,\n \
\ \"acc_norm\": 0.9,\n \"acc_norm_stderr\": 0.030151134457776334\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.4879518072289157,\n\
\ \"acc_stderr\": 0.03891364495835821,\n \"acc_norm\": 0.4879518072289157,\n\
\ \"acc_norm_stderr\": 0.03891364495835821\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8187134502923976,\n \"acc_stderr\": 0.029547741687640038,\n\
\ \"acc_norm\": 0.8187134502923976,\n \"acc_norm_stderr\": 0.029547741687640038\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.40758873929008566,\n\
\ \"mc1_stderr\": 0.017201949234553107,\n \"mc2\": 0.5821381038382072,\n\
\ \"mc2_stderr\": 0.015259206479110166\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.8018942383583267,\n \"acc_stderr\": 0.011201862744487048\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.45943896891584535,\n \
\ \"acc_stderr\": 0.013727093010429785\n }\n}\n```"
repo_url: https://huggingface.co/LeroyDyer/Mixtral_AI_Cyber_3.0
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_03_29T20_46_43.223151
path:
- '**/details_harness|arc:challenge|25_2024-03-29T20-46-43.223151.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-03-29T20-46-43.223151.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_03_29T20_46_43.223151
path:
- '**/details_harness|gsm8k|5_2024-03-29T20-46-43.223151.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-03-29T20-46-43.223151.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_03_29T20_46_43.223151
path:
- '**/details_harness|hellaswag|10_2024-03-29T20-46-43.223151.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-03-29T20-46-43.223151.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_03_29T20_46_43.223151
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-29T20-46-43.223151.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-29T20-46-43.223151.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-29T20-46-43.223151.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-29T20-46-43.223151.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-29T20-46-43.223151.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-29T20-46-43.223151.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-29T20-46-43.223151.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-29T20-46-43.223151.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-29T20-46-43.223151.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-29T20-46-43.223151.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-29T20-46-43.223151.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-29T20-46-43.223151.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-29T20-46-43.223151.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-29T20-46-43.223151.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-29T20-46-43.223151.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-29T20-46-43.223151.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-29T20-46-43.223151.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-29T20-46-43.223151.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-29T20-46-43.223151.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-29T20-46-43.223151.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-29T20-46-43.223151.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-29T20-46-43.223151.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-29T20-46-43.223151.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-29T20-46-43.223151.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-29T20-46-43.223151.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-29T20-46-43.223151.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-29T20-46-43.223151.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-29T20-46-43.223151.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-29T20-46-43.223151.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-29T20-46-43.223151.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-29T20-46-43.223151.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-29T20-46-43.223151.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-29T20-46-43.223151.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-29T20-46-43.223151.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-29T20-46-43.223151.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-29T20-46-43.223151.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-29T20-46-43.223151.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-29T20-46-43.223151.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-29T20-46-43.223151.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-29T20-46-43.223151.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-29T20-46-43.223151.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-29T20-46-43.223151.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-29T20-46-43.223151.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-29T20-46-43.223151.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-29T20-46-43.223151.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-29T20-46-43.223151.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-29T20-46-43.223151.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-29T20-46-43.223151.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-29T20-46-43.223151.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-29T20-46-43.223151.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-29T20-46-43.223151.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-29T20-46-43.223151.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-29T20-46-43.223151.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-29T20-46-43.223151.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-29T20-46-43.223151.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-29T20-46-43.223151.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-29T20-46-43.223151.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-29T20-46-43.223151.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-29T20-46-43.223151.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-29T20-46-43.223151.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-29T20-46-43.223151.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-29T20-46-43.223151.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-29T20-46-43.223151.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-29T20-46-43.223151.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-29T20-46-43.223151.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-29T20-46-43.223151.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-29T20-46-43.223151.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-29T20-46-43.223151.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-29T20-46-43.223151.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-29T20-46-43.223151.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-29T20-46-43.223151.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-29T20-46-43.223151.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-29T20-46-43.223151.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-29T20-46-43.223151.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-29T20-46-43.223151.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-29T20-46-43.223151.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-29T20-46-43.223151.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-29T20-46-43.223151.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-29T20-46-43.223151.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-29T20-46-43.223151.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-29T20-46-43.223151.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-29T20-46-43.223151.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-29T20-46-43.223151.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-29T20-46-43.223151.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-29T20-46-43.223151.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-29T20-46-43.223151.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-29T20-46-43.223151.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-29T20-46-43.223151.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-29T20-46-43.223151.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-29T20-46-43.223151.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-29T20-46-43.223151.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-29T20-46-43.223151.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-29T20-46-43.223151.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-29T20-46-43.223151.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-29T20-46-43.223151.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-29T20-46-43.223151.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-29T20-46-43.223151.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-29T20-46-43.223151.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-29T20-46-43.223151.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-29T20-46-43.223151.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-29T20-46-43.223151.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-29T20-46-43.223151.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-29T20-46-43.223151.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-29T20-46-43.223151.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-29T20-46-43.223151.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-29T20-46-43.223151.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-29T20-46-43.223151.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-29T20-46-43.223151.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-29T20-46-43.223151.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-29T20-46-43.223151.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-29T20-46-43.223151.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-29T20-46-43.223151.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-29T20-46-43.223151.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-29T20-46-43.223151.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_03_29T20_46_43.223151
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-29T20-46-43.223151.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-29T20-46-43.223151.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_03_29T20_46_43.223151
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-29T20-46-43.223151.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-29T20-46-43.223151.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_03_29T20_46_43.223151
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-29T20-46-43.223151.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-29T20-46-43.223151.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_03_29T20_46_43.223151
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-29T20-46-43.223151.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-29T20-46-43.223151.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_03_29T20_46_43.223151
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-29T20-46-43.223151.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-29T20-46-43.223151.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_03_29T20_46_43.223151
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-29T20-46-43.223151.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-29T20-46-43.223151.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_03_29T20_46_43.223151
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-29T20-46-43.223151.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-29T20-46-43.223151.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_03_29T20_46_43.223151
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-29T20-46-43.223151.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-29T20-46-43.223151.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_03_29T20_46_43.223151
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-29T20-46-43.223151.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-29T20-46-43.223151.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_03_29T20_46_43.223151
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-29T20-46-43.223151.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-29T20-46-43.223151.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_03_29T20_46_43.223151
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-29T20-46-43.223151.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-29T20-46-43.223151.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_03_29T20_46_43.223151
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-29T20-46-43.223151.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-29T20-46-43.223151.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_03_29T20_46_43.223151
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-29T20-46-43.223151.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-29T20-46-43.223151.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_03_29T20_46_43.223151
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-29T20-46-43.223151.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-29T20-46-43.223151.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_03_29T20_46_43.223151
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-29T20-46-43.223151.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-29T20-46-43.223151.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_03_29T20_46_43.223151
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-29T20-46-43.223151.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-29T20-46-43.223151.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_03_29T20_46_43.223151
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-29T20-46-43.223151.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-29T20-46-43.223151.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_03_29T20_46_43.223151
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-29T20-46-43.223151.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-29T20-46-43.223151.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_03_29T20_46_43.223151
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-29T20-46-43.223151.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-29T20-46-43.223151.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_03_29T20_46_43.223151
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-29T20-46-43.223151.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-29T20-46-43.223151.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_03_29T20_46_43.223151
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-29T20-46-43.223151.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-29T20-46-43.223151.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_03_29T20_46_43.223151
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-29T20-46-43.223151.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-29T20-46-43.223151.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_03_29T20_46_43.223151
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-29T20-46-43.223151.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-29T20-46-43.223151.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_03_29T20_46_43.223151
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-29T20-46-43.223151.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-29T20-46-43.223151.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_03_29T20_46_43.223151
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-29T20-46-43.223151.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-29T20-46-43.223151.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_03_29T20_46_43.223151
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-29T20-46-43.223151.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-29T20-46-43.223151.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_03_29T20_46_43.223151
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-29T20-46-43.223151.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-29T20-46-43.223151.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_03_29T20_46_43.223151
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-29T20-46-43.223151.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-29T20-46-43.223151.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_03_29T20_46_43.223151
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-29T20-46-43.223151.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-29T20-46-43.223151.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_03_29T20_46_43.223151
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-29T20-46-43.223151.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-29T20-46-43.223151.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_03_29T20_46_43.223151
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-29T20-46-43.223151.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-29T20-46-43.223151.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_03_29T20_46_43.223151
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-29T20-46-43.223151.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-29T20-46-43.223151.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_03_29T20_46_43.223151
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-29T20-46-43.223151.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-29T20-46-43.223151.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_03_29T20_46_43.223151
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-29T20-46-43.223151.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-29T20-46-43.223151.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_03_29T20_46_43.223151
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-29T20-46-43.223151.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-29T20-46-43.223151.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_03_29T20_46_43.223151
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-29T20-46-43.223151.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-29T20-46-43.223151.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_03_29T20_46_43.223151
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-29T20-46-43.223151.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-29T20-46-43.223151.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_03_29T20_46_43.223151
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-29T20-46-43.223151.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-29T20-46-43.223151.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_03_29T20_46_43.223151
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-29T20-46-43.223151.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-29T20-46-43.223151.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_03_29T20_46_43.223151
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-29T20-46-43.223151.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-29T20-46-43.223151.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_03_29T20_46_43.223151
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-29T20-46-43.223151.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-29T20-46-43.223151.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_03_29T20_46_43.223151
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-29T20-46-43.223151.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-29T20-46-43.223151.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_03_29T20_46_43.223151
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-29T20-46-43.223151.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-29T20-46-43.223151.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_03_29T20_46_43.223151
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-29T20-46-43.223151.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-29T20-46-43.223151.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_03_29T20_46_43.223151
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-29T20-46-43.223151.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-29T20-46-43.223151.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_03_29T20_46_43.223151
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-29T20-46-43.223151.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-29T20-46-43.223151.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_03_29T20_46_43.223151
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-29T20-46-43.223151.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-29T20-46-43.223151.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_03_29T20_46_43.223151
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-29T20-46-43.223151.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-29T20-46-43.223151.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_03_29T20_46_43.223151
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-29T20-46-43.223151.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-29T20-46-43.223151.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_03_29T20_46_43.223151
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-29T20-46-43.223151.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-29T20-46-43.223151.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_03_29T20_46_43.223151
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-29T20-46-43.223151.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-29T20-46-43.223151.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_03_29T20_46_43.223151
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-29T20-46-43.223151.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-29T20-46-43.223151.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_03_29T20_46_43.223151
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-29T20-46-43.223151.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-29T20-46-43.223151.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_03_29T20_46_43.223151
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-29T20-46-43.223151.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-29T20-46-43.223151.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_03_29T20_46_43.223151
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-29T20-46-43.223151.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-29T20-46-43.223151.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_03_29T20_46_43.223151
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-29T20-46-43.223151.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-29T20-46-43.223151.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_03_29T20_46_43.223151
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-29T20-46-43.223151.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-29T20-46-43.223151.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_03_29T20_46_43.223151
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-29T20-46-43.223151.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-29T20-46-43.223151.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_03_29T20_46_43.223151
path:
- '**/details_harness|winogrande|5_2024-03-29T20-46-43.223151.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-03-29T20-46-43.223151.parquet'
- config_name: results
data_files:
- split: 2024_03_29T20_46_43.223151
path:
- results_2024-03-29T20-46-43.223151.parquet
- split: latest
path:
- results_2024-03-29T20-46-43.223151.parquet
---
# Dataset Card for Evaluation run of LeroyDyer/Mixtral_AI_Cyber_3.0
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [LeroyDyer/Mixtral_AI_Cyber_3.0](https://huggingface.co/LeroyDyer/Mixtral_AI_Cyber_3.0) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_LeroyDyer__Mixtral_AI_Cyber_3.0",
	"harness_winogrande_5",
	split="latest")
```
## Latest results
These are the [latest results from run 2024-03-29T20:46:43.223151](https://huggingface.co/datasets/open-llm-leaderboard/details_LeroyDyer__Mixtral_AI_Cyber_3.0/blob/main/results_2024-03-29T20-46-43.223151.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each of them in the "results" config and in the "latest" split of each eval):
```python
{
"all": {
"acc": 0.6197012396861927,
"acc_stderr": 0.03276132338243605,
"acc_norm": 0.6229738232191697,
"acc_norm_stderr": 0.033427474229373366,
"mc1": 0.40758873929008566,
"mc1_stderr": 0.017201949234553107,
"mc2": 0.5821381038382072,
"mc2_stderr": 0.015259206479110166
},
"harness|arc:challenge|25": {
"acc": 0.591296928327645,
"acc_stderr": 0.014365750345427,
"acc_norm": 0.6245733788395904,
"acc_norm_stderr": 0.01415063143511173
},
"harness|hellaswag|10": {
"acc": 0.6584345747858992,
"acc_stderr": 0.004732654295724446,
"acc_norm": 0.8401712806213901,
"acc_norm_stderr": 0.003656982165386171
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.32,
"acc_stderr": 0.046882617226215034,
"acc_norm": 0.32,
"acc_norm_stderr": 0.046882617226215034
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.5481481481481482,
"acc_stderr": 0.04299268905480864,
"acc_norm": 0.5481481481481482,
"acc_norm_stderr": 0.04299268905480864
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6513157894736842,
"acc_stderr": 0.038781398887976104,
"acc_norm": 0.6513157894736842,
"acc_norm_stderr": 0.038781398887976104
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.56,
"acc_stderr": 0.04988876515698589,
"acc_norm": 0.56,
"acc_norm_stderr": 0.04988876515698589
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.7094339622641509,
"acc_stderr": 0.02794321998933713,
"acc_norm": 0.7094339622641509,
"acc_norm_stderr": 0.02794321998933713
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7222222222222222,
"acc_stderr": 0.03745554791462457,
"acc_norm": 0.7222222222222222,
"acc_norm_stderr": 0.03745554791462457
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.43,
"acc_stderr": 0.049756985195624284,
"acc_norm": 0.43,
"acc_norm_stderr": 0.049756985195624284
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.48,
"acc_stderr": 0.050211673156867795,
"acc_norm": 0.48,
"acc_norm_stderr": 0.050211673156867795
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.36,
"acc_stderr": 0.048241815132442176,
"acc_norm": 0.36,
"acc_norm_stderr": 0.048241815132442176
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.630057803468208,
"acc_stderr": 0.0368122963339432,
"acc_norm": 0.630057803468208,
"acc_norm_stderr": 0.0368122963339432
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.4215686274509804,
"acc_stderr": 0.04913595201274498,
"acc_norm": 0.4215686274509804,
"acc_norm_stderr": 0.04913595201274498
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.76,
"acc_stderr": 0.04292346959909284,
"acc_norm": 0.76,
"acc_norm_stderr": 0.04292346959909284
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5659574468085107,
"acc_stderr": 0.03240038086792747,
"acc_norm": 0.5659574468085107,
"acc_norm_stderr": 0.03240038086792747
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.4473684210526316,
"acc_stderr": 0.04677473004491199,
"acc_norm": 0.4473684210526316,
"acc_norm_stderr": 0.04677473004491199
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5586206896551724,
"acc_stderr": 0.04137931034482757,
"acc_norm": 0.5586206896551724,
"acc_norm_stderr": 0.04137931034482757
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.3915343915343915,
"acc_stderr": 0.025138091388851105,
"acc_norm": 0.3915343915343915,
"acc_norm_stderr": 0.025138091388851105
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.373015873015873,
"acc_stderr": 0.04325506042017086,
"acc_norm": 0.373015873015873,
"acc_norm_stderr": 0.04325506042017086
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.41,
"acc_stderr": 0.049431107042371025,
"acc_norm": 0.41,
"acc_norm_stderr": 0.049431107042371025
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7387096774193549,
"acc_stderr": 0.024993053397764815,
"acc_norm": 0.7387096774193549,
"acc_norm_stderr": 0.024993053397764815
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.4630541871921182,
"acc_stderr": 0.035083705204426656,
"acc_norm": 0.4630541871921182,
"acc_norm_stderr": 0.035083705204426656
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.67,
"acc_stderr": 0.04725815626252609,
"acc_norm": 0.67,
"acc_norm_stderr": 0.04725815626252609
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7454545454545455,
"acc_stderr": 0.03401506715249039,
"acc_norm": 0.7454545454545455,
"acc_norm_stderr": 0.03401506715249039
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.797979797979798,
"acc_stderr": 0.02860620428922987,
"acc_norm": 0.797979797979798,
"acc_norm_stderr": 0.02860620428922987
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8756476683937824,
"acc_stderr": 0.023814477086593556,
"acc_norm": 0.8756476683937824,
"acc_norm_stderr": 0.023814477086593556
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6641025641025641,
"acc_stderr": 0.02394672474156397,
"acc_norm": 0.6641025641025641,
"acc_norm_stderr": 0.02394672474156397
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.31851851851851853,
"acc_stderr": 0.028406533090608452,
"acc_norm": 0.31851851851851853,
"acc_norm_stderr": 0.028406533090608452
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6848739495798319,
"acc_stderr": 0.03017680828897434,
"acc_norm": 0.6848739495798319,
"acc_norm_stderr": 0.03017680828897434
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.32450331125827814,
"acc_stderr": 0.038227469376587525,
"acc_norm": 0.32450331125827814,
"acc_norm_stderr": 0.038227469376587525
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8110091743119267,
"acc_stderr": 0.016785481159203627,
"acc_norm": 0.8110091743119267,
"acc_norm_stderr": 0.016785481159203627
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5462962962962963,
"acc_stderr": 0.03395322726375798,
"acc_norm": 0.5462962962962963,
"acc_norm_stderr": 0.03395322726375798
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7450980392156863,
"acc_stderr": 0.03058759135160425,
"acc_norm": 0.7450980392156863,
"acc_norm_stderr": 0.03058759135160425
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.759493670886076,
"acc_stderr": 0.027820781981149685,
"acc_norm": 0.759493670886076,
"acc_norm_stderr": 0.027820781981149685
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6412556053811659,
"acc_stderr": 0.03219079200419996,
"acc_norm": 0.6412556053811659,
"acc_norm_stderr": 0.03219079200419996
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7251908396946565,
"acc_stderr": 0.039153454088478354,
"acc_norm": 0.7251908396946565,
"acc_norm_stderr": 0.039153454088478354
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7355371900826446,
"acc_stderr": 0.040261875275912046,
"acc_norm": 0.7355371900826446,
"acc_norm_stderr": 0.040261875275912046
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7407407407407407,
"acc_stderr": 0.04236511258094633,
"acc_norm": 0.7407407407407407,
"acc_norm_stderr": 0.04236511258094633
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7361963190184049,
"acc_stderr": 0.03462419931615623,
"acc_norm": 0.7361963190184049,
"acc_norm_stderr": 0.03462419931615623
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.4642857142857143,
"acc_stderr": 0.04733667890053756,
"acc_norm": 0.4642857142857143,
"acc_norm_stderr": 0.04733667890053756
},
"harness|hendrycksTest-management|5": {
"acc": 0.7766990291262136,
"acc_stderr": 0.04123553189891431,
"acc_norm": 0.7766990291262136,
"acc_norm_stderr": 0.04123553189891431
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8461538461538461,
"acc_stderr": 0.023636873317489288,
"acc_norm": 0.8461538461538461,
"acc_norm_stderr": 0.023636873317489288
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.73,
"acc_stderr": 0.0446196043338474,
"acc_norm": 0.73,
"acc_norm_stderr": 0.0446196043338474
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.7956577266922095,
"acc_stderr": 0.014419123980931894,
"acc_norm": 0.7956577266922095,
"acc_norm_stderr": 0.014419123980931894
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.6820809248554913,
"acc_stderr": 0.02507071371915319,
"acc_norm": 0.6820809248554913,
"acc_norm_stderr": 0.02507071371915319
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.3687150837988827,
"acc_stderr": 0.01613575901503012,
"acc_norm": 0.3687150837988827,
"acc_norm_stderr": 0.01613575901503012
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7320261437908496,
"acc_stderr": 0.025360603796242557,
"acc_norm": 0.7320261437908496,
"acc_norm_stderr": 0.025360603796242557
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6913183279742765,
"acc_stderr": 0.026236965881153262,
"acc_norm": 0.6913183279742765,
"acc_norm_stderr": 0.026236965881153262
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.6574074074074074,
"acc_stderr": 0.02640614597362568,
"acc_norm": 0.6574074074074074,
"acc_norm_stderr": 0.02640614597362568
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.4326241134751773,
"acc_stderr": 0.029555454236778852,
"acc_norm": 0.4326241134751773,
"acc_norm_stderr": 0.029555454236778852
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4445893089960887,
"acc_stderr": 0.012691575792657114,
"acc_norm": 0.4445893089960887,
"acc_norm_stderr": 0.012691575792657114
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6470588235294118,
"acc_stderr": 0.0290294228156814,
"acc_norm": 0.6470588235294118,
"acc_norm_stderr": 0.0290294228156814
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6045751633986928,
"acc_stderr": 0.019780465954777508,
"acc_norm": 0.6045751633986928,
"acc_norm_stderr": 0.019780465954777508
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6363636363636364,
"acc_stderr": 0.04607582090719976,
"acc_norm": 0.6363636363636364,
"acc_norm_stderr": 0.04607582090719976
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.6857142857142857,
"acc_stderr": 0.029719329422417475,
"acc_norm": 0.6857142857142857,
"acc_norm_stderr": 0.029719329422417475
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8258706467661692,
"acc_stderr": 0.026814951200421603,
"acc_norm": 0.8258706467661692,
"acc_norm_stderr": 0.026814951200421603
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.9,
"acc_stderr": 0.030151134457776334,
"acc_norm": 0.9,
"acc_norm_stderr": 0.030151134457776334
},
"harness|hendrycksTest-virology|5": {
"acc": 0.4879518072289157,
"acc_stderr": 0.03891364495835821,
"acc_norm": 0.4879518072289157,
"acc_norm_stderr": 0.03891364495835821
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8187134502923976,
"acc_stderr": 0.029547741687640038,
"acc_norm": 0.8187134502923976,
"acc_norm_stderr": 0.029547741687640038
},
"harness|truthfulqa:mc|0": {
"mc1": 0.40758873929008566,
"mc1_stderr": 0.017201949234553107,
"mc2": 0.5821381038382072,
"mc2_stderr": 0.015259206479110166
},
"harness|winogrande|5": {
"acc": 0.8018942383583267,
"acc_stderr": 0.011201862744487048
},
"harness|gsm8k|5": {
"acc": 0.45943896891584535,
"acc_stderr": 0.013727093010429785
}
}
```
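The leaderboard's headline MMLU figure is an average over the `hendrycksTest-*` subtask scores listed above. As a rough illustration only (not the leaderboard's exact aggregation code), the averaging can be sketched on a small excerpt of the results:

```python
# Sketch: averaging per-subtask scores into one MMLU-style figure.
# Uses a three-subtask excerpt of the JSON above, not all 57 subtasks,
# so the printed value will not match the leaderboard's MMLU score.
results = {
    "harness|hendrycksTest-abstract_algebra|5": {"acc_norm": 0.32},
    "harness|hendrycksTest-anatomy|5": {"acc_norm": 0.5481481481481482},
    "harness|hendrycksTest-astronomy|5": {"acc_norm": 0.6513157894736842},
}

# Keep only the MMLU (hendrycksTest) tasks and average their acc_norm.
mmlu_scores = [
    v["acc_norm"]
    for k, v in results.items()
    if k.startswith("harness|hendrycksTest-")
]
mmlu_avg = sum(mmlu_scores) / len(mmlu_scores)
print(f"{mmlu_avg:.4f}")
```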
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed]
---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of noshiro/能代 (Kantai Collection)
This is the dataset of noshiro/能代 (Kantai Collection), containing 500 images and their tags.
The core tags of this character are `brown_hair, braid, long_hair, twin_braids, green_eyes, breasts, bangs, large_breasts, swept_bangs`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan, ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:--------------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 500 | 544.58 MiB | [Download](https://huggingface.co/datasets/CyberHarem/noshiro_kantaicollection/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 500 | 330.11 MiB | [Download](https://huggingface.co/datasets/CyberHarem/noshiro_kantaicollection/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 1201 | 707.42 MiB | [Download](https://huggingface.co/datasets/CyberHarem/noshiro_kantaicollection/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 500 | 494.79 MiB | [Download](https://huggingface.co/datasets/CyberHarem/noshiro_kantaicollection/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 1201 | 971.09 MiB | [Download](https://huggingface.co/datasets/CyberHarem/noshiro_kantaicollection/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need it, just run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/noshiro_kantaicollection',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
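For the IMG+TXT packages, waifuc is not strictly required: based on the package description above, each image is assumed to ship with a same-named `.txt` file of comma-separated tags. A minimal sketch for walking such an extracted directory:

```python
import os

# Extensions we treat as images; adjust if the package contains others.
IMAGE_EXTS = {'.png', '.jpg', '.jpeg', '.webp'}

def iter_tag_pairs(dataset_dir):
    """Yield (image_path, tags) pairs from an extracted IMG+TXT package,
    assuming each image has a same-named .txt file of comma-separated tags."""
    for name in sorted(os.listdir(dataset_dir)):
        stem, ext = os.path.splitext(name)
        if ext.lower() not in IMAGE_EXTS:
            continue
        txt_path = os.path.join(dataset_dir, stem + '.txt')
        if not os.path.exists(txt_path):
            continue  # image without a tag file; skip it
        with open(txt_path, encoding='utf-8') as f:
            tags = [t.strip() for t in f.read().split(',') if t.strip()]
        yield os.path.join(dataset_dir, name), tags
```

For example, `for path, tags in iter_tag_pairs('dataset_dir'): ...` iterates images alongside their tag lists after extracting one of the IMG+TXT zips into `dataset_dir`.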
## List of Clusters
List of tag clustering results; some outfits may be mined from here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:----------------------------------|:----------------------------------|:----------------------------------|:----------------------------------|:----------------------------------|:----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 16 |  |  |  |  |  | 1girl, black_bikini, solo, white_shirt, cleavage, tied_shirt, bikini_under_clothes, looking_at_viewer, one-hour_drawing_challenge, cowboy_shot, upper_body, simple_background, official_alternate_costume, twitter_username, white_background, wrist_scrunchie, red_skirt, dated, midriff, navel, red_scrunchie |
| 1 | 11 |  |  |  |  |  | 1girl, black_bikini, cleavage, day, white_shirt, blue_sky, cloud, looking_at_viewer, navel, outdoors, solo, tied_shirt, collarbone, blush, cowboy_shot, hair_between_eyes, red_shorts, beach, bikini_under_clothes, collared_shirt, wrist_scrunchie, ocean, smile, open_mouth, red_scrunchie |
| 2 | 6 |  |  |  |  |  | 1girl, black_bikini, cleavage, collarbone, looking_at_viewer, smile, solo, tied_shirt, white_shirt, beachball, blush, red_scrunchie, wrist_scrunchie, gradient_background, navel, upper_body, open_mouth, twitter_username |
| 3 | 31 |  |  |  |  |  | 1girl, serafuku, necktie, pleated_skirt, red_skirt, solo, white_gloves, black_sailor_collar, midriff, sleeveless_shirt, anchor_symbol, looking_at_viewer, cleavage, simple_background, navel, single_thighhigh, garter_straps, white_background, cowboy_shot, uneven_legwear |
| 4 | 19 |  |  |  |  |  | 1girl, serafuku, solo, looking_at_viewer, white_gloves, cleavage, garter_straps, single_thighhigh, pleated_skirt, blush, midriff, navel, open_mouth, necktie |
| 5 | 11 |  |  |  |  |  | 1girl, black_sailor_collar, black_skirt, dress_shirt, long_sleeves, pleated_skirt, sailor_shirt, serafuku, solo, black_shirt, garter_straps, looking_at_viewer, cowboy_shot, simple_background, white_background, belt, black_thighhighs |
| 6 | 6 |  |  |  |  |  | 1girl, black_sailor_collar, black_shirt, black_skirt, dress_shirt, hair_between_eyes, long_sleeves, pleated_skirt, solo, black_belt, looking_at_viewer, sailor_shirt, serafuku, simple_background, cowboy_shot, smile, white_background, blush, garter_straps |
| 7 | 5 |  |  |  |  |  | 1girl, black_sailor_collar, long_sleeves, sailor_shirt, serafuku, solo, upper_body, black_shirt, dress_shirt, looking_at_viewer, blue_sailor_collar, simple_background, white_background |
| 8 | 5 |  |  |  |  |  | 1girl, alternate_costume, black_sweater, long_sleeves, looking_at_viewer, ribbed_sweater, solo, black_pantyhose, blush, pleated_skirt, simple_background, brown_skirt, cowboy_shot, hair_between_eyes, turtleneck_sweater, blue_sweater, open_mouth, smile, white_background |
| 9 | 19 |  |  |  |  |  | 1girl, 1boy, blush, hetero, solo_focus, white_gloves, penis, nipples, open_mouth, paizuri, looking_at_viewer, cum_on_breasts, school_uniform, bar_censor, pubic_hair |
| 10 | 5 |  |  |  |  |  | 1boy, 1girl, cowgirl_position, girl_on_top, hetero, navel, nipples, sex, solo_focus, vaginal, cum_in_pussy, open_mouth, blush, completely_nude, female_pubic_hair, looking_at_viewer, bouncing_breasts, censored, penis, sweat, white_gloves |
| 11 | 24 |  |  |  |  |  | playboy_bunny, rabbit_ears, 1girl, solo, detached_collar, strapless_leotard, fake_animal_ears, wrist_cuffs, looking_at_viewer, cleavage, rabbit_tail, black_pantyhose, cowboy_shot, simple_background, white_background, alternate_costume, necktie, bowtie, red_leotard, black_leotard |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | black_bikini | solo | white_shirt | cleavage | tied_shirt | bikini_under_clothes | looking_at_viewer | one-hour_drawing_challenge | cowboy_shot | upper_body | simple_background | official_alternate_costume | twitter_username | white_background | wrist_scrunchie | red_skirt | dated | midriff | navel | red_scrunchie | day | blue_sky | cloud | outdoors | collarbone | blush | hair_between_eyes | red_shorts | beach | collared_shirt | ocean | smile | open_mouth | beachball | gradient_background | serafuku | necktie | pleated_skirt | white_gloves | black_sailor_collar | sleeveless_shirt | anchor_symbol | single_thighhigh | garter_straps | uneven_legwear | black_skirt | dress_shirt | long_sleeves | sailor_shirt | black_shirt | belt | black_thighhighs | black_belt | blue_sailor_collar | alternate_costume | black_sweater | ribbed_sweater | black_pantyhose | brown_skirt | turtleneck_sweater | blue_sweater | 1boy | hetero | solo_focus | penis | nipples | paizuri | cum_on_breasts | school_uniform | bar_censor | pubic_hair | cowgirl_position | girl_on_top | sex | vaginal | cum_in_pussy | completely_nude | female_pubic_hair | bouncing_breasts | censored | sweat | playboy_bunny | rabbit_ears | detached_collar | strapless_leotard | fake_animal_ears | wrist_cuffs | rabbit_tail | bowtie | red_leotard | black_leotard |
(The clusters table from the original card pairs each cluster with preview thumbnails and a per-cluster tag matrix; those image columns and tag headers did not survive extraction, so only the recoverable cluster ids and image counts are kept below.)

| # | Images |
|----:|---------:|
| 0 | 16 |
| 1 | 11 |
| 2 | 6 |
| 3 | 31 |
| 4 | 19 |
| 5 | 11 |
| 6 | 6 |
| 7 | 5 |
| 8 | 5 |
| 9 | 19 |
| 10 | 5 |
| 11 | 24 |
|
maximegmd/medmcqa_alpaca_format | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
- split: validation
path: data/validation-*
dataset_info:
features:
- name: question
dtype: string
- name: choices
sequence: string
- name: solution
dtype: string
splits:
- name: train
num_bytes: 120644997
num_examples: 182822
- name: test
num_bytes: 1077057
num_examples: 6150
- name: validation
num_bytes: 2009220
num_examples: 4183
download_size: 79503290
dataset_size: 123731274
---
# Dataset Card for "medmcqa_alpaca_format"
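Going by the `question`/`choices`/`solution` fields declared in the YAML metadata above, a record can be flattened into an Alpaca-style instruction pair. The template and option lettering below are assumptions for illustration, not part of the dataset:

```python
# Hypothetical helper: render one record into an Alpaca-style pair.
# The letter labels and field layout are assumptions, not the card's spec.
LETTERS = "ABCD"

def to_alpaca(record):
    options = "\n".join(f"{LETTERS[i]}. {c}" for i, c in enumerate(record["choices"]))
    instruction = f"{record['question']}\n{options}"
    return {"instruction": instruction, "output": record["solution"]}

# Toy record following the declared schema (not an actual dataset row).
example = {
    "question": "Which vitamin deficiency causes scurvy?",
    "choices": ["Vitamin A", "Vitamin B12", "Vitamin C", "Vitamin D"],
    "solution": "Vitamin C",
}
pair = to_alpaca(example)
```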
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
GEM/common_gen | ---
annotations_creators:
- none
language_creators:
- unknown
language:
- en
license:
- mit
multilinguality:
- unknown
size_categories:
- unknown
source_datasets:
- original
task_categories:
- other
task_ids: []
pretty_name: common_gen
tags:
- reasoning
---
# Dataset Card for GEM/common_gen
## Dataset Description
- **Homepage:** https://inklab.usc.edu/CommonGen/
- **Repository:** https://github.com/INK-USC/CommonGen
- **Paper:** https://aclanthology.org/2020.findings-emnlp.165
- **Leaderboard:** https://inklab.usc.edu/CommonGen/leaderboard.html
- **Point of Contact:** Bill Yuchen Lin
### Link to Main Data Card
You can find the main data card on the [GEM Website](https://gem-benchmark.com/data_cards/common_gen).
### Dataset Summary
CommonGen is an English text generation task to explicitly test machines for the ability of generative commonsense reasoning. Given a set of common concepts, the task is to generate a coherent sentence describing an everyday scenario using these concepts. CommonGen is challenging because it inherently requires 1) relational reasoning using background commonsense knowledge, and 2) compositional generalization ability to work on unseen concept combinations. The dataset, constructed through a combination of crowd-sourcing from AMT and existing caption corpora, consists of 30k concept-sets and 50k sentences in total. Note that the CommonGen test set is private and requires submission to the external leaderboard.
You can load the dataset via:
```
import datasets
data = datasets.load_dataset('GEM/common_gen')
```
The data loader can be found [here](https://huggingface.co/datasets/GEM/common_gen).
#### website
[link](https://inklab.usc.edu/CommonGen/)
#### paper
[Link](https://aclanthology.org/2020.findings-emnlp.165)
#### authors
Bill Yuchen Lin (USC), Wangchunshu Zhou (USC), Ming Shen (USC), Pei Zhou (USC), Chandra Bhagavatula (AllenAI), Yejin Choi (AllenAI + UW), Xiang Ren (USC)
## Dataset Overview
### Where to find the Data and its Documentation
#### Webpage
<!-- info: What is the webpage for the dataset (if it exists)? -->
<!-- scope: telescope -->
[link](https://inklab.usc.edu/CommonGen/)
#### Download
<!-- info: What is the link to where the original dataset is hosted? -->
<!-- scope: telescope -->
[Link](https://github.com/INK-USC/CommonGen)
#### Paper
<!-- info: What is the link to the paper describing the dataset (open access preferred)? -->
<!-- scope: telescope -->
[Link](https://aclanthology.org/2020.findings-emnlp.165)
#### BibTex
<!-- info: Provide the BibTex-formatted reference for the dataset. Please use the correct published version (ACL anthology, etc.) instead of google scholar created Bibtex. -->
<!-- scope: microscope -->
```
@inproceedings{lin-etal-2020-commongen,
title = "{C}ommon{G}en: A Constrained Text Generation Challenge for Generative Commonsense Reasoning",
author = "Lin, Bill Yuchen and
Zhou, Wangchunshu and
Shen, Ming and
Zhou, Pei and
Bhagavatula, Chandra and
Choi, Yejin and
Ren, Xiang",
booktitle = "Findings of the Association for Computational Linguistics: EMNLP 2020",
month = nov,
year = "2020",
address = "Online",
publisher = "Association for Computational Linguistics",
url = "https://www.aclweb.org/anthology/2020.findings-emnlp.165",
pages = "1823--1840",
}
```
#### Contact Name
<!-- quick -->
<!-- info: If known, provide the name of at least one person the reader can contact for questions about the dataset. -->
<!-- scope: periscope -->
Bill Yuchen Lin
#### Contact Email
<!-- info: If known, provide the email of at least one person the reader can contact for questions about the dataset. -->
<!-- scope: periscope -->
yuchen.lin@usc.edu
#### Has a Leaderboard?
<!-- info: Does the dataset have an active leaderboard? -->
<!-- scope: telescope -->
yes
#### Leaderboard Link
<!-- info: Provide a link to the leaderboard. -->
<!-- scope: periscope -->
[Link](https://inklab.usc.edu/CommonGen/leaderboard.html)
#### Leaderboard Details
<!-- info: Briefly describe how the leaderboard evaluates models. -->
<!-- scope: microscope -->
The model outputs are evaluated against the crowdsourced references, and ranked by SPICE score. The leaderboard also reports BLEU-4 and CIDEr scores.
### Languages and Intended Use
#### Multilingual?
<!-- quick -->
<!-- info: Is the dataset multilingual? -->
<!-- scope: telescope -->
no
#### Covered Dialects
<!-- info: What dialects are covered? Are there multiple dialects per language? -->
<!-- scope: periscope -->
No information is provided on regional restrictions and we thus assume that the covered dialects are those spoken by raters on Mechanical Turk.
#### Covered Languages
<!-- quick -->
<!-- info: What languages/dialects are covered in the dataset? -->
<!-- scope: telescope -->
`English`
#### Whose Language?
<!-- info: Whose language is in the dataset? -->
<!-- scope: periscope -->
The concepts were extracted from multiple English image captioning datasets and the data was collected via Amazon Mechanical Turk. No information on regional restrictions is provided.
#### License
<!-- quick -->
<!-- info: What is the license of the dataset? -->
<!-- scope: telescope -->
mit: MIT License
#### Intended Use
<!-- info: What is the intended use of the dataset? -->
<!-- scope: microscope -->
CommonGen is a constrained text generation task, associated with a benchmark dataset, to explicitly test machines for the ability of generative commonsense reasoning.
#### Primary Task
<!-- info: What primary task does the dataset support? -->
<!-- scope: telescope -->
Reasoning
#### Communicative Goal
<!-- quick -->
<!-- info: Provide a short description of the communicative goal of a model trained for this task on this dataset. -->
<!-- scope: periscope -->
The speaker is required to produce a *coherent* sentence which mentions all of the source concepts, and which describes a *likely* situation that could be captured in a picture or video.
### Credit
#### Curation Organization Type(s)
<!-- info: In what kind of organization did the dataset curation happen? -->
<!-- scope: telescope -->
`academic`, `independent`
#### Curation Organization(s)
<!-- info: Name the organization(s). -->
<!-- scope: periscope -->
The dataset was curated by a joint team of researchers from the University of Southern California and Allen Institute for Artificial Intelligence.
#### Dataset Creators
<!-- info: Who created the original dataset? List the people involved in collecting the dataset and their affiliation(s). -->
<!-- scope: microscope -->
Bill Yuchen Lin (USC), Wangchunshu Zhou (USC), Ming Shen (USC), Pei Zhou (USC), Chandra Bhagavatula (AllenAI), Yejin Choi (AllenAI + UW), Xiang Ren (USC)
#### Funding
<!-- info: Who funded the data creation? -->
<!-- scope: microscope -->
The research is based upon work supported in part by the Office of the Director of National Intelligence (ODNI), Intelligence Advanced Research Projects Activity (IARPA), the DARPA MCS program, and NSF SMA 18-29268.
#### Who added the Dataset to GEM?
<!-- info: Who contributed to the data card and adding the dataset to GEM? List the people+affiliations involved in creating this data card and who helped integrate this dataset into GEM. -->
<!-- scope: microscope -->
Yacine Jernite created the initial data card. It was later extended by Simon Mille. Sebastian Gehrmann migrated it to the GEMv2 format.
### Dataset Structure
#### Data Fields
<!-- info: List and describe the fields present in the dataset. -->
<!-- scope: telescope -->
A data instance has the following fields:
- `concepts`: a `list` of `string` values denoting the concept the system should write about. Has 3 to 5 items, constitutes the `input` of the task.
- `target`: a sentence `string` mentioning all of the above mentioned `concepts`. Constitutes the desired `output` of the task.
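Since every source concept must appear in the output, a rough coverage check can be written directly against these two fields. The helper below is a naive illustration, not the matcher used by the official evaluation: it counts a concept as covered if any token of the sentence starts with it, which loosely handles inflections such as "ski" → "skis"/"skiing".

```python
def concept_coverage(concepts, sentence):
    # Fraction of input concepts that appear (by prefix match) in the sentence.
    tokens = sentence.lower().split()
    hits = sum(any(tok.startswith(c.lower()) for tok in tokens) for c in concepts)
    return hits / len(concepts)
```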
#### Example Instance
<!-- info: Provide a JSON formatted example of a typical instance in the dataset. -->
<!-- scope: periscope -->
```
[
  {
    "concepts": ["ski", "mountain", "skier"],
    "target": "Skier skis down the mountain"
  },
  {
    "concepts": ["ski", "mountain", "skier"],
    "target": "Three skiers are skiing on a snowy mountain."
  }
]
```
#### Data Splits
<!-- info: Describe and name the splits in the dataset if there are more than one. -->
<!-- scope: periscope -->
Each example in the dataset consists of a set of 3 to 5 concepts, each denoted by a single noun, verb, or adjective (the input), and a sentence using these concepts (the output). The dataset provides several such sentences for each concept set.
| | Train | Dev | Test |
|---------------------------|--------|-------|-------|
| **Total concept-sets** | 32,651 | 993 | 1,497 |
| **Total sentences** | 67,389 | 4,018 | 6,042 |
|**Average sentence length**| 10.54 | 11.55 | 13.34 |
#### Splitting Criteria
<!-- info: Describe any criteria for splitting the data, if used. If there are differences between the splits (e.g., if the training annotations are machine-generated and the dev and test ones are created by humans, or if different numbers of annotators contributed to each example), describe them here. -->
<!-- scope: microscope -->
The dev and test set were created by sampling sets of concepts of size 4 or 5 (and as many of size 3 for the dev set) present in the source captioning datasets and having crowd-workers write reference sentences using these concepts.
Conversely, the training set has more concept sets of size 3 than of size 4 and 5, and uses the original captions from the source datasets as references.
The authors also ensured that the training, dev and test set have different combinations of unique concepts to ensure compositionality (details in [Table 1](https://arxiv.org/pdf/1911.03705v3.pdf)).
## Dataset in GEM
### Rationale for Inclusion in GEM
#### Why is the Dataset in GEM?
<!-- info: What does this dataset contribute toward better generation evaluation and why is it part of GEM? -->
<!-- scope: microscope -->
CommonGen is a medium sized corpus with a unique reasoning challenge and interesting evaluation possibilities.
#### Similar Datasets
<!-- info: Do other datasets for the high level task exist? -->
<!-- scope: telescope -->
no
#### Ability that the Dataset measures
<!-- info: What aspect of model ability can be measured with this dataset? -->
<!-- scope: periscope -->
Commonsense reasoning
### GEM-Specific Curation
#### Modified for GEM?
<!-- info: Has the GEM version of the dataset been modified in any way (data, processing, splits) from the original curated data? -->
<!-- scope: telescope -->
yes
#### GEM Modifications
<!-- info: What changes have been made to the original dataset? -->
<!-- scope: periscope -->
`other`
#### Modification Details
<!-- info: For each of these changes, describe them in more detail and provide the intended purpose of the modification -->
<!-- scope: microscope -->
Four challenge sets for CommonGen were added to the GEM evaluation suite.
#### Additional Splits?
<!-- info: Does GEM provide additional splits to the dataset? -->
<!-- scope: telescope -->
yes
#### Split Information
<!-- info: Describe how the new splits were created -->
<!-- scope: periscope -->
1. Data Shift
We created subsets of the training and development sets of ~500 randomly selected inputs each.
2. Transformations
We applied input scrambling on a subset of 500 randomly selected test instances; the order of the concepts was randomly reassigned.
3. Subpopulations
We created a subpopulation based on input length, taking into account the number of concepts in the input structures. By comparing inputs of different lengths, we can see to what extent systems are able to handle different input sizes.
| Concept number | Frequency English |
|----------------|-------------------|
| 4 | 747 |
| 5 | 750 |
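The input-scrambling transformation described above can be sketched as follows; the field names follow the `concepts`/`target` schema shown earlier, and the fixed seed is an assumption added for reproducibility:

```python
import random

def scramble_concepts(example, seed=0):
    # Randomly reassign the order of the input concepts,
    # leaving the reference sentence untouched.
    rng = random.Random(seed)
    concepts = list(example["concepts"])
    rng.shuffle(concepts)
    return {**example, "concepts": concepts}
```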
#### Split Motivation
<!-- info: What aspects of the model's generation capacities were the splits created to test? -->
<!-- scope: periscope -->
Generalization and Robustness
### Getting Started with the Task
#### Pointers to Resources
<!-- info: Getting started with in-depth research on the task. Add relevant pointers to resources that researchers can consult when they want to get started digging deeper into the task. -->
<!-- scope: microscope -->
- Two variants of [BART](https://arxiv.org/abs/1910.13461), [Knowledge-Graph-augmented BART](https://arxiv.org/abs/2009.12677) and [Enhanced Knowledge Injection Model for Commonsense Generation](https://arxiv.org/abs/2012.00366), hold the top two spots on the leaderboard, followed by a fine-tuned [T5 model](https://arxiv.org/abs/1910.10683).
- The following script shows how to download and load the data, fine-tune, and evaluate a model using the ROUGE, BLEU, and METEOR metrics: [GEM sample script](https://github.com/GEM-benchmark/GEM-baseline-models/blob/main/examples/GEM-common_gen.ipynb).
## Previous Results
### Previous Results
#### Measured Model Abilities
<!-- info: What aspect of model ability can be measured with this dataset? -->
<!-- scope: telescope -->
Commonsense Reasoning
#### Metrics
<!-- info: What metrics are typically used for this task? -->
<!-- scope: periscope -->
`Other: Other Metrics`, `BLEU`, `ROUGE`, `METEOR`
#### Other Metrics
<!-- info: Definitions of other metrics -->
<!-- scope: periscope -->
- SPICE: An evaluation metric for image captioning that is defined over scene graphs
- CIDEr: An n-gram overlap metric based on cosine similarity between the TF-IDF weighted ngram counts
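To illustrate the n-gram-overlap idea that BLEU and CIDEr build on, a toy modified n-gram precision can be computed as below. This is only a sketch of the underlying idea, not either official metric (both add further machinery such as brevity penalties and TF-IDF weighting):

```python
from collections import Counter

def ngram_precision(candidate, reference, n=2):
    # Clipped n-gram precision: overlapping n-grams / candidate n-grams.
    def ngrams(tokens):
        return Counter(tuple(tokens[i:i + n]) for i in range(len(tokens) - n + 1))
    cand, ref = ngrams(candidate.split()), ngrams(reference.split())
    overlap = sum(min(count, ref[gram]) for gram, count in cand.items())
    total = sum(cand.values())
    return overlap / total if total else 0.0
```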
#### Proposed Evaluation
<!-- info: List and describe the purpose of the metrics and evaluation methodology (including human evaluation) that the dataset creators used when introducing this task. -->
<!-- scope: microscope -->
The main metrics are captioning metrics since the original concept lists were extracted from captioning datasets. A human subject study with five graduate students was conducted and they were asked to rank the "commonsense plausibility" of two models at a time.
#### Previous results available?
<!-- info: Are previous results available? -->
<!-- scope: telescope -->
yes
#### Other Evaluation Approaches
<!-- info: What evaluation approaches have others used? -->
<!-- scope: periscope -->
The currently best performing model KFCNet (https://aclanthology.org/2021.findings-emnlp.249/) uses the same automatic evaluation but does not conduct any human evaluation.
#### Relevant Previous Results
<!-- info: What are the most relevant previous results for this task/dataset? -->
<!-- scope: microscope -->
The most relevant results can be seen on the [leaderboard](https://inklab.usc.edu/CommonGen/leaderboard.html).
## Dataset Curation
### Original Curation
#### Original Curation Rationale
<!-- info: Original curation rationale -->
<!-- scope: telescope -->
The dataset creators selected sets of concepts that appeared in image and video captions (as identified by a POS tagger) to ensure that a likely real-world scenario including the set could be imagined and constructed. Section 3.1 of the [paper](https://arxiv.org/pdf/1911.03705v3.pdf) describes a sampling scheme which encourages diversity of sets while selecting common concepts.
#### Communicative Goal
<!-- info: What was the communicative goal? -->
<!-- scope: periscope -->
The speaker is required to produce a *coherent* sentence which mentions all of the source concepts, and which describes a *likely* situation that could be captured in a picture or video.
#### Sourced from Different Sources
<!-- info: Is the dataset aggregated from different data sources? -->
<!-- scope: telescope -->
yes
#### Source Details
<!-- info: List the sources (one per line) -->
<!-- scope: periscope -->
- [Flickr30k](https://www.mitpressjournals.org/doi/abs/10.1162/tacl_a_00166)
- [MSCOCO](https://link.springer.com/chapter/10.1007/978-3-319-10602-1_48)
- [Conceptual Captions](https://www.aclweb.org/anthology/P18-1238/)
- Video captioning datasets:
- [LSMDC](https://link.springer.com/article/10.1007/s11263-016-0987-1)
- [ActivityNet](https://openaccess.thecvf.com/content_iccv_2017/html/Krishna_Dense-Captioning_Events_in_ICCV_2017_paper.html)
- [VaTeX](https://openaccess.thecvf.com/content_ICCV_2019/html/Wang_VaTeX_A_Large-Scale_High-Quality_Multilingual_Dataset_for_Video-and-Language_Research_ICCV_2019_paper.html)
### Language Data
#### How was Language Data Obtained?
<!-- info: How was the language data obtained? -->
<!-- scope: telescope -->
`Crowdsourced`
#### Where was it crowdsourced?
<!-- info: If crowdsourced, where from? -->
<!-- scope: periscope -->
`Amazon Mechanical Turk`
#### Language Producers
<!-- info: What further information do we have on the language producers? -->
<!-- scope: microscope -->
The training data consists of concept sets and captions for the source datasets. The concept sets are the sets of labels of the images or videos, selected with a heuristic to maximize diversity while ensuring that they represent likely scenarios.
The dev and test set sentences were created by Amazon Mechanical Turk crowd workers. The workers were shown an example generation and a set of 4 or 5 concept names along with their part-of-speech and asked to write:
1. One sentence mentioning all of the concepts
2. A rationale explaining how the sentence connects the concept
A screenshot of the interface is provided in Figure 7 of the [Appendix](https://arxiv.org/pdf/1911.03705v3.pdf).
#### Topics Covered
<!-- info: Does the language in the dataset focus on specific topics? How would you describe them? -->
<!-- scope: periscope -->
Information was not provided.
#### Data Validation
<!-- info: Was the text validated by a different worker or a data curator? -->
<!-- scope: telescope -->
validated by data curator
#### Was Data Filtered?
<!-- info: Were text instances selected or filtered? -->
<!-- scope: telescope -->
algorithmically
#### Filter Criteria
<!-- info: What were the selection criteria? -->
<!-- scope: microscope -->
During the data collection, workers who provided rationales that were too short, failed to have good coverage of the input in their sentences, or workers whose output had a high perplexity under a GPT-2 model were disqualified from the pool and replaced with newcomers.
### Structured Annotations
#### Additional Annotations?
<!-- quick -->
<!-- info: Does the dataset have additional annotations for each instance? -->
<!-- scope: telescope -->
none
#### Annotation Service?
<!-- info: Was an annotation service used? -->
<!-- scope: telescope -->
no
### Consent
#### Any Consent Policy?
<!-- info: Was there a consent policy involved when gathering the data? -->
<!-- scope: telescope -->
no
#### Justification for Using the Data
<!-- info: If not, what is the justification for reusing the data? -->
<!-- scope: microscope -->
The data was sourced from Mechanical Turk which means that raters were aware that their annotations may be publicly released for research purposes.
### Private Identifying Information (PII)
#### Contains PII?
<!-- quick -->
<!-- info: Does the source language data likely contain Personal Identifying Information about the data creators or subjects? -->
<!-- scope: telescope -->
no PII
#### Justification for no PII
<!-- info: Provide a justification for selecting `no PII` above. -->
<!-- scope: periscope -->
The concepts are restricted to verbs, adjectives, and common nouns, and no personal information is given in the captions.
### Maintenance
#### Any Maintenance Plan?
<!-- info: Does the original dataset have a maintenance plan? -->
<!-- scope: telescope -->
no
## Broader Social Context
### Previous Work on the Social Impact of the Dataset
#### Usage of Models based on the Data
<!-- info: Are you aware of cases where models trained on the task featured in this dataset or related tasks have been used in automated systems? -->
<!-- scope: telescope -->
no
### Impact on Under-Served Communities
#### Addresses needs of underserved Communities?
<!-- info: Does this dataset address the needs of communities that are traditionally underserved in language technology, and particularly language generation technology? Communities may be underserved for example because their language, language variety, or social or geographical context is underrepresented in NLP and NLG resources (datasets and models). -->
<!-- scope: telescope -->
no
### Discussion of Biases
#### Any Documented Social Biases?
<!-- info: Are there documented social biases in the dataset? Biases in this context are variations in the ways members of different social categories are represented that can have harmful downstream consequences for members of the more disadvantaged group. -->
<!-- scope: telescope -->
no
#### Are the Language Producers Representative of the Language?
<!-- info: Does the distribution of language producers in the dataset accurately represent the full distribution of speakers of the language world-wide? If not, how does it differ? -->
<!-- scope: periscope -->
The dataset is created using data from image captioning systems and might inherit some of the social biases represented therein (see e.g. [Tang et al. 2020](https://arxiv.org/abs/2006.08315)).
Another related concern is the exposure bias introduced by the initial selection of pictures and video, which are likely to over-represent situations that are common in the US at the expense of other parts of the world (Flickr, for example, is a US-based company founded in Canada). For more discussion of the potential impacts of exposure bias, see e.g. [The Social Impact of Natural Language Processing](https://www.aclweb.org/anthology/P16-2096.pdf).
## Considerations for Using the Data
### PII Risks and Liability
#### Potential PII Risk
<!-- info: Considering your answers to the PII part of the Data Curation Section, describe any potential privacy risks to the data subjects and creators when using the dataset. -->
<!-- scope: microscope -->
The concepts are restricted to verbs, adjectives, and common nouns, and no personal information is given in the captions.
### Licenses
#### Copyright Restrictions on the Dataset
<!-- info: Based on your answers in the Intended Use part of the Data Overview Section, which of the following best describe the copyright and licensing status of the dataset? -->
<!-- scope: periscope -->
`open license - commercial use allowed`
#### Copyright Restrictions on the Language Data
<!-- info: Based on your answers in the Language part of the Data Curation Section, which of the following best describe the copyright and licensing status of the underlying language data? -->
<!-- scope: periscope -->
`open license - commercial use allowed`
### Known Technical Limitations
#### Technical Limitations
<!-- info: Describe any known technical limitations, such as spurious correlations, train/test overlap, annotation biases, or mis-annotations, and cite the works that first identified these limitations when possible. -->
<!-- scope: microscope -->
The dataset is in English, a language with an abundance of existing resources.
The use of GPT-2 to validate development and test sentences [might be cause for similar concern](https://www.aclweb.org/anthology/D19-1339.pdf), but we do note that the authors only use the model to discount very high perplexity sequences, which makes it less likely to surface those biases.
The language in the development and test set is crowdsourced, which means that it was written by workers whose main goal was speed. This is likely to impact the quality and variety of the targets. The population of crowdsource workers also does not match the base population of the locations the workers come from, which may lead to different representation of situations or underlying expectations of what these situations are.
#### Unsuited Applications
<!-- info: When using a model trained on this dataset in a setting where users or the public may interact with its predictions, what are some pitfalls to look out for? In particular, describe some applications of the general task featured in this dataset that its curation or properties make it less suitable for. -->
<!-- scope: microscope -->
Due to the overrepresentation of US situations, the system may not work for users across the world. Moreover, only limited information on the dataset quality is provided, and the system may fail as a result of unknown issues.
#### Discouraged Use Cases
<!-- info: What are some discouraged use cases of a model trained to maximize the proposed metrics on this dataset? In particular, think about settings where decisions made by a model that performs reasonably well on the metric my still have strong negative consequences for user or members of the public. -->
<!-- scope: microscope -->
Any system needs to be evaluated on a broader set of unseen concepts than provided in the dataset. Since the references for the test set are private, it is not known how well findings generalize beyond the collection methodology.
|
somosnlp/instruct-legal-refugiados-es-v3 | ---
dataset_info:
features:
- name: prompt
dtype: string
- name: question
dtype: string
- name: context
dtype: string
- name: answer
dtype: string
- name: prompt_es
dtype: string
- name: lang
dtype: string
- name: registro
dtype: string
- name: periodo
dtype: string
- name: dominio
dtype: string
- name: tarea
dtype: string
- name: pais_origen
dtype: string
splits:
- name: train
num_bytes: 61982265
num_examples: 9430
- name: test
num_bytes: 6179589
num_examples: 896
- name: full
num_bytes: 71946490
num_examples: 11105
download_size: 41438662
dataset_size: 140108344
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
- split: full
path: data/full-*
---
<p align="center">
<img src="markus-winkler-Je1MDuITTF4-unsplash.jpg" style="width: 50%;">
</p>
<h1 align="center">Legal Refugiados: Un dataset para QA en temas legales de refugio, asilo y protección internacional.</h1>
# Spanish Instruction Question-Answering Legal Refugiados
## Dataset Description
Instruction Question-Answering Legal Refugiados es una colección de instrucciones extraídas de una gran cantidad de documentos legales del gobierno de España, principalmente, y de otras instituciones de la UE y también de otros países de habla hispana como México o Venezuela. Todos ellos están relacionados con leyes y disposiciones legales sobre ciudadanos refugiados o inmigrantes, sus derechos y su situación. Después de recopilar todos esos documentos, utilizamos la librería [distilabel](https://distilabel.argilla.io/latest/) de [Argilla](https://argilla.io/) para crear un proceso de generación de un dataset sintético de tipo instrucciones (Question-Answer) para poder entrenar un modelo en español orientado a resolver cuestiones en el ámbito de la ayuda a los refugiados y al asilo político.
---
*Spanish Instruct-Question Answering Legal Refugiados is a collection of instruction queries extracted from a large number of legal documents, mainly from the government of Spain and other EU institutions, and also from other Spanish-speaking countries like Mexico or Venezuela. They are all related to laws and dispositions about refugee or migrant citizens, their rights, and their situation. After collecting all those documents, we used the library [distilabel](https://distilabel.argilla.io/latest/) by [Argilla](https://argilla.io/) to create a process to extract instruction-format pairs of question-answer samples in order to train a Spanish language model.*
### Dataset Summary
Compuesto por unos 11.100 registros que contienen los campos:
* instruction/question: una instrucción o consulta.
* input/context: un contexto para resolver la consulta.
* output/answer: la salida generada a partir del contexto.
* prompt: Un prompt en inglés al estilo alpaca para pedir la salida dada la instrucción y la entrada.
* prompt_es: Un prompt en español al estilo alpaca para pedir la salida dada la instrucción y la entrada.
* source: nombre de la fuente original de donde se extrajo la entrada.
* page: número de página de la fuente
* idioma: es
---
*Contains about 11,100 rows containing the fields:*
* *instruction/question: an instruction or query.*
* *input/context: a context to solve the query*
* *output/answer: the generated output from the context.*
* *prompt: A prompt in alpaca-style to ask for the output given the instruction and input.*
* *source: name of the original source the input was extracted from.*
* *page: page number of the source*
* *lang: es*
### Supported Tasks
Text-Generation
Question-Answering
### Languages
- Spanish (es)
## Dataset Structure
### Data Instances
Here is an example of an instance:
<pre>
{'prompt': 'Below is a question in Spanish paired with a context also in Spanish that provides further information to solve the question. Write a response that appropriately completes the request.\n\n### Question:\n¿Podrías explicar en qué principios básicos se fundamenta la Ley 5/1984, relativa al derecho de asilo y a la condición de refugiado, según el Real Decreto 203/1995?\n\n### Context:\nReal Decreto 203/1995, de 10 de febrero, por el que se aprueba el Reglamento de aplicación de la Ley 5/1984, ... que deben regir los procedimientos de inadmisión a trámite, tanto en frontera como en el interior del territorio.\n\n### Response:\n',
'instruction': '¿Podrías explicar en qué principios básicos se fundamenta la Ley 5/1984, relativa al derecho de asilo y a la condición de refugiado, según el Real Decreto 203/1995?',
'input': 'Real Decreto 203/1995, de 10 de febrero, por el que se aprueba el Reglamento de aplicación de la Ley 5/1984 ... deben regir los procedimientos de inadmisión a trámite, tanto en frontera como en el interior del territorio.',
'output': 'La Ley 5/1984, relativa al derecho de asilo y a la condición de refugiado, se basa en los siguientes principios fundamentales... garantías adecuadas durante los procedimientos de inadmisión a trámite, tanto en frontera como en el interior del territorio.',
'source': 'BOE_1995_5542_consolidado_asilo_y_refugiado',
'page': '1',
'source_ini': 0,
'source_fin': 1419}
</pre>
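For illustration only (the constant and function names here are ours, not part of the dataset tooling): the `prompt` field follows the Alpaca-style template visible in the instance above and can be rebuilt from `instruction` and `input`:

```python
# Sketch: rebuild the English Alpaca-style `prompt` field from a row's
# `instruction` and `input`, following the template shown in the instance
# above. Names are illustrative, not from the project's code.
PROMPT_TEMPLATE = (
    "Below is a question in Spanish paired with a context also in Spanish "
    "that provides further information to solve the question. "
    "Write a response that appropriately completes the request.\n\n"
    "### Question:\n{instruction}\n\n"
    "### Context:\n{context}\n\n"
    "### Response:\n"
)


def build_prompt(instruction: str, context: str) -> str:
    """Return the Alpaca-style prompt for one dataset row."""
    return PROMPT_TEMPLATE.format(instruction=instruction, context=context)
```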
### Data Fields
<pre>
{
prompt: str
instruction: str
input: str
output: str
source: str,
page: int,
source_ini: int,
source_fin:int
}
</pre>
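A quick way to sanity-check rows against this schema, as a standard-library sketch (the helper name is ours, not part of the dataset tooling):

```python
# Sketch: check that a row matches the field schema above.
# Illustrative helper, not from the project's code.
EXPECTED_FIELDS = {
    "prompt": str,
    "instruction": str,
    "input": str,
    "output": str,
    "source": str,
    "page": int,
    "source_ini": int,
    "source_fin": int,
}


def validate_row(row: dict) -> list:
    """Return a list of schema problems; an empty list means the row fits."""
    problems = []
    for name, expected_type in EXPECTED_FIELDS.items():
        if name not in row:
            problems.append("missing field: %s" % name)
        elif not isinstance(row[name], expected_type):
            problems.append(
                "%s: expected %s, got %s"
                % (name, expected_type.__name__, type(row[name]).__name__)
            )
    return problems
```

Note that in the instance shown earlier `page` appears as the string `'1'`, so a strict check against this schema would flag it.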
### Data Splits
| Split | Size |
| ------------- | ------------- |
| `train` | 11,100 |
## Dataset Creation
Para la creación del dataset nos hemos apoyado en la librería [distilabel](https://distilabel.argilla.io/latest/) de [Argilla](https://argilla.io/), que proporciona un completo juego de herramientas que facilitan y agilizan enormemente la creación de un dataset sintético.
El proceso se ha dividido en 3 etapas:
1. Recolección de fuentes de datos, principalmente leyes orgánicas y reales decretos, así como documentación y procedimientos administrativos de la oficina de asilo y protección al refugiado en España. También se han extraído documentos de la UE y de otros países de habla hispana.
  - Estos documentos, en su mayoría en formato PDF y otros en texto plano, se han procesado y limpiado ligeramente y posteriormente se han dividido en bloques de 512 tokens, que servirán de base para el siguiente paso.
2. Construcción del dataset sintético:
  - Primer paso: En base a un contexto dado de 512 tokens proveniente de un documento legal, solicitábamos al LLM un par de instrucciones o preguntas que pudieran resolverse con dicho contexto. Usamos una tarea de tipo `SelfInstructTask` y proporcionamos a la tarea una descripción o rol de asistente IA. Invocamos al modelo "mistralai/Mixtral-8x7B-Instruct-v0.1" en un Inference Endpoint de Hugging Face. Como resultado de este proceso obteníamos 1-2 instrucciones o preguntas por cada contexto proporcionado.
  - Segundo paso: Generamos un prompt en formato similar a Alpaca, donde dada una Instrucción (cada una de las respuestas del paso anterior) y un input o contexto (el correspondiente del paso anterior), se solicita un output.
  - Tercer paso: En base a los prompts generados en el paso anterior, usando una tarea de tipo `TextGenerationTask` y con la descripción de asistente IA del paso 1, solicitamos al LLM que nos proporcione la respuesta. Nuevamente, usamos el modelo Mixtral en un Inference Endpoint de Hugging Face.
3. Limpieza, revisión y división del dataset:
El proceso de filtrado se dividió en 2 pasos:
- Primer paso: Filtrado de ejemplos cuyos outputs coincidieran con la expresión regular: "^if$|#|\^|~".
- Segundo paso: Filtrado de ejemplos con outputs por debajo de 25 tokens. Los tokens son el resultado de dividir el output por espacios en blanco.
La selección del test set se dividió en 2 pasos:
- Primer paso: Se calculó la media y la desviación estándar del número de tokens tanto en las instructions como en los outputs por separado.
- Segundo paso: Se seleccionaron los ejemplos cuyas instruction y outputs estuvieran dentro del límite "media +- 0,35*desviación_estándar".
Agradecemos el soporte y la guía proporcionados por Argilla para poder llevar a cabo esta tarea.
---
For the creation of the dataset we relied on the [distilabel](https://distilabel.argilla.io/latest/) library by [Argilla](https://argilla.io/), which provides a complete set of tools that greatly facilitate and speed up the creation of a synthetic dataset.
The process has been divided into 3 stages:
1. Collection of data sources, mainly organic laws and royal decrees, as well as documentation and administrative procedures of the asylum and refugee protection office in Spain. Documents from the EU and other Spanish-speaking countries have also been extracted.
- These documents, mostly in PDF format, and others in plain text, have been lightly processed and cleaned and then divided into blocks of 512 tokens, which will serve as the basis for the next step.
2. Construction of the synthetic dataset:
- First step: Based on a given context of 512 tokens coming from a legal document, we asked the LLM for a couple of instructions or questions that could be solved with that context. We used a task of type `SelfInstructTask` and provided the task with a description or role of an AI assistant. We invoked the model "mistralai/Mixtral-8x7B-Instruct-v0.1" on a Hugging Face Inference Endpoint. As a result of this process we obtained 1-2 instructions or questions for each context provided.
- Second step: We generate a prompt in a format similar to Alpaca, where given an Instruction (each of the answers from the previous step) and an input or context (the corresponding one from the previous step), an output is requested.
- Third step: Based on the prompts generated in the previous step, using a task of type `TextGenerationTask` and with the AI assistant description from step 1, we request the LLM to provide us with the answer. Again, we use the Mixtral model in a Hugging Face Inference Endpoint.
3. Dataset cleanup, review and splitting:
The filtering process consisted of the following steps:
- First step: Items whose outputs matched the regular expression "^if$|#|\^|~" were filtered out.
- Second step: Items whose outputs were under 25 tokens were filtered out. Each token was the result of splitting the output by whitespace.
The selection of the test set consisted of the following steps:
- First step: The mean and standard deviation of the number of tokens in the instructions and outputs were calculated separately.
- Second step: Items whose instructions and outputs fell within the limit "mean +- 0.35*standard_deviation" were selected.
We thank Argilla for the support and guidance provided to accomplish this task.
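The cleanup and test-selection steps described above can be sketched with the standard library as follows. This is a hedged sketch: function names and the in-memory row format are our assumptions, not the project's actual code, and the LLM generation steps are omitted.

```python
import re
import statistics

# Outputs matching this expression were filtered out, as described above.
BAD_OUTPUT = re.compile(r"^if$|#|\^|~")


def n_tokens(text: str) -> int:
    # "Tokens" here are simply whitespace-split words, as in the card.
    return len(text.split())


def clean(rows):
    """Drop rows whose output matches the regex or has fewer than 25 tokens."""
    return [
        r for r in rows
        if not BAD_OUTPUT.search(r["output"]) and n_tokens(r["output"]) >= 25
    ]


def select_test_candidates(rows, k=0.35):
    """Keep rows whose instruction and output lengths fall within mean +- k*stdev."""
    def bounds(lengths):
        mu = statistics.mean(lengths)
        sd = statistics.pstdev(lengths)
        return mu - k * sd, mu + k * sd

    lo_i, hi_i = bounds([n_tokens(r["instruction"]) for r in rows])
    lo_o, hi_o = bounds([n_tokens(r["output"]) for r in rows])
    return [
        r for r in rows
        if lo_i <= n_tokens(r["instruction"]) <= hi_i
        and lo_o <= n_tokens(r["output"]) <= hi_o
    ]
```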
### Source Data
A continuación, mostramos los nombres "aproximados" de los documentos empleados para extraer todos los contextos que incluye este dataset. A partir de dichos nombres se pueden inferir las leyes o disposiciones legales que contienen:
---
Below, we show the "approximate" names of the documents used to extract all the contexts included in this dataset. From these names, the laws or legal provisions they contain can be inferred:
**Documents**:
adhesion_espana_estatuto_refugiados_onu.pdf
BOE_1995_5542_consolidado_asilo_y_refugiado.pdf
BOE_2003_19714_consolidado_proteccion_temporal_afluencia_masiva_desplazados.pdf
BOE_decreto_1800_2008_consolidado_abono_acumulado_prestacion_desempleo_extranjeros.pdf
BOE_decreto_203_1995_consolidado_reglamento_derecho_asilo_refugiado.pdf
BOE_decreto_220_2022_consolidado_reglamento_acogida_proteccion_internacional.pdf
BOE_decreto_557_2011_consolidado_reglamento_derechos_libertades_extranjeros_espana.pdf
BOE_decreto_865_2001_consolidado_reconocimiento_estatuto_apatrida.pdf
BOE_ley-19_2021_ingreso_minimo_vital.pdf
BOE_leyordinaria_26_2015_mod_sistema_proteccion_infancia_adolescencia.pdf
BOE_leyorganica_8_2015_sistema_proteccion_infancia_adolescencia.pdf
BOE_leyorganica_8_2021_proteccion_integral_infancia_adolescencia_violencia.pdf
BOE_ley_organica_4_2000_consolidado_derechos_libertades_extranjeros_espana.pdf
BOE_orden_1282_2007_consolidado_medios_economicos_entrada_extranjeros_espana.pdf
BOE_orden_1283_2007_consolidado_requisitos_carta_invitacion_extranjeros.pdf
BOE_orden_1485_2021_regulacion_gestion_colectiva_contrataciones_origen.pdf
BOE_orden_1803_2011_importe_tasas_visados_doc_inmigracion_extranjeria.pdf
BOE_orden_3321_2011_expedicion_titulo_viaje_extranjeros.pdf
BOE_orden_867_2023-consolidado_oficina_estatal_lucha_discriminacion_trabajo_seg_social.pdf
carta_derechos_fundamentales_UE.pdf
constitucion_espanola_es_cat.txt
Convencion_1951_estatuto_refugiados.pdf
declaracion_foro_integracion_social_immigrantes_refugiadas_2023.pdf
decreto_1325_2003_24_octubre_régimen_protección_temporal_personas_desplazadas.txt
derecho_internacional_sobre_migracion_glosario_migracion_OIM.pdf
determinación_responsable_examen_solicitudes_asilo_UE_15_06_1990.txt
Dialnet-NormativaDeLaUnionEuropeaYLegislacionEspanolaSobre-5315869.pdf
directiva_2001_55_normas _mínimas_concesión_protección_ temporal_afluencia_masiva_desplazados_UE.txt
directiva_2011_95_UE_normas_requisitos_reconocimiento_proteccion_internacional.pdf
directiva_2013_32_procedimiento_concesion_retirada_proteccion_internacional.pdf
directiva_2013_33_normas_acogida_solicitantes_proteccion_internacional.pdf
ficheros_incluidos.txt
guiaderechos_victimas_violencia_genero_2022_2.pdf
guia_solicitantes_proteccion_internacional_en_italia.pdf
Ley_12_2009_30_10_reguladora_derecho_asilo_protección_subsidiaria.txt
Ley_de_Extranjería_de_España.pdf
ley_refugiados_asilados_venezuela.pdf
ley_refugiados_proteccion_complementaria_asilo_mexico.pdf
manual_derecho_europeo_asilo_fronteras_inmigracion_edicion_2020.pdf
policia_nacional_doc_solicitud_asilo_refugio.txt
politica_asilo_UE.pdf
proteccion_social_trabajadores_extranjeros_informe2023_94_F06.pdf
protección_internacional.txt
RDL_6_2022_medidas_urgentes_guerra_ucrania.pdf
reglamento_UE_L00031-00059_responsabilidad_examen_solicitud_proteccion.pdf
### Personal and Sensitive Information
No se incluye información personal o sensible.
---
No personal or sensitive information included.
## Considerations for Using the Data
### Social Impact of Dataset
Este dataset pretende ayudar al entrenamiento de modelos en español que puedan suponer una ayuda a aquellas organizaciones o personas que se dediquen a dar soporte legal y ayuda a personas vulnerables y/o refugiadas de otros países. Puede ser la base para crear modelos que resuelvan tareas de question-answering directamente, o bien usarse como base para aplicaciones que extraigan respuestas o información a partir de documentos legales en este campo.
---
This dataset is intended to help the training of Spanish-language models that can be of help to organizations or individuals dedicated to providing legal support and assistance to vulnerable people and/or refugees from other countries. It can be the basis for creating models that solve question-answering tasks directly, or be used as a basis for applications that extract answers or information from legal documents in this field.
### Discussion of Biases
No postprocessing steps were applied to mitigate potential social biases.
## Licensing information
This work is licensed under [Apache License Version 2.0, January 2004](https://www.apache.org/licenses/LICENSE-2.0) License.
## Citation Information
Dataset creado para el Hackathon #Somos600M de la comunidad SomosNLP en marzo de 2024, por el equipo del proyecto QA Legal Refugiados.
Recolección y construcción: [Eduardo Muñoz](https://huggingface.co/edumunozsala)
Limpieza, revisión y división: [Teresa Martin](https://huggingface.co/narhim)
Colaboración: [Alvaro Hidalgo](https://huggingface.co/hacendado)
## Contributions
[N/A] |
jxie/flickr8k | ---
dataset_info:
features:
- name: image
dtype: image
- name: caption_0
dtype: string
- name: caption_1
dtype: string
- name: caption_2
dtype: string
- name: caption_3
dtype: string
- name: caption_4
dtype: string
splits:
- name: train
num_bytes: 826721431.0
num_examples: 6000
- name: validation
num_bytes: 138017615.0
num_examples: 1000
- name: test
num_bytes: 136871307.0
num_examples: 1000
download_size: 274629589
dataset_size: 1101610353.0
---
# Dataset Card for "flickr8k"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
DONG19/CoT_code_instruction_dataset | ---
dataset_info:
features:
- name: instruction
dtype: string
- name: output
dtype: string
- name: input
dtype: string
splits:
- name: train
num_bytes: 10819483
num_examples: 20022
download_size: 5581774
dataset_size: 10819483
---
# Dataset Card for "CoT_code_instruction_dataset"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
mbgenai/concise_speech | ---
dataset_info:
features:
- name: messages
list:
- name: content
dtype: string
- name: role
dtype: string
splits:
- name: train
num_bytes: 9402539
num_examples: 14983
download_size: 5473476
dataset_size: 9402539
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
CyberHarem/le_mars_azurlane | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of le_mars/ル・マルス/勒马尔 (Azur Lane)
This is the dataset of le_mars/ル・マルス/勒马尔 (Azur Lane), containing 24 images and their tags.
The core tags of this character are `brown_hair, short_hair, blue_eyes, bow, breasts, ahoge, green_eyes, hair_bow`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs)([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:----------|:------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 24 | 30.10 MiB | [Download](https://huggingface.co/datasets/CyberHarem/le_mars_azurlane/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 24 | 18.20 MiB | [Download](https://huggingface.co/datasets/CyberHarem/le_mars_azurlane/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 48 | 33.60 MiB | [Download](https://huggingface.co/datasets/CyberHarem/le_mars_azurlane/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 24 | 26.62 MiB | [Download](https://huggingface.co/datasets/CyberHarem/le_mars_azurlane/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 48 | 48.35 MiB | [Download](https://huggingface.co/datasets/CyberHarem/le_mars_azurlane/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need it, just run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/le_mars_azurlane',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering results; some outfits may be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 14 |  |  |  |  |  | looking_at_viewer, 1girl, solo, black_gloves, fingerless_gloves, shorts, smile, white_background, bare_shoulders, blush, cannon, full_body, hair_ornament, holding_weapon, machinery, navel, rigging, thighhighs, turret, bangs, closed_mouth, dark-skinned_female, simple_background, standing, sword |
| 1 | 6 |  |  |  |  |  | double_bun, looking_at_viewer, open_mouth, smile, 1girl, blue_bikini, solo, ;d, antenna_hair, ass, blush, hood, innertube, one-piece_tan, one_eye_closed, small_breasts, torpedo, water, barefoot, beachball, blue_sky, cloud, day, official_alternate_costume, outdoors, polka_dot_bikini, wrist_scrunchie |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | looking_at_viewer | 1girl | solo | black_gloves | fingerless_gloves | shorts | smile | white_background | bare_shoulders | blush | cannon | full_body | hair_ornament | holding_weapon | machinery | navel | rigging | thighhighs | turret | bangs | closed_mouth | dark-skinned_female | simple_background | standing | sword | double_bun | open_mouth | blue_bikini | ;d | antenna_hair | ass | hood | innertube | one-piece_tan | one_eye_closed | small_breasts | torpedo | water | barefoot | beachball | blue_sky | cloud | day | official_alternate_costume | outdoors | polka_dot_bikini | wrist_scrunchie |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------|:--------|:-------|:---------------|:--------------------|:---------|:--------|:-------------------|:-----------------|:--------|:---------|:------------|:----------------|:-----------------|:------------|:--------|:----------|:-------------|:---------|:--------|:---------------|:----------------------|:--------------------|:-----------|:--------|:-------------|:-------------|:--------------|:-----|:---------------|:------|:-------|:------------|:----------------|:-----------------|:----------------|:----------|:--------|:-----------|:------------|:-----------|:--------|:------|:-----------------------------|:-----------|:-------------------|:------------------|
| 0 | 14 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | |
| 1 | 6 |  |  |  |  |  | X | X | X | | | | X | | | X | | | | | | | | | | | | | | | | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X |
|
HachiML/truthful_qa-ja-v0.3_forcheck | ---
dataset_info:
config_name: generation
features:
- name: id
dtype: int64
- name: type
dtype: string
- name: category
dtype: string
- name: question
dtype: string
- name: best_answer
dtype: string
- name: correct_answers
sequence: string
- name: incorrect_answers
sequence: string
- name: source
dtype: string
- name: question_en
dtype: string
- name: best_answer_en
dtype: string
- name: correct_answers_en
sequence: string
- name: incorrect_answers_en
sequence: string
- name: meta
struct:
- name: kenlm_score
struct:
- name: best_answer
dtype: float64
- name: correct_answers
sequence: float64
- name: incorrect_answers
sequence: float64
- name: question
dtype: float64
splits:
- name: validation
num_bytes: 851812.0636474908
num_examples: 673
download_size: 442750
dataset_size: 851812.0636474908
configs:
- config_name: generation
data_files:
- split: validation
path: generation/validation-*
---
# Dataset Card for "truthful_qa-ja-v0.3_forcheck"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
sengunsipahi/civitai_top_10000_images | ---
license: unknown
---
|
open-llm-leaderboard/details_KnutJaegersberg__Deita-1_8B | ---
pretty_name: Evaluation run of KnutJaegersberg/Deita-1_8B
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [KnutJaegersberg/Deita-1_8B](https://huggingface.co/KnutJaegersberg/Deita-1_8B)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configuration, each one coresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_KnutJaegersberg__Deita-1_8B\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-01-17T18:22:52.012956](https://huggingface.co/datasets/open-llm-leaderboard/details_KnutJaegersberg__Deita-1_8B/blob/main/results_2024-01-17T18-22-52.012956.json)(note\
\ that their might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.4513695685616451,\n\
\ \"acc_stderr\": 0.03473174413713779,\n \"acc_norm\": 0.4572369573723266,\n\
\ \"acc_norm_stderr\": 0.035511285827617124,\n \"mc1\": 0.2521419828641371,\n\
\ \"mc1_stderr\": 0.015201522246299979,\n \"mc2\": 0.4002214148044727,\n\
\ \"mc2_stderr\": 0.014908452990717655\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.32081911262798635,\n \"acc_stderr\": 0.013640943091946522,\n\
\ \"acc_norm\": 0.3651877133105802,\n \"acc_norm_stderr\": 0.014070265519268802\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.4574785899223262,\n\
\ \"acc_stderr\": 0.004971704917267752,\n \"acc_norm\": 0.6062537343158734,\n\
\ \"acc_norm_stderr\": 0.004875812021461993\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \
\ \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.42962962962962964,\n\
\ \"acc_stderr\": 0.04276349494376599,\n \"acc_norm\": 0.42962962962962964,\n\
\ \"acc_norm_stderr\": 0.04276349494376599\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.5131578947368421,\n \"acc_stderr\": 0.04067533136309173,\n\
\ \"acc_norm\": 0.5131578947368421,\n \"acc_norm_stderr\": 0.04067533136309173\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.44,\n\
\ \"acc_stderr\": 0.04988876515698589,\n \"acc_norm\": 0.44,\n \
\ \"acc_norm_stderr\": 0.04988876515698589\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.4830188679245283,\n \"acc_stderr\": 0.030755120364119898,\n\
\ \"acc_norm\": 0.4830188679245283,\n \"acc_norm_stderr\": 0.030755120364119898\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.375,\n\
\ \"acc_stderr\": 0.04048439222695598,\n \"acc_norm\": 0.375,\n \
\ \"acc_norm_stderr\": 0.04048439222695598\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.38,\n \"acc_stderr\": 0.048783173121456316,\n \
\ \"acc_norm\": 0.38,\n \"acc_norm_stderr\": 0.048783173121456316\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"\
acc\": 0.4,\n \"acc_stderr\": 0.049236596391733084,\n \"acc_norm\"\
: 0.4,\n \"acc_norm_stderr\": 0.049236596391733084\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.28,\n \"acc_stderr\": 0.04512608598542127,\n \
\ \"acc_norm\": 0.28,\n \"acc_norm_stderr\": 0.04512608598542127\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.4393063583815029,\n\
\ \"acc_stderr\": 0.03784271932887467,\n \"acc_norm\": 0.4393063583815029,\n\
\ \"acc_norm_stderr\": 0.03784271932887467\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.3431372549019608,\n \"acc_stderr\": 0.04724007352383889,\n\
\ \"acc_norm\": 0.3431372549019608,\n \"acc_norm_stderr\": 0.04724007352383889\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.55,\n \"acc_stderr\": 0.05,\n \"acc_norm\": 0.55,\n \"\
acc_norm_stderr\": 0.05\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.3829787234042553,\n \"acc_stderr\": 0.03177821250236922,\n\
\ \"acc_norm\": 0.3829787234042553,\n \"acc_norm_stderr\": 0.03177821250236922\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.2982456140350877,\n\
\ \"acc_stderr\": 0.04303684033537315,\n \"acc_norm\": 0.2982456140350877,\n\
\ \"acc_norm_stderr\": 0.04303684033537315\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.42758620689655175,\n \"acc_stderr\": 0.041227371113703316,\n\
\ \"acc_norm\": 0.42758620689655175,\n \"acc_norm_stderr\": 0.041227371113703316\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.3148148148148148,\n \"acc_stderr\": 0.023919984164047732,\n \"\
acc_norm\": 0.3148148148148148,\n \"acc_norm_stderr\": 0.023919984164047732\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.35714285714285715,\n\
\ \"acc_stderr\": 0.04285714285714281,\n \"acc_norm\": 0.35714285714285715,\n\
\ \"acc_norm_stderr\": 0.04285714285714281\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \
\ \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.5225806451612903,\n\
\ \"acc_stderr\": 0.02841498501970786,\n \"acc_norm\": 0.5225806451612903,\n\
\ \"acc_norm_stderr\": 0.02841498501970786\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.4088669950738916,\n \"acc_stderr\": 0.034590588158832314,\n\
\ \"acc_norm\": 0.4088669950738916,\n \"acc_norm_stderr\": 0.034590588158832314\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.45,\n \"acc_stderr\": 0.049999999999999996,\n \"acc_norm\"\
: 0.45,\n \"acc_norm_stderr\": 0.049999999999999996\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.5333333333333333,\n \"acc_stderr\": 0.038956580652718446,\n\
\ \"acc_norm\": 0.5333333333333333,\n \"acc_norm_stderr\": 0.038956580652718446\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.5303030303030303,\n \"acc_stderr\": 0.03555804051763929,\n \"\
acc_norm\": 0.5303030303030303,\n \"acc_norm_stderr\": 0.03555804051763929\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.6010362694300518,\n \"acc_stderr\": 0.03533999094065696,\n\
\ \"acc_norm\": 0.6010362694300518,\n \"acc_norm_stderr\": 0.03533999094065696\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.43333333333333335,\n \"acc_stderr\": 0.02512465352588513,\n\
\ \"acc_norm\": 0.43333333333333335,\n \"acc_norm_stderr\": 0.02512465352588513\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.25555555555555554,\n \"acc_stderr\": 0.02659393910184405,\n \
\ \"acc_norm\": 0.25555555555555554,\n \"acc_norm_stderr\": 0.02659393910184405\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.4495798319327731,\n \"acc_stderr\": 0.03231293497137707,\n \
\ \"acc_norm\": 0.4495798319327731,\n \"acc_norm_stderr\": 0.03231293497137707\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.33774834437086093,\n \"acc_stderr\": 0.0386155754625517,\n \"\
acc_norm\": 0.33774834437086093,\n \"acc_norm_stderr\": 0.0386155754625517\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.5614678899082569,\n \"acc_stderr\": 0.021274713073954572,\n \"\
acc_norm\": 0.5614678899082569,\n \"acc_norm_stderr\": 0.021274713073954572\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.4398148148148148,\n \"acc_stderr\": 0.03385177976044811,\n \"\
acc_norm\": 0.4398148148148148,\n \"acc_norm_stderr\": 0.03385177976044811\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.5147058823529411,\n \"acc_stderr\": 0.03507793834791324,\n \"\
acc_norm\": 0.5147058823529411,\n \"acc_norm_stderr\": 0.03507793834791324\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.6033755274261603,\n \"acc_stderr\": 0.03184399873811224,\n \
\ \"acc_norm\": 0.6033755274261603,\n \"acc_norm_stderr\": 0.03184399873811224\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.4977578475336323,\n\
\ \"acc_stderr\": 0.033557465352232634,\n \"acc_norm\": 0.4977578475336323,\n\
\ \"acc_norm_stderr\": 0.033557465352232634\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.5114503816793893,\n \"acc_stderr\": 0.043841400240780176,\n\
\ \"acc_norm\": 0.5114503816793893,\n \"acc_norm_stderr\": 0.043841400240780176\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.6033057851239669,\n \"acc_stderr\": 0.044658697805310094,\n \"\
acc_norm\": 0.6033057851239669,\n \"acc_norm_stderr\": 0.044658697805310094\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.5277777777777778,\n\
\ \"acc_stderr\": 0.048262172941398944,\n \"acc_norm\": 0.5277777777777778,\n\
\ \"acc_norm_stderr\": 0.048262172941398944\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.44171779141104295,\n \"acc_stderr\": 0.03901591825836184,\n\
\ \"acc_norm\": 0.44171779141104295,\n \"acc_norm_stderr\": 0.03901591825836184\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.26785714285714285,\n\
\ \"acc_stderr\": 0.04203277291467762,\n \"acc_norm\": 0.26785714285714285,\n\
\ \"acc_norm_stderr\": 0.04203277291467762\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.6699029126213593,\n \"acc_stderr\": 0.046561471100123514,\n\
\ \"acc_norm\": 0.6699029126213593,\n \"acc_norm_stderr\": 0.046561471100123514\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.6709401709401709,\n\
\ \"acc_stderr\": 0.03078232157768817,\n \"acc_norm\": 0.6709401709401709,\n\
\ \"acc_norm_stderr\": 0.03078232157768817\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.54,\n \"acc_stderr\": 0.05009082659620332,\n \
\ \"acc_norm\": 0.54,\n \"acc_norm_stderr\": 0.05009082659620332\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.5504469987228607,\n\
\ \"acc_stderr\": 0.017788725283507337,\n \"acc_norm\": 0.5504469987228607,\n\
\ \"acc_norm_stderr\": 0.017788725283507337\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.48265895953757226,\n \"acc_stderr\": 0.026902900458666647,\n\
\ \"acc_norm\": 0.48265895953757226,\n \"acc_norm_stderr\": 0.026902900458666647\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.24804469273743016,\n\
\ \"acc_stderr\": 0.01444415780826144,\n \"acc_norm\": 0.24804469273743016,\n\
\ \"acc_norm_stderr\": 0.01444415780826144\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.5620915032679739,\n \"acc_stderr\": 0.02840830202033269,\n\
\ \"acc_norm\": 0.5620915032679739,\n \"acc_norm_stderr\": 0.02840830202033269\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.5048231511254019,\n\
\ \"acc_stderr\": 0.028396770444111298,\n \"acc_norm\": 0.5048231511254019,\n\
\ \"acc_norm_stderr\": 0.028396770444111298\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.45987654320987653,\n \"acc_stderr\": 0.02773102275353928,\n\
\ \"acc_norm\": 0.45987654320987653,\n \"acc_norm_stderr\": 0.02773102275353928\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.37943262411347517,\n \"acc_stderr\": 0.028947338851614105,\n \
\ \"acc_norm\": 0.37943262411347517,\n \"acc_norm_stderr\": 0.028947338851614105\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.3324641460234681,\n\
\ \"acc_stderr\": 0.012032022332260507,\n \"acc_norm\": 0.3324641460234681,\n\
\ \"acc_norm_stderr\": 0.012032022332260507\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.4007352941176471,\n \"acc_stderr\": 0.029768263528933102,\n\
\ \"acc_norm\": 0.4007352941176471,\n \"acc_norm_stderr\": 0.029768263528933102\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.4019607843137255,\n \"acc_stderr\": 0.019835176484375376,\n \
\ \"acc_norm\": 0.4019607843137255,\n \"acc_norm_stderr\": 0.019835176484375376\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.5363636363636364,\n\
\ \"acc_stderr\": 0.04776449162396197,\n \"acc_norm\": 0.5363636363636364,\n\
\ \"acc_norm_stderr\": 0.04776449162396197\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.5387755102040817,\n \"acc_stderr\": 0.031912820526692774,\n\
\ \"acc_norm\": 0.5387755102040817,\n \"acc_norm_stderr\": 0.031912820526692774\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.582089552238806,\n\
\ \"acc_stderr\": 0.034875586404620636,\n \"acc_norm\": 0.582089552238806,\n\
\ \"acc_norm_stderr\": 0.034875586404620636\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.68,\n \"acc_stderr\": 0.046882617226215034,\n \
\ \"acc_norm\": 0.68,\n \"acc_norm_stderr\": 0.046882617226215034\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.40963855421686746,\n\
\ \"acc_stderr\": 0.03828401115079023,\n \"acc_norm\": 0.40963855421686746,\n\
\ \"acc_norm_stderr\": 0.03828401115079023\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.5497076023391813,\n \"acc_stderr\": 0.038158273659132366,\n\
\ \"acc_norm\": 0.5497076023391813,\n \"acc_norm_stderr\": 0.038158273659132366\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.2521419828641371,\n\
\ \"mc1_stderr\": 0.015201522246299979,\n \"mc2\": 0.4002214148044727,\n\
\ \"mc2_stderr\": 0.014908452990717655\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.5935280189423836,\n \"acc_stderr\": 0.013804448697753376\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.1561789234268385,\n \
\ \"acc_stderr\": 0.00999950936975745\n }\n}\n```"
repo_url: https://huggingface.co/KnutJaegersberg/Deita-1_8B
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_01_17T18_22_52.012956
path:
- '**/details_harness|arc:challenge|25_2024-01-17T18-22-52.012956.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-01-17T18-22-52.012956.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_01_17T18_22_52.012956
path:
- '**/details_harness|gsm8k|5_2024-01-17T18-22-52.012956.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-01-17T18-22-52.012956.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_01_17T18_22_52.012956
path:
- '**/details_harness|hellaswag|10_2024-01-17T18-22-52.012956.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-01-17T18-22-52.012956.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_01_17T18_22_52.012956
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-17T18-22-52.012956.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-17T18-22-52.012956.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-17T18-22-52.012956.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-17T18-22-52.012956.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-17T18-22-52.012956.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-17T18-22-52.012956.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-17T18-22-52.012956.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-17T18-22-52.012956.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-17T18-22-52.012956.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-17T18-22-52.012956.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-17T18-22-52.012956.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-17T18-22-52.012956.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-17T18-22-52.012956.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-17T18-22-52.012956.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-17T18-22-52.012956.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-17T18-22-52.012956.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-17T18-22-52.012956.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-17T18-22-52.012956.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-17T18-22-52.012956.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-17T18-22-52.012956.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-17T18-22-52.012956.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-17T18-22-52.012956.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-17T18-22-52.012956.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-17T18-22-52.012956.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-17T18-22-52.012956.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-17T18-22-52.012956.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-17T18-22-52.012956.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-17T18-22-52.012956.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-17T18-22-52.012956.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-17T18-22-52.012956.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-17T18-22-52.012956.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-17T18-22-52.012956.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-17T18-22-52.012956.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-17T18-22-52.012956.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-01-17T18-22-52.012956.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-17T18-22-52.012956.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-17T18-22-52.012956.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-17T18-22-52.012956.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-01-17T18-22-52.012956.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-01-17T18-22-52.012956.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-17T18-22-52.012956.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-17T18-22-52.012956.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-17T18-22-52.012956.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-17T18-22-52.012956.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-17T18-22-52.012956.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-17T18-22-52.012956.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-17T18-22-52.012956.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-17T18-22-52.012956.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-17T18-22-52.012956.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-17T18-22-52.012956.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-17T18-22-52.012956.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-17T18-22-52.012956.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-17T18-22-52.012956.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-01-17T18-22-52.012956.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-17T18-22-52.012956.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-01-17T18-22-52.012956.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-17T18-22-52.012956.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-17T18-22-52.012956.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-17T18-22-52.012956.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-17T18-22-52.012956.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-17T18-22-52.012956.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-17T18-22-52.012956.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-17T18-22-52.012956.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-17T18-22-52.012956.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-17T18-22-52.012956.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-17T18-22-52.012956.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-17T18-22-52.012956.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-17T18-22-52.012956.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-17T18-22-52.012956.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-17T18-22-52.012956.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-17T18-22-52.012956.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-17T18-22-52.012956.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-17T18-22-52.012956.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-17T18-22-52.012956.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-17T18-22-52.012956.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-17T18-22-52.012956.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-17T18-22-52.012956.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-17T18-22-52.012956.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-17T18-22-52.012956.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-17T18-22-52.012956.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-17T18-22-52.012956.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-17T18-22-52.012956.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-17T18-22-52.012956.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-17T18-22-52.012956.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-17T18-22-52.012956.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-17T18-22-52.012956.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-17T18-22-52.012956.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-17T18-22-52.012956.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-17T18-22-52.012956.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-17T18-22-52.012956.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-17T18-22-52.012956.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-01-17T18-22-52.012956.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-17T18-22-52.012956.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-17T18-22-52.012956.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-17T18-22-52.012956.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-01-17T18-22-52.012956.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-01-17T18-22-52.012956.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-17T18-22-52.012956.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-17T18-22-52.012956.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-17T18-22-52.012956.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-17T18-22-52.012956.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-17T18-22-52.012956.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-17T18-22-52.012956.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-17T18-22-52.012956.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-17T18-22-52.012956.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-17T18-22-52.012956.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-17T18-22-52.012956.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-17T18-22-52.012956.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-17T18-22-52.012956.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-17T18-22-52.012956.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-01-17T18-22-52.012956.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-17T18-22-52.012956.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-01-17T18-22-52.012956.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-17T18-22-52.012956.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_01_17T18_22_52.012956
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-17T18-22-52.012956.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-17T18-22-52.012956.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_01_17T18_22_52.012956
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-17T18-22-52.012956.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-17T18-22-52.012956.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_01_17T18_22_52.012956
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-17T18-22-52.012956.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-17T18-22-52.012956.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_01_17T18_22_52.012956
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-17T18-22-52.012956.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-17T18-22-52.012956.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_01_17T18_22_52.012956
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-17T18-22-52.012956.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-17T18-22-52.012956.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_01_17T18_22_52.012956
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-17T18-22-52.012956.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-17T18-22-52.012956.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_01_17T18_22_52.012956
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-17T18-22-52.012956.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-17T18-22-52.012956.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_01_17T18_22_52.012956
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-17T18-22-52.012956.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-17T18-22-52.012956.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_01_17T18_22_52.012956
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-17T18-22-52.012956.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-17T18-22-52.012956.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_01_17T18_22_52.012956
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-17T18-22-52.012956.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-17T18-22-52.012956.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_01_17T18_22_52.012956
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-17T18-22-52.012956.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-17T18-22-52.012956.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_01_17T18_22_52.012956
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-17T18-22-52.012956.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-17T18-22-52.012956.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_01_17T18_22_52.012956
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-17T18-22-52.012956.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-17T18-22-52.012956.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_01_17T18_22_52.012956
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-17T18-22-52.012956.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-17T18-22-52.012956.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_01_17T18_22_52.012956
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-17T18-22-52.012956.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-17T18-22-52.012956.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_01_17T18_22_52.012956
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-17T18-22-52.012956.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-17T18-22-52.012956.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_01_17T18_22_52.012956
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-17T18-22-52.012956.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-17T18-22-52.012956.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_01_17T18_22_52.012956
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-17T18-22-52.012956.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-17T18-22-52.012956.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_01_17T18_22_52.012956
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-17T18-22-52.012956.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-17T18-22-52.012956.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_01_17T18_22_52.012956
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-17T18-22-52.012956.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-17T18-22-52.012956.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_01_17T18_22_52.012956
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-17T18-22-52.012956.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-17T18-22-52.012956.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_01_17T18_22_52.012956
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-17T18-22-52.012956.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-17T18-22-52.012956.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_01_17T18_22_52.012956
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-17T18-22-52.012956.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-17T18-22-52.012956.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_01_17T18_22_52.012956
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-17T18-22-52.012956.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-17T18-22-52.012956.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_01_17T18_22_52.012956
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-17T18-22-52.012956.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-17T18-22-52.012956.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_01_17T18_22_52.012956
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-17T18-22-52.012956.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-17T18-22-52.012956.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_01_17T18_22_52.012956
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-17T18-22-52.012956.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-17T18-22-52.012956.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_01_17T18_22_52.012956
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-17T18-22-52.012956.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-17T18-22-52.012956.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_01_17T18_22_52.012956
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-17T18-22-52.012956.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-17T18-22-52.012956.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_01_17T18_22_52.012956
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-17T18-22-52.012956.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-17T18-22-52.012956.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_01_17T18_22_52.012956
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-17T18-22-52.012956.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-17T18-22-52.012956.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_01_17T18_22_52.012956
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-17T18-22-52.012956.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-17T18-22-52.012956.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_01_17T18_22_52.012956
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-17T18-22-52.012956.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-17T18-22-52.012956.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_01_17T18_22_52.012956
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-17T18-22-52.012956.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-17T18-22-52.012956.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_01_17T18_22_52.012956
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-01-17T18-22-52.012956.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-01-17T18-22-52.012956.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_01_17T18_22_52.012956
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-17T18-22-52.012956.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-17T18-22-52.012956.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_01_17T18_22_52.012956
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-17T18-22-52.012956.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-17T18-22-52.012956.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_01_17T18_22_52.012956
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-17T18-22-52.012956.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-17T18-22-52.012956.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_01_17T18_22_52.012956
path:
- '**/details_harness|hendrycksTest-management|5_2024-01-17T18-22-52.012956.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-01-17T18-22-52.012956.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_01_17T18_22_52.012956
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-01-17T18-22-52.012956.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-01-17T18-22-52.012956.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_01_17T18_22_52.012956
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-17T18-22-52.012956.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-17T18-22-52.012956.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_01_17T18_22_52.012956
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-17T18-22-52.012956.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-17T18-22-52.012956.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_01_17T18_22_52.012956
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-17T18-22-52.012956.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-17T18-22-52.012956.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_01_17T18_22_52.012956
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-17T18-22-52.012956.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-17T18-22-52.012956.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_01_17T18_22_52.012956
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-17T18-22-52.012956.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-17T18-22-52.012956.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_01_17T18_22_52.012956
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-17T18-22-52.012956.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-17T18-22-52.012956.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_01_17T18_22_52.012956
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-17T18-22-52.012956.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-17T18-22-52.012956.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_01_17T18_22_52.012956
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-17T18-22-52.012956.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-17T18-22-52.012956.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_01_17T18_22_52.012956
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-17T18-22-52.012956.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-17T18-22-52.012956.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_01_17T18_22_52.012956
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-17T18-22-52.012956.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-17T18-22-52.012956.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_01_17T18_22_52.012956
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-17T18-22-52.012956.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-17T18-22-52.012956.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_01_17T18_22_52.012956
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-17T18-22-52.012956.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-17T18-22-52.012956.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_01_17T18_22_52.012956
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-17T18-22-52.012956.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-17T18-22-52.012956.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_01_17T18_22_52.012956
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-01-17T18-22-52.012956.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-01-17T18-22-52.012956.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_01_17T18_22_52.012956
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-17T18-22-52.012956.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-17T18-22-52.012956.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_01_17T18_22_52.012956
path:
- '**/details_harness|hendrycksTest-virology|5_2024-01-17T18-22-52.012956.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-01-17T18-22-52.012956.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_01_17T18_22_52.012956
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-17T18-22-52.012956.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-17T18-22-52.012956.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_01_17T18_22_52.012956
path:
- '**/details_harness|truthfulqa:mc|0_2024-01-17T18-22-52.012956.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-01-17T18-22-52.012956.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_01_17T18_22_52.012956
path:
- '**/details_harness|winogrande|5_2024-01-17T18-22-52.012956.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-01-17T18-22-52.012956.parquet'
- config_name: results
data_files:
- split: 2024_01_17T18_22_52.012956
path:
- results_2024-01-17T18-22-52.012956.parquet
- split: latest
path:
- results_2024-01-17T18-22-52.012956.parquet
---
# Dataset Card for Evaluation run of KnutJaegersberg/Deita-1_8B
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [KnutJaegersberg/Deita-1_8B](https://huggingface.co/KnutJaegersberg/Deita-1_8B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_KnutJaegersberg__Deita-1_8B",
"harness_winogrande_5",
	split="latest")
```
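The timestamped split names listed in this card's config follow directly from the run timestamp. A minimal sketch of the mapping, inferred from the split names shown above (the helper name is illustrative, not part of any library):

```python
def timestamp_to_split(ts: str) -> str:
    """Map a run timestamp to its split name, as used in this card's config.

    The split names replace "-" and ":" in the ISO timestamp with "_",
    e.g. "2024-01-17T18:22:52.012956" -> "2024_01_17T18_22_52.012956".
    """
    return ts.replace("-", "_").replace(":", "_")

print(timestamp_to_split("2024-01-17T18:22:52.012956"))
```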
## Latest results
These are the [latest results from run 2024-01-17T18:22:52.012956](https://huggingface.co/datasets/open-llm-leaderboard/details_KnutJaegersberg__Deita-1_8B/blob/main/results_2024-01-17T18-22-52.012956.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.4513695685616451,
"acc_stderr": 0.03473174413713779,
"acc_norm": 0.4572369573723266,
"acc_norm_stderr": 0.035511285827617124,
"mc1": 0.2521419828641371,
"mc1_stderr": 0.015201522246299979,
"mc2": 0.4002214148044727,
"mc2_stderr": 0.014908452990717655
},
"harness|arc:challenge|25": {
"acc": 0.32081911262798635,
"acc_stderr": 0.013640943091946522,
"acc_norm": 0.3651877133105802,
"acc_norm_stderr": 0.014070265519268802
},
"harness|hellaswag|10": {
"acc": 0.4574785899223262,
"acc_stderr": 0.004971704917267752,
"acc_norm": 0.6062537343158734,
"acc_norm_stderr": 0.004875812021461993
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.42962962962962964,
"acc_stderr": 0.04276349494376599,
"acc_norm": 0.42962962962962964,
"acc_norm_stderr": 0.04276349494376599
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.5131578947368421,
"acc_stderr": 0.04067533136309173,
"acc_norm": 0.5131578947368421,
"acc_norm_stderr": 0.04067533136309173
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.44,
"acc_stderr": 0.04988876515698589,
"acc_norm": 0.44,
"acc_norm_stderr": 0.04988876515698589
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.4830188679245283,
"acc_stderr": 0.030755120364119898,
"acc_norm": 0.4830188679245283,
"acc_norm_stderr": 0.030755120364119898
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.375,
"acc_stderr": 0.04048439222695598,
"acc_norm": 0.375,
"acc_norm_stderr": 0.04048439222695598
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.38,
"acc_stderr": 0.048783173121456316,
"acc_norm": 0.38,
"acc_norm_stderr": 0.048783173121456316
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.4,
"acc_stderr": 0.049236596391733084,
"acc_norm": 0.4,
"acc_norm_stderr": 0.049236596391733084
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.28,
"acc_stderr": 0.04512608598542127,
"acc_norm": 0.28,
"acc_norm_stderr": 0.04512608598542127
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.4393063583815029,
"acc_stderr": 0.03784271932887467,
"acc_norm": 0.4393063583815029,
"acc_norm_stderr": 0.03784271932887467
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.3431372549019608,
"acc_stderr": 0.04724007352383889,
"acc_norm": 0.3431372549019608,
"acc_norm_stderr": 0.04724007352383889
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.55,
"acc_stderr": 0.05,
"acc_norm": 0.55,
"acc_norm_stderr": 0.05
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.3829787234042553,
"acc_stderr": 0.03177821250236922,
"acc_norm": 0.3829787234042553,
"acc_norm_stderr": 0.03177821250236922
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.2982456140350877,
"acc_stderr": 0.04303684033537315,
"acc_norm": 0.2982456140350877,
"acc_norm_stderr": 0.04303684033537315
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.42758620689655175,
"acc_stderr": 0.041227371113703316,
"acc_norm": 0.42758620689655175,
"acc_norm_stderr": 0.041227371113703316
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.3148148148148148,
"acc_stderr": 0.023919984164047732,
"acc_norm": 0.3148148148148148,
"acc_norm_stderr": 0.023919984164047732
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.35714285714285715,
"acc_stderr": 0.04285714285714281,
"acc_norm": 0.35714285714285715,
"acc_norm_stderr": 0.04285714285714281
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.5225806451612903,
"acc_stderr": 0.02841498501970786,
"acc_norm": 0.5225806451612903,
"acc_norm_stderr": 0.02841498501970786
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.4088669950738916,
"acc_stderr": 0.034590588158832314,
"acc_norm": 0.4088669950738916,
"acc_norm_stderr": 0.034590588158832314
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.45,
"acc_stderr": 0.049999999999999996,
"acc_norm": 0.45,
"acc_norm_stderr": 0.049999999999999996
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.5333333333333333,
"acc_stderr": 0.038956580652718446,
"acc_norm": 0.5333333333333333,
"acc_norm_stderr": 0.038956580652718446
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.5303030303030303,
"acc_stderr": 0.03555804051763929,
"acc_norm": 0.5303030303030303,
"acc_norm_stderr": 0.03555804051763929
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.6010362694300518,
"acc_stderr": 0.03533999094065696,
"acc_norm": 0.6010362694300518,
"acc_norm_stderr": 0.03533999094065696
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.43333333333333335,
"acc_stderr": 0.02512465352588513,
"acc_norm": 0.43333333333333335,
"acc_norm_stderr": 0.02512465352588513
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.25555555555555554,
"acc_stderr": 0.02659393910184405,
"acc_norm": 0.25555555555555554,
"acc_norm_stderr": 0.02659393910184405
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.4495798319327731,
"acc_stderr": 0.03231293497137707,
"acc_norm": 0.4495798319327731,
"acc_norm_stderr": 0.03231293497137707
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.33774834437086093,
"acc_stderr": 0.0386155754625517,
"acc_norm": 0.33774834437086093,
"acc_norm_stderr": 0.0386155754625517
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.5614678899082569,
"acc_stderr": 0.021274713073954572,
"acc_norm": 0.5614678899082569,
"acc_norm_stderr": 0.021274713073954572
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.4398148148148148,
"acc_stderr": 0.03385177976044811,
"acc_norm": 0.4398148148148148,
"acc_norm_stderr": 0.03385177976044811
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.5147058823529411,
"acc_stderr": 0.03507793834791324,
"acc_norm": 0.5147058823529411,
"acc_norm_stderr": 0.03507793834791324
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.6033755274261603,
"acc_stderr": 0.03184399873811224,
"acc_norm": 0.6033755274261603,
"acc_norm_stderr": 0.03184399873811224
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.4977578475336323,
"acc_stderr": 0.033557465352232634,
"acc_norm": 0.4977578475336323,
"acc_norm_stderr": 0.033557465352232634
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.5114503816793893,
"acc_stderr": 0.043841400240780176,
"acc_norm": 0.5114503816793893,
"acc_norm_stderr": 0.043841400240780176
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.6033057851239669,
"acc_stderr": 0.044658697805310094,
"acc_norm": 0.6033057851239669,
"acc_norm_stderr": 0.044658697805310094
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.5277777777777778,
"acc_stderr": 0.048262172941398944,
"acc_norm": 0.5277777777777778,
"acc_norm_stderr": 0.048262172941398944
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.44171779141104295,
"acc_stderr": 0.03901591825836184,
"acc_norm": 0.44171779141104295,
"acc_norm_stderr": 0.03901591825836184
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.26785714285714285,
"acc_stderr": 0.04203277291467762,
"acc_norm": 0.26785714285714285,
"acc_norm_stderr": 0.04203277291467762
},
"harness|hendrycksTest-management|5": {
"acc": 0.6699029126213593,
"acc_stderr": 0.046561471100123514,
"acc_norm": 0.6699029126213593,
"acc_norm_stderr": 0.046561471100123514
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.6709401709401709,
"acc_stderr": 0.03078232157768817,
"acc_norm": 0.6709401709401709,
"acc_norm_stderr": 0.03078232157768817
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.54,
"acc_stderr": 0.05009082659620332,
"acc_norm": 0.54,
"acc_norm_stderr": 0.05009082659620332
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.5504469987228607,
"acc_stderr": 0.017788725283507337,
"acc_norm": 0.5504469987228607,
"acc_norm_stderr": 0.017788725283507337
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.48265895953757226,
"acc_stderr": 0.026902900458666647,
"acc_norm": 0.48265895953757226,
"acc_norm_stderr": 0.026902900458666647
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.24804469273743016,
"acc_stderr": 0.01444415780826144,
"acc_norm": 0.24804469273743016,
"acc_norm_stderr": 0.01444415780826144
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.5620915032679739,
"acc_stderr": 0.02840830202033269,
"acc_norm": 0.5620915032679739,
"acc_norm_stderr": 0.02840830202033269
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.5048231511254019,
"acc_stderr": 0.028396770444111298,
"acc_norm": 0.5048231511254019,
"acc_norm_stderr": 0.028396770444111298
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.45987654320987653,
"acc_stderr": 0.02773102275353928,
"acc_norm": 0.45987654320987653,
"acc_norm_stderr": 0.02773102275353928
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.37943262411347517,
"acc_stderr": 0.028947338851614105,
"acc_norm": 0.37943262411347517,
"acc_norm_stderr": 0.028947338851614105
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.3324641460234681,
"acc_stderr": 0.012032022332260507,
"acc_norm": 0.3324641460234681,
"acc_norm_stderr": 0.012032022332260507
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.4007352941176471,
"acc_stderr": 0.029768263528933102,
"acc_norm": 0.4007352941176471,
"acc_norm_stderr": 0.029768263528933102
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.4019607843137255,
"acc_stderr": 0.019835176484375376,
"acc_norm": 0.4019607843137255,
"acc_norm_stderr": 0.019835176484375376
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.5363636363636364,
"acc_stderr": 0.04776449162396197,
"acc_norm": 0.5363636363636364,
"acc_norm_stderr": 0.04776449162396197
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.5387755102040817,
"acc_stderr": 0.031912820526692774,
"acc_norm": 0.5387755102040817,
"acc_norm_stderr": 0.031912820526692774
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.582089552238806,
"acc_stderr": 0.034875586404620636,
"acc_norm": 0.582089552238806,
"acc_norm_stderr": 0.034875586404620636
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.68,
"acc_stderr": 0.046882617226215034,
"acc_norm": 0.68,
"acc_norm_stderr": 0.046882617226215034
},
"harness|hendrycksTest-virology|5": {
"acc": 0.40963855421686746,
"acc_stderr": 0.03828401115079023,
"acc_norm": 0.40963855421686746,
"acc_norm_stderr": 0.03828401115079023
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.5497076023391813,
"acc_stderr": 0.038158273659132366,
"acc_norm": 0.5497076023391813,
"acc_norm_stderr": 0.038158273659132366
},
"harness|truthfulqa:mc|0": {
"mc1": 0.2521419828641371,
"mc1_stderr": 0.015201522246299979,
"mc2": 0.4002214148044727,
"mc2_stderr": 0.014908452990717655
},
"harness|winogrande|5": {
"acc": 0.5935280189423836,
"acc_stderr": 0.013804448697753376
},
"harness|gsm8k|5": {
"acc": 0.1561789234268385,
"acc_stderr": 0.00999950936975745
}
}
```
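Once loaded, the per-task entries above can be aggregated directly from the results dict. A small sketch using a few of the actual values from this run (the variable names are illustrative; the keys and layout follow the JSON shown above):

```python
# Sample of the results dict shown in this card.
results = {
    "harness|hendrycksTest-abstract_algebra|5": {"acc": 0.3},
    "harness|hendrycksTest-anatomy|5": {"acc": 0.42962962962962964},
    "harness|winogrande|5": {"acc": 0.5935280189423836},
}

# Keep only the MMLU (hendrycksTest) tasks and average their accuracies.
mmlu = {k: v["acc"] for k, v in results.items() if "hendrycksTest" in k}
mmlu_avg = sum(mmlu.values()) / len(mmlu)
print(f"MMLU tasks: {len(mmlu)}, mean acc: {mmlu_avg:.4f}")
```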
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
kaleemWaheed/twitter_dataset_1713012544 | ---
dataset_info:
features:
- name: id
dtype: string
- name: tweet_content
dtype: string
- name: user_name
dtype: string
- name: user_id
dtype: string
- name: created_at
dtype: string
- name: url
dtype: string
- name: favourite_count
dtype: int64
- name: scraped_at
dtype: string
- name: image_urls
dtype: string
splits:
- name: train
num_bytes: 8734
num_examples: 23
download_size: 8548
dataset_size: 8734
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
datablations/c4-filter-small | ---
dataset_info:
features:
- name: text
dtype: string
- name: timestamp
dtype: string
- name: url
dtype: string
- name: meta
struct:
- name: perplexity_score
dtype: float64
- name: text_length
dtype: int64
- name: domain
dtype: 'null'
- name: perplexity
dtype: float64
- name: dup_ratio
dtype: float64
- name: pairs
sequence:
sequence: int64
- name: repetitions
sequence: binary
- name: cluster
sequence: int64
splits:
- name: train
num_bytes: 236459743
num_examples: 100000
download_size: 140935431
dataset_size: 236459743
---
# Dataset Card for "small-c4"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
bezzam/DigiCam-Mirflickr-SingleMask-25K | ---
license: mit
dataset_info:
features:
- name: lensless
dtype: image
- name: lensed
dtype: image
splits:
- name: train
num_bytes: 10018033680.25
num_examples: 21250
- name: test
num_bytes: 1770951479.25
num_examples: 3750
download_size: 11881776054
dataset_size: 11788985159.5
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
---
|
open-llm-leaderboard/details_tushar310__Hippy-AAI-7B | ---
pretty_name: Evaluation run of tushar310/Hippy-AAI-7B
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [tushar310/Hippy-AAI-7B](https://huggingface.co/tushar310/Hippy-AAI-7B) on the\
\ [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configuration, each one coresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 3 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_tushar310__Hippy-AAI-7B\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-03-14T17:04:40.503457](https://huggingface.co/datasets/open-llm-leaderboard/details_tushar310__Hippy-AAI-7B/blob/main/results_2024-03-14T17-04-40.503457.json)(note\
\ that their might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.656531979981034,\n\
\ \"acc_stderr\": 0.031962680832224005,\n \"acc_norm\": 0.6564386700118126,\n\
\ \"acc_norm_stderr\": 0.03262191996940871,\n \"mc1\": 0.5703794369645043,\n\
\ \"mc1_stderr\": 0.017329234580409095,\n \"mc2\": 0.7194654727455502,\n\
\ \"mc2_stderr\": 0.01461143462108703\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.6860068259385665,\n \"acc_stderr\": 0.013562691224726293,\n\
\ \"acc_norm\": 0.7158703071672355,\n \"acc_norm_stderr\": 0.013179442447653886\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.7046405098585939,\n\
\ \"acc_stderr\": 0.004552718360513099,\n \"acc_norm\": 0.8807010555666202,\n\
\ \"acc_norm_stderr\": 0.0032347749806479515\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695235,\n \
\ \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695235\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6666666666666666,\n\
\ \"acc_stderr\": 0.04072314811876837,\n \"acc_norm\": 0.6666666666666666,\n\
\ \"acc_norm_stderr\": 0.04072314811876837\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.7039473684210527,\n \"acc_stderr\": 0.03715062154998904,\n\
\ \"acc_norm\": 0.7039473684210527,\n \"acc_norm_stderr\": 0.03715062154998904\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.64,\n\
\ \"acc_stderr\": 0.04824181513244218,\n \"acc_norm\": 0.64,\n \
\ \"acc_norm_stderr\": 0.04824181513244218\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.7018867924528301,\n \"acc_stderr\": 0.028152837942493864,\n\
\ \"acc_norm\": 0.7018867924528301,\n \"acc_norm_stderr\": 0.028152837942493864\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7847222222222222,\n\
\ \"acc_stderr\": 0.03437079344106135,\n \"acc_norm\": 0.7847222222222222,\n\
\ \"acc_norm_stderr\": 0.03437079344106135\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.47,\n \"acc_stderr\": 0.050161355804659205,\n \
\ \"acc_norm\": 0.47,\n \"acc_norm_stderr\": 0.050161355804659205\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"\
acc\": 0.56,\n \"acc_stderr\": 0.04988876515698589,\n \"acc_norm\"\
: 0.56,\n \"acc_norm_stderr\": 0.04988876515698589\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \
\ \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6763005780346821,\n\
\ \"acc_stderr\": 0.0356760379963917,\n \"acc_norm\": 0.6763005780346821,\n\
\ \"acc_norm_stderr\": 0.0356760379963917\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.4411764705882353,\n \"acc_stderr\": 0.049406356306056595,\n\
\ \"acc_norm\": 0.4411764705882353,\n \"acc_norm_stderr\": 0.049406356306056595\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.77,\n \"acc_stderr\": 0.04229525846816506,\n \"acc_norm\": 0.77,\n\
\ \"acc_norm_stderr\": 0.04229525846816506\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.5787234042553191,\n \"acc_stderr\": 0.03227834510146268,\n\
\ \"acc_norm\": 0.5787234042553191,\n \"acc_norm_stderr\": 0.03227834510146268\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.4824561403508772,\n\
\ \"acc_stderr\": 0.04700708033551038,\n \"acc_norm\": 0.4824561403508772,\n\
\ \"acc_norm_stderr\": 0.04700708033551038\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5241379310344828,\n \"acc_stderr\": 0.0416180850350153,\n\
\ \"acc_norm\": 0.5241379310344828,\n \"acc_norm_stderr\": 0.0416180850350153\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.4126984126984127,\n \"acc_stderr\": 0.02535574126305527,\n \"\
acc_norm\": 0.4126984126984127,\n \"acc_norm_stderr\": 0.02535574126305527\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.4523809523809524,\n\
\ \"acc_stderr\": 0.04451807959055328,\n \"acc_norm\": 0.4523809523809524,\n\
\ \"acc_norm_stderr\": 0.04451807959055328\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.35,\n \"acc_stderr\": 0.047937248544110196,\n \
\ \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.047937248544110196\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\"\
: 0.7870967741935484,\n \"acc_stderr\": 0.023287665127268542,\n \"\
acc_norm\": 0.7870967741935484,\n \"acc_norm_stderr\": 0.023287665127268542\n\
\ },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\"\
: 0.4975369458128079,\n \"acc_stderr\": 0.03517945038691063,\n \"\
acc_norm\": 0.4975369458128079,\n \"acc_norm_stderr\": 0.03517945038691063\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.7,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\"\
: 0.7,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7696969696969697,\n \"acc_stderr\": 0.0328766675860349,\n\
\ \"acc_norm\": 0.7696969696969697,\n \"acc_norm_stderr\": 0.0328766675860349\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.8080808080808081,\n \"acc_stderr\": 0.028057791672989017,\n \"\
acc_norm\": 0.8080808080808081,\n \"acc_norm_stderr\": 0.028057791672989017\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.9067357512953368,\n \"acc_stderr\": 0.02098685459328973,\n\
\ \"acc_norm\": 0.9067357512953368,\n \"acc_norm_stderr\": 0.02098685459328973\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.6692307692307692,\n \"acc_stderr\": 0.02385479568097112,\n \
\ \"acc_norm\": 0.6692307692307692,\n \"acc_norm_stderr\": 0.02385479568097112\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.32592592592592595,\n \"acc_stderr\": 0.028578348365473082,\n \
\ \"acc_norm\": 0.32592592592592595,\n \"acc_norm_stderr\": 0.028578348365473082\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.6764705882352942,\n \"acc_stderr\": 0.030388353551886793,\n\
\ \"acc_norm\": 0.6764705882352942,\n \"acc_norm_stderr\": 0.030388353551886793\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.3841059602649007,\n \"acc_stderr\": 0.03971301814719197,\n \"\
acc_norm\": 0.3841059602649007,\n \"acc_norm_stderr\": 0.03971301814719197\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8568807339449541,\n \"acc_stderr\": 0.01501446249716859,\n \"\
acc_norm\": 0.8568807339449541,\n \"acc_norm_stderr\": 0.01501446249716859\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.5231481481481481,\n \"acc_stderr\": 0.03406315360711507,\n \"\
acc_norm\": 0.5231481481481481,\n \"acc_norm_stderr\": 0.03406315360711507\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.8529411764705882,\n \"acc_stderr\": 0.024857478080250447,\n \"\
acc_norm\": 0.8529411764705882,\n \"acc_norm_stderr\": 0.024857478080250447\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.8059071729957806,\n \"acc_stderr\": 0.02574490253229092,\n \
\ \"acc_norm\": 0.8059071729957806,\n \"acc_norm_stderr\": 0.02574490253229092\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6860986547085202,\n\
\ \"acc_stderr\": 0.031146796482972465,\n \"acc_norm\": 0.6860986547085202,\n\
\ \"acc_norm_stderr\": 0.031146796482972465\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.8015267175572519,\n \"acc_stderr\": 0.03498149385462472,\n\
\ \"acc_norm\": 0.8015267175572519,\n \"acc_norm_stderr\": 0.03498149385462472\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.7933884297520661,\n \"acc_stderr\": 0.03695980128098824,\n \"\
acc_norm\": 0.7933884297520661,\n \"acc_norm_stderr\": 0.03695980128098824\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7685185185185185,\n\
\ \"acc_stderr\": 0.04077494709252627,\n \"acc_norm\": 0.7685185185185185,\n\
\ \"acc_norm_stderr\": 0.04077494709252627\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7607361963190185,\n \"acc_stderr\": 0.0335195387952127,\n\
\ \"acc_norm\": 0.7607361963190185,\n \"acc_norm_stderr\": 0.0335195387952127\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.42857142857142855,\n\
\ \"acc_stderr\": 0.04697113923010212,\n \"acc_norm\": 0.42857142857142855,\n\
\ \"acc_norm_stderr\": 0.04697113923010212\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7766990291262136,\n \"acc_stderr\": 0.04123553189891431,\n\
\ \"acc_norm\": 0.7766990291262136,\n \"acc_norm_stderr\": 0.04123553189891431\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8846153846153846,\n\
\ \"acc_stderr\": 0.02093019318517933,\n \"acc_norm\": 0.8846153846153846,\n\
\ \"acc_norm_stderr\": 0.02093019318517933\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.71,\n \"acc_stderr\": 0.045604802157206845,\n \
\ \"acc_norm\": 0.71,\n \"acc_norm_stderr\": 0.045604802157206845\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8263090676883781,\n\
\ \"acc_stderr\": 0.01354741565866226,\n \"acc_norm\": 0.8263090676883781,\n\
\ \"acc_norm_stderr\": 0.01354741565866226\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.7514450867052023,\n \"acc_stderr\": 0.023267528432100174,\n\
\ \"acc_norm\": 0.7514450867052023,\n \"acc_norm_stderr\": 0.023267528432100174\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.4558659217877095,\n\
\ \"acc_stderr\": 0.016657229424586313,\n \"acc_norm\": 0.4558659217877095,\n\
\ \"acc_norm_stderr\": 0.016657229424586313\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.7352941176470589,\n \"acc_stderr\": 0.025261691219729484,\n\
\ \"acc_norm\": 0.7352941176470589,\n \"acc_norm_stderr\": 0.025261691219729484\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7138263665594855,\n\
\ \"acc_stderr\": 0.02567025924218893,\n \"acc_norm\": 0.7138263665594855,\n\
\ \"acc_norm_stderr\": 0.02567025924218893\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.7530864197530864,\n \"acc_stderr\": 0.02399350170904211,\n\
\ \"acc_norm\": 0.7530864197530864,\n \"acc_norm_stderr\": 0.02399350170904211\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.5,\n \"acc_stderr\": 0.029827499313594685,\n \"acc_norm\"\
: 0.5,\n \"acc_norm_stderr\": 0.029827499313594685\n },\n \"harness|hendrycksTest-professional_law|5\"\
: {\n \"acc\": 0.47522816166883963,\n \"acc_stderr\": 0.012754553719781753,\n\
\ \"acc_norm\": 0.47522816166883963,\n \"acc_norm_stderr\": 0.012754553719781753\n\
\ },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\"\
: 0.6875,\n \"acc_stderr\": 0.02815637344037142,\n \"acc_norm\": 0.6875,\n\
\ \"acc_norm_stderr\": 0.02815637344037142\n },\n \"harness|hendrycksTest-professional_psychology|5\"\
: {\n \"acc\": 0.6781045751633987,\n \"acc_stderr\": 0.018901015322093092,\n\
\ \"acc_norm\": 0.6781045751633987,\n \"acc_norm_stderr\": 0.018901015322093092\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6818181818181818,\n\
\ \"acc_stderr\": 0.044612721759105085,\n \"acc_norm\": 0.6818181818181818,\n\
\ \"acc_norm_stderr\": 0.044612721759105085\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.7428571428571429,\n \"acc_stderr\": 0.02797982353874455,\n\
\ \"acc_norm\": 0.7428571428571429,\n \"acc_norm_stderr\": 0.02797982353874455\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8407960199004975,\n\
\ \"acc_stderr\": 0.025870646766169136,\n \"acc_norm\": 0.8407960199004975,\n\
\ \"acc_norm_stderr\": 0.025870646766169136\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.85,\n \"acc_stderr\": 0.0358870281282637,\n \
\ \"acc_norm\": 0.85,\n \"acc_norm_stderr\": 0.0358870281282637\n },\n\
\ \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5542168674698795,\n\
\ \"acc_stderr\": 0.03869543323472101,\n \"acc_norm\": 0.5542168674698795,\n\
\ \"acc_norm_stderr\": 0.03869543323472101\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.847953216374269,\n \"acc_stderr\": 0.027539122889061456,\n\
\ \"acc_norm\": 0.847953216374269,\n \"acc_norm_stderr\": 0.027539122889061456\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.5703794369645043,\n\
\ \"mc1_stderr\": 0.017329234580409095,\n \"mc2\": 0.7194654727455502,\n\
\ \"mc2_stderr\": 0.01461143462108703\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.8232044198895028,\n \"acc_stderr\": 0.010721923287918742\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.7012888551933283,\n \
\ \"acc_stderr\": 0.012607137125693639\n }\n}\n```"
repo_url: https://huggingface.co/tushar310/Hippy-AAI-7B
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_03_14T16_34_56.130679
path:
- '**/details_harness|arc:challenge|25_2024-03-14T16-34-56.130679.parquet'
- split: 2024_03_14T16_52_41.281740
path:
- '**/details_harness|arc:challenge|25_2024-03-14T16-52-41.281740.parquet'
- split: 2024_03_14T17_04_40.503457
path:
- '**/details_harness|arc:challenge|25_2024-03-14T17-04-40.503457.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-03-14T17-04-40.503457.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_03_14T16_34_56.130679
path:
- '**/details_harness|gsm8k|5_2024-03-14T16-34-56.130679.parquet'
- split: 2024_03_14T16_52_41.281740
path:
- '**/details_harness|gsm8k|5_2024-03-14T16-52-41.281740.parquet'
- split: 2024_03_14T17_04_40.503457
path:
- '**/details_harness|gsm8k|5_2024-03-14T17-04-40.503457.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-03-14T17-04-40.503457.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_03_14T16_34_56.130679
path:
- '**/details_harness|hellaswag|10_2024-03-14T16-34-56.130679.parquet'
- split: 2024_03_14T16_52_41.281740
path:
- '**/details_harness|hellaswag|10_2024-03-14T16-52-41.281740.parquet'
- split: 2024_03_14T17_04_40.503457
path:
- '**/details_harness|hellaswag|10_2024-03-14T17-04-40.503457.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-03-14T17-04-40.503457.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_03_14T16_34_56.130679
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-14T16-34-56.130679.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-14T16-34-56.130679.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-14T16-34-56.130679.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-14T16-34-56.130679.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-14T16-34-56.130679.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-14T16-34-56.130679.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-14T16-34-56.130679.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-14T16-34-56.130679.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-14T16-34-56.130679.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-14T16-34-56.130679.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-14T16-34-56.130679.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-14T16-34-56.130679.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-14T16-34-56.130679.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-14T16-34-56.130679.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-14T16-34-56.130679.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-14T16-34-56.130679.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-14T16-34-56.130679.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-14T16-34-56.130679.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-14T16-34-56.130679.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-14T16-34-56.130679.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-14T16-34-56.130679.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-14T16-34-56.130679.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-14T16-34-56.130679.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-14T16-34-56.130679.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-14T16-34-56.130679.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-14T16-34-56.130679.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-14T16-34-56.130679.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-14T16-34-56.130679.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-14T16-34-56.130679.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-14T16-34-56.130679.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-14T16-34-56.130679.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-14T16-34-56.130679.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-14T16-34-56.130679.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-14T16-34-56.130679.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-14T16-34-56.130679.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-14T16-34-56.130679.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-14T16-34-56.130679.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-14T16-34-56.130679.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-14T16-34-56.130679.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-14T16-34-56.130679.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-14T16-34-56.130679.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-14T16-34-56.130679.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-14T16-34-56.130679.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-14T16-34-56.130679.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-14T16-34-56.130679.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-14T16-34-56.130679.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-14T16-34-56.130679.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-14T16-34-56.130679.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-14T16-34-56.130679.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-14T16-34-56.130679.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-14T16-34-56.130679.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-14T16-34-56.130679.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-14T16-34-56.130679.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-14T16-34-56.130679.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-14T16-34-56.130679.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-14T16-34-56.130679.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-14T16-34-56.130679.parquet'
- split: 2024_03_14T16_52_41.281740
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-14T16-52-41.281740.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-14T16-52-41.281740.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-14T16-52-41.281740.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-14T16-52-41.281740.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-14T16-52-41.281740.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-14T16-52-41.281740.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-14T16-52-41.281740.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-14T16-52-41.281740.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-14T16-52-41.281740.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-14T16-52-41.281740.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-14T16-52-41.281740.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-14T16-52-41.281740.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-14T16-52-41.281740.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-14T16-52-41.281740.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-14T16-52-41.281740.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-14T16-52-41.281740.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-14T16-52-41.281740.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-14T16-52-41.281740.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-14T16-52-41.281740.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-14T16-52-41.281740.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-14T16-52-41.281740.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-14T16-52-41.281740.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-14T16-52-41.281740.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-14T16-52-41.281740.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-14T16-52-41.281740.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-14T16-52-41.281740.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-14T16-52-41.281740.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-14T16-52-41.281740.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-14T16-52-41.281740.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-14T16-52-41.281740.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-14T16-52-41.281740.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-14T16-52-41.281740.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-14T16-52-41.281740.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-14T16-52-41.281740.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-14T16-52-41.281740.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-14T16-52-41.281740.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-14T16-52-41.281740.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-14T16-52-41.281740.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-14T16-52-41.281740.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-14T16-52-41.281740.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-14T16-52-41.281740.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-14T16-52-41.281740.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-14T16-52-41.281740.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-14T16-52-41.281740.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-14T16-52-41.281740.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-14T16-52-41.281740.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-14T16-52-41.281740.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-14T16-52-41.281740.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-14T16-52-41.281740.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-14T16-52-41.281740.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-14T16-52-41.281740.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-14T16-52-41.281740.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-14T16-52-41.281740.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-14T16-52-41.281740.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-14T16-52-41.281740.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-14T16-52-41.281740.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-14T16-52-41.281740.parquet'
- split: 2024_03_14T17_04_40.503457
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-14T17-04-40.503457.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-14T17-04-40.503457.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-14T17-04-40.503457.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-14T17-04-40.503457.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-14T17-04-40.503457.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-14T17-04-40.503457.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-14T17-04-40.503457.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-14T17-04-40.503457.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-14T17-04-40.503457.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-14T17-04-40.503457.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-14T17-04-40.503457.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-14T17-04-40.503457.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-14T17-04-40.503457.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-14T17-04-40.503457.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-14T17-04-40.503457.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-14T17-04-40.503457.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-14T17-04-40.503457.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-14T17-04-40.503457.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-14T17-04-40.503457.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-14T17-04-40.503457.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-14T17-04-40.503457.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-14T17-04-40.503457.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-14T17-04-40.503457.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-14T17-04-40.503457.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-14T17-04-40.503457.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-14T17-04-40.503457.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-14T17-04-40.503457.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-14T17-04-40.503457.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-14T17-04-40.503457.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-14T17-04-40.503457.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-14T17-04-40.503457.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-14T17-04-40.503457.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-14T17-04-40.503457.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-14T17-04-40.503457.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-14T17-04-40.503457.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-14T17-04-40.503457.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-14T17-04-40.503457.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-14T17-04-40.503457.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-14T17-04-40.503457.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-14T17-04-40.503457.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-14T17-04-40.503457.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-14T17-04-40.503457.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-14T17-04-40.503457.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-14T17-04-40.503457.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-14T17-04-40.503457.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-14T17-04-40.503457.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-14T17-04-40.503457.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-14T17-04-40.503457.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-14T17-04-40.503457.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-14T17-04-40.503457.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-14T17-04-40.503457.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-14T17-04-40.503457.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-14T17-04-40.503457.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-14T17-04-40.503457.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-14T17-04-40.503457.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-14T17-04-40.503457.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-14T17-04-40.503457.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-14T17-04-40.503457.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-14T17-04-40.503457.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-14T17-04-40.503457.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-14T17-04-40.503457.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-14T17-04-40.503457.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-14T17-04-40.503457.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-14T17-04-40.503457.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-14T17-04-40.503457.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-14T17-04-40.503457.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-14T17-04-40.503457.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-14T17-04-40.503457.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-14T17-04-40.503457.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-14T17-04-40.503457.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-14T17-04-40.503457.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-14T17-04-40.503457.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-14T17-04-40.503457.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-14T17-04-40.503457.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-14T17-04-40.503457.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-14T17-04-40.503457.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-14T17-04-40.503457.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-14T17-04-40.503457.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-14T17-04-40.503457.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-14T17-04-40.503457.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-14T17-04-40.503457.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-14T17-04-40.503457.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-14T17-04-40.503457.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-14T17-04-40.503457.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-14T17-04-40.503457.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-14T17-04-40.503457.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-14T17-04-40.503457.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-14T17-04-40.503457.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-14T17-04-40.503457.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-14T17-04-40.503457.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-14T17-04-40.503457.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-14T17-04-40.503457.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-14T17-04-40.503457.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-14T17-04-40.503457.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-14T17-04-40.503457.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-14T17-04-40.503457.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-14T17-04-40.503457.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-14T17-04-40.503457.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-14T17-04-40.503457.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-14T17-04-40.503457.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-14T17-04-40.503457.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-14T17-04-40.503457.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-14T17-04-40.503457.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-14T17-04-40.503457.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-14T17-04-40.503457.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-14T17-04-40.503457.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-14T17-04-40.503457.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-14T17-04-40.503457.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-14T17-04-40.503457.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-14T17-04-40.503457.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-14T17-04-40.503457.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-14T17-04-40.503457.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-14T17-04-40.503457.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-14T17-04-40.503457.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_03_14T16_34_56.130679
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-14T16-34-56.130679.parquet'
- split: 2024_03_14T16_52_41.281740
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-14T16-52-41.281740.parquet'
- split: 2024_03_14T17_04_40.503457
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-14T17-04-40.503457.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-14T17-04-40.503457.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_03_14T16_34_56.130679
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-14T16-34-56.130679.parquet'
- split: 2024_03_14T16_52_41.281740
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-14T16-52-41.281740.parquet'
- split: 2024_03_14T17_04_40.503457
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-14T17-04-40.503457.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-14T17-04-40.503457.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_03_14T16_34_56.130679
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-14T16-34-56.130679.parquet'
- split: 2024_03_14T16_52_41.281740
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-14T16-52-41.281740.parquet'
- split: 2024_03_14T17_04_40.503457
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-14T17-04-40.503457.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-14T17-04-40.503457.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_03_14T16_34_56.130679
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-14T16-34-56.130679.parquet'
- split: 2024_03_14T16_52_41.281740
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-14T16-52-41.281740.parquet'
- split: 2024_03_14T17_04_40.503457
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-14T17-04-40.503457.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-14T17-04-40.503457.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_03_14T16_34_56.130679
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-14T16-34-56.130679.parquet'
- split: 2024_03_14T16_52_41.281740
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-14T16-52-41.281740.parquet'
- split: 2024_03_14T17_04_40.503457
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-14T17-04-40.503457.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-14T17-04-40.503457.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_03_14T16_34_56.130679
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-14T16-34-56.130679.parquet'
- split: 2024_03_14T16_52_41.281740
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-14T16-52-41.281740.parquet'
- split: 2024_03_14T17_04_40.503457
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-14T17-04-40.503457.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-14T17-04-40.503457.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_03_14T16_34_56.130679
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-14T16-34-56.130679.parquet'
- split: 2024_03_14T16_52_41.281740
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-14T16-52-41.281740.parquet'
- split: 2024_03_14T17_04_40.503457
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-14T17-04-40.503457.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-14T17-04-40.503457.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_03_14T16_34_56.130679
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-14T16-34-56.130679.parquet'
- split: 2024_03_14T16_52_41.281740
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-14T16-52-41.281740.parquet'
- split: 2024_03_14T17_04_40.503457
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-14T17-04-40.503457.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-14T17-04-40.503457.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_03_14T16_34_56.130679
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-14T16-34-56.130679.parquet'
- split: 2024_03_14T16_52_41.281740
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-14T16-52-41.281740.parquet'
- split: 2024_03_14T17_04_40.503457
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-14T17-04-40.503457.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-14T17-04-40.503457.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_03_14T16_34_56.130679
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-14T16-34-56.130679.parquet'
- split: 2024_03_14T16_52_41.281740
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-14T16-52-41.281740.parquet'
- split: 2024_03_14T17_04_40.503457
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-14T17-04-40.503457.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-14T17-04-40.503457.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_03_14T16_34_56.130679
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-14T16-34-56.130679.parquet'
- split: 2024_03_14T16_52_41.281740
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-14T16-52-41.281740.parquet'
- split: 2024_03_14T17_04_40.503457
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-14T17-04-40.503457.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-14T17-04-40.503457.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_03_14T16_34_56.130679
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-14T16-34-56.130679.parquet'
- split: 2024_03_14T16_52_41.281740
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-14T16-52-41.281740.parquet'
- split: 2024_03_14T17_04_40.503457
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-14T17-04-40.503457.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-14T17-04-40.503457.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_03_14T16_34_56.130679
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-14T16-34-56.130679.parquet'
- split: 2024_03_14T16_52_41.281740
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-14T16-52-41.281740.parquet'
- split: 2024_03_14T17_04_40.503457
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-14T17-04-40.503457.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-14T17-04-40.503457.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_03_14T16_34_56.130679
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-14T16-34-56.130679.parquet'
- split: 2024_03_14T16_52_41.281740
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-14T16-52-41.281740.parquet'
- split: 2024_03_14T17_04_40.503457
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-14T17-04-40.503457.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-14T17-04-40.503457.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_03_14T16_34_56.130679
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-14T16-34-56.130679.parquet'
- split: 2024_03_14T16_52_41.281740
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-14T16-52-41.281740.parquet'
- split: 2024_03_14T17_04_40.503457
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-14T17-04-40.503457.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-14T17-04-40.503457.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_03_14T16_34_56.130679
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-14T16-34-56.130679.parquet'
- split: 2024_03_14T16_52_41.281740
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-14T16-52-41.281740.parquet'
- split: 2024_03_14T17_04_40.503457
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-14T17-04-40.503457.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-14T17-04-40.503457.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_03_14T16_34_56.130679
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-14T16-34-56.130679.parquet'
- split: 2024_03_14T16_52_41.281740
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-14T16-52-41.281740.parquet'
- split: 2024_03_14T17_04_40.503457
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-14T17-04-40.503457.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-14T17-04-40.503457.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_03_14T16_34_56.130679
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-14T16-34-56.130679.parquet'
- split: 2024_03_14T16_52_41.281740
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-14T16-52-41.281740.parquet'
- split: 2024_03_14T17_04_40.503457
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-14T17-04-40.503457.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-14T17-04-40.503457.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_03_14T16_34_56.130679
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-14T16-34-56.130679.parquet'
- split: 2024_03_14T16_52_41.281740
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-14T16-52-41.281740.parquet'
- split: 2024_03_14T17_04_40.503457
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-14T17-04-40.503457.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-14T17-04-40.503457.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_03_14T16_34_56.130679
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-14T16-34-56.130679.parquet'
- split: 2024_03_14T16_52_41.281740
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-14T16-52-41.281740.parquet'
- split: 2024_03_14T17_04_40.503457
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-14T17-04-40.503457.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-14T17-04-40.503457.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_03_14T16_34_56.130679
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-14T16-34-56.130679.parquet'
- split: 2024_03_14T16_52_41.281740
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-14T16-52-41.281740.parquet'
- split: 2024_03_14T17_04_40.503457
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-14T17-04-40.503457.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-14T17-04-40.503457.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_03_14T16_34_56.130679
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-14T16-34-56.130679.parquet'
- split: 2024_03_14T16_52_41.281740
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-14T16-52-41.281740.parquet'
- split: 2024_03_14T17_04_40.503457
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-14T17-04-40.503457.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-14T17-04-40.503457.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_03_14T16_34_56.130679
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-14T16-34-56.130679.parquet'
- split: 2024_03_14T16_52_41.281740
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-14T16-52-41.281740.parquet'
- split: 2024_03_14T17_04_40.503457
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-14T17-04-40.503457.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-14T17-04-40.503457.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_03_14T16_34_56.130679
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-14T16-34-56.130679.parquet'
- split: 2024_03_14T16_52_41.281740
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-14T16-52-41.281740.parquet'
- split: 2024_03_14T17_04_40.503457
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-14T17-04-40.503457.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-14T17-04-40.503457.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_03_14T16_34_56.130679
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-14T16-34-56.130679.parquet'
- split: 2024_03_14T16_52_41.281740
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-14T16-52-41.281740.parquet'
- split: 2024_03_14T17_04_40.503457
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-14T17-04-40.503457.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-14T17-04-40.503457.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_03_14T16_34_56.130679
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-14T16-34-56.130679.parquet'
- split: 2024_03_14T16_52_41.281740
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-14T16-52-41.281740.parquet'
- split: 2024_03_14T17_04_40.503457
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-14T17-04-40.503457.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-14T17-04-40.503457.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_03_14T16_34_56.130679
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-14T16-34-56.130679.parquet'
- split: 2024_03_14T16_52_41.281740
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-14T16-52-41.281740.parquet'
- split: 2024_03_14T17_04_40.503457
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-14T17-04-40.503457.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-14T17-04-40.503457.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_03_14T16_34_56.130679
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-14T16-34-56.130679.parquet'
- split: 2024_03_14T16_52_41.281740
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-14T16-52-41.281740.parquet'
- split: 2024_03_14T17_04_40.503457
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-14T17-04-40.503457.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-14T17-04-40.503457.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_03_14T16_34_56.130679
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-14T16-34-56.130679.parquet'
- split: 2024_03_14T16_52_41.281740
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-14T16-52-41.281740.parquet'
- split: 2024_03_14T17_04_40.503457
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-14T17-04-40.503457.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-14T17-04-40.503457.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_03_14T16_34_56.130679
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-14T16-34-56.130679.parquet'
- split: 2024_03_14T16_52_41.281740
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-14T16-52-41.281740.parquet'
- split: 2024_03_14T17_04_40.503457
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-14T17-04-40.503457.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-14T17-04-40.503457.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_03_14T16_34_56.130679
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-14T16-34-56.130679.parquet'
- split: 2024_03_14T16_52_41.281740
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-14T16-52-41.281740.parquet'
- split: 2024_03_14T17_04_40.503457
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-14T17-04-40.503457.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-14T17-04-40.503457.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_03_14T16_34_56.130679
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-14T16-34-56.130679.parquet'
- split: 2024_03_14T16_52_41.281740
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-14T16-52-41.281740.parquet'
- split: 2024_03_14T17_04_40.503457
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-14T17-04-40.503457.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-14T17-04-40.503457.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_03_14T16_34_56.130679
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-14T16-34-56.130679.parquet'
- split: 2024_03_14T16_52_41.281740
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-14T16-52-41.281740.parquet'
- split: 2024_03_14T17_04_40.503457
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-14T17-04-40.503457.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-14T17-04-40.503457.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_03_14T16_34_56.130679
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-14T16-34-56.130679.parquet'
- split: 2024_03_14T16_52_41.281740
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-14T16-52-41.281740.parquet'
- split: 2024_03_14T17_04_40.503457
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-14T17-04-40.503457.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-14T17-04-40.503457.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_03_14T16_34_56.130679
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-14T16-34-56.130679.parquet'
- split: 2024_03_14T16_52_41.281740
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-14T16-52-41.281740.parquet'
- split: 2024_03_14T17_04_40.503457
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-14T17-04-40.503457.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-14T17-04-40.503457.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_03_14T16_34_56.130679
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-14T16-34-56.130679.parquet'
- split: 2024_03_14T16_52_41.281740
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-14T16-52-41.281740.parquet'
- split: 2024_03_14T17_04_40.503457
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-14T17-04-40.503457.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-14T17-04-40.503457.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_03_14T16_34_56.130679
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-14T16-34-56.130679.parquet'
- split: 2024_03_14T16_52_41.281740
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-14T16-52-41.281740.parquet'
- split: 2024_03_14T17_04_40.503457
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-14T17-04-40.503457.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-14T17-04-40.503457.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_03_14T16_34_56.130679
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-14T16-34-56.130679.parquet'
- split: 2024_03_14T16_52_41.281740
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-14T16-52-41.281740.parquet'
- split: 2024_03_14T17_04_40.503457
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-14T17-04-40.503457.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-14T17-04-40.503457.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_03_14T16_34_56.130679
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-14T16-34-56.130679.parquet'
- split: 2024_03_14T16_52_41.281740
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-14T16-52-41.281740.parquet'
- split: 2024_03_14T17_04_40.503457
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-14T17-04-40.503457.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-14T17-04-40.503457.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_03_14T16_34_56.130679
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-14T16-34-56.130679.parquet'
- split: 2024_03_14T16_52_41.281740
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-14T16-52-41.281740.parquet'
- split: 2024_03_14T17_04_40.503457
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-14T17-04-40.503457.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-14T17-04-40.503457.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_03_14T16_34_56.130679
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-14T16-34-56.130679.parquet'
- split: 2024_03_14T16_52_41.281740
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-14T16-52-41.281740.parquet'
- split: 2024_03_14T17_04_40.503457
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-14T17-04-40.503457.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-14T17-04-40.503457.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_03_14T16_34_56.130679
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-14T16-34-56.130679.parquet'
- split: 2024_03_14T16_52_41.281740
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-14T16-52-41.281740.parquet'
- split: 2024_03_14T17_04_40.503457
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-14T17-04-40.503457.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-14T17-04-40.503457.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_03_14T16_34_56.130679
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-14T16-34-56.130679.parquet'
- split: 2024_03_14T16_52_41.281740
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-14T16-52-41.281740.parquet'
- split: 2024_03_14T17_04_40.503457
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-14T17-04-40.503457.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-14T17-04-40.503457.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_03_14T16_34_56.130679
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-14T16-34-56.130679.parquet'
- split: 2024_03_14T16_52_41.281740
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-14T16-52-41.281740.parquet'
- split: 2024_03_14T17_04_40.503457
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-14T17-04-40.503457.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-14T17-04-40.503457.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_03_14T16_34_56.130679
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-14T16-34-56.130679.parquet'
- split: 2024_03_14T16_52_41.281740
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-14T16-52-41.281740.parquet'
- split: 2024_03_14T17_04_40.503457
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-14T17-04-40.503457.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-14T17-04-40.503457.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_03_14T16_34_56.130679
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-14T16-34-56.130679.parquet'
- split: 2024_03_14T16_52_41.281740
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-14T16-52-41.281740.parquet'
- split: 2024_03_14T17_04_40.503457
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-14T17-04-40.503457.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-14T17-04-40.503457.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_03_14T16_34_56.130679
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-14T16-34-56.130679.parquet'
- split: 2024_03_14T16_52_41.281740
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-14T16-52-41.281740.parquet'
- split: 2024_03_14T17_04_40.503457
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-14T17-04-40.503457.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-14T17-04-40.503457.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_03_14T16_34_56.130679
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-14T16-34-56.130679.parquet'
- split: 2024_03_14T16_52_41.281740
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-14T16-52-41.281740.parquet'
- split: 2024_03_14T17_04_40.503457
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-14T17-04-40.503457.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-14T17-04-40.503457.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_03_14T16_34_56.130679
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-14T16-34-56.130679.parquet'
- split: 2024_03_14T16_52_41.281740
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-14T16-52-41.281740.parquet'
- split: 2024_03_14T17_04_40.503457
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-14T17-04-40.503457.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-14T17-04-40.503457.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_03_14T16_34_56.130679
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-14T16-34-56.130679.parquet'
- split: 2024_03_14T16_52_41.281740
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-14T16-52-41.281740.parquet'
- split: 2024_03_14T17_04_40.503457
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-14T17-04-40.503457.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-14T17-04-40.503457.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_03_14T16_34_56.130679
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-14T16-34-56.130679.parquet'
- split: 2024_03_14T16_52_41.281740
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-14T16-52-41.281740.parquet'
- split: 2024_03_14T17_04_40.503457
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-14T17-04-40.503457.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-14T17-04-40.503457.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_03_14T16_34_56.130679
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-14T16-34-56.130679.parquet'
- split: 2024_03_14T16_52_41.281740
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-14T16-52-41.281740.parquet'
- split: 2024_03_14T17_04_40.503457
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-14T17-04-40.503457.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-14T17-04-40.503457.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_03_14T16_34_56.130679
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-14T16-34-56.130679.parquet'
- split: 2024_03_14T16_52_41.281740
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-14T16-52-41.281740.parquet'
- split: 2024_03_14T17_04_40.503457
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-14T17-04-40.503457.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-14T17-04-40.503457.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_03_14T16_34_56.130679
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-14T16-34-56.130679.parquet'
- split: 2024_03_14T16_52_41.281740
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-14T16-52-41.281740.parquet'
- split: 2024_03_14T17_04_40.503457
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-14T17-04-40.503457.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-14T17-04-40.503457.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_03_14T16_34_56.130679
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-14T16-34-56.130679.parquet'
- split: 2024_03_14T16_52_41.281740
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-14T16-52-41.281740.parquet'
- split: 2024_03_14T17_04_40.503457
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-14T17-04-40.503457.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-14T17-04-40.503457.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_03_14T16_34_56.130679
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-14T16-34-56.130679.parquet'
- split: 2024_03_14T16_52_41.281740
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-14T16-52-41.281740.parquet'
- split: 2024_03_14T17_04_40.503457
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-14T17-04-40.503457.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-14T17-04-40.503457.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_03_14T16_34_56.130679
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-14T16-34-56.130679.parquet'
- split: 2024_03_14T16_52_41.281740
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-14T16-52-41.281740.parquet'
- split: 2024_03_14T17_04_40.503457
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-14T17-04-40.503457.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-14T17-04-40.503457.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_03_14T16_34_56.130679
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-14T16-34-56.130679.parquet'
- split: 2024_03_14T16_52_41.281740
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-14T16-52-41.281740.parquet'
- split: 2024_03_14T17_04_40.503457
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-14T17-04-40.503457.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-14T17-04-40.503457.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_03_14T16_34_56.130679
path:
- '**/details_harness|winogrande|5_2024-03-14T16-34-56.130679.parquet'
- split: 2024_03_14T16_52_41.281740
path:
- '**/details_harness|winogrande|5_2024-03-14T16-52-41.281740.parquet'
- split: 2024_03_14T17_04_40.503457
path:
- '**/details_harness|winogrande|5_2024-03-14T17-04-40.503457.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-03-14T17-04-40.503457.parquet'
- config_name: results
data_files:
- split: 2024_03_14T16_34_56.130679
path:
- results_2024-03-14T16-34-56.130679.parquet
- split: 2024_03_14T16_52_41.281740
path:
- results_2024-03-14T16-52-41.281740.parquet
- split: 2024_03_14T17_04_40.503457
path:
- results_2024-03-14T17-04-40.503457.parquet
- split: latest
path:
- results_2024-03-14T17-04-40.503457.parquet
---
# Dataset Card for Evaluation run of tushar310/Hippy-AAI-7B
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [tushar310/Hippy-AAI-7B](https://huggingface.co/tushar310/Hippy-AAI-7B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 3 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the most recent results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
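As a minimal sketch of the naming convention above (this helper is illustrative only, not part of the evaluation tooling): split names are derived from run timestamps by replacing the `-` and `:` separators with `_`, while the parquet file names keep `-` separators.

```python
def timestamp_to_split(ts: str) -> str:
    """Convert a run timestamp such as '2024-03-14T17:04:40.503457'
    into its split name, '2024_03_14T17_04_40.503457'."""
    date, time = ts.split("T")
    # Date part: '-' becomes '_'; time part: ':' becomes '_'.
    return date.replace("-", "_") + "T" + time.replace(":", "_")

print(timestamp_to_split("2024-03-14T17:04:40.503457"))
# → 2024_03_14T17_04_40.503457
```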
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_tushar310__Hippy-AAI-7B",
"harness_winogrande_5",
	split="latest")
```
## Latest results
These are the [latest results from run 2024-03-14T17:04:40.503457](https://huggingface.co/datasets/open-llm-leaderboard/details_tushar310__Hippy-AAI-7B/blob/main/results_2024-03-14T17-04-40.503457.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each one in the per-task configurations, under their "latest" splits):
```python
{
"all": {
"acc": 0.656531979981034,
"acc_stderr": 0.031962680832224005,
"acc_norm": 0.6564386700118126,
"acc_norm_stderr": 0.03262191996940871,
"mc1": 0.5703794369645043,
"mc1_stderr": 0.017329234580409095,
"mc2": 0.7194654727455502,
"mc2_stderr": 0.01461143462108703
},
"harness|arc:challenge|25": {
"acc": 0.6860068259385665,
"acc_stderr": 0.013562691224726293,
"acc_norm": 0.7158703071672355,
"acc_norm_stderr": 0.013179442447653886
},
"harness|hellaswag|10": {
"acc": 0.7046405098585939,
"acc_stderr": 0.004552718360513099,
"acc_norm": 0.8807010555666202,
"acc_norm_stderr": 0.0032347749806479515
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695235,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695235
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6666666666666666,
"acc_stderr": 0.04072314811876837,
"acc_norm": 0.6666666666666666,
"acc_norm_stderr": 0.04072314811876837
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.7039473684210527,
"acc_stderr": 0.03715062154998904,
"acc_norm": 0.7039473684210527,
"acc_norm_stderr": 0.03715062154998904
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.64,
"acc_stderr": 0.04824181513244218,
"acc_norm": 0.64,
"acc_norm_stderr": 0.04824181513244218
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.7018867924528301,
"acc_stderr": 0.028152837942493864,
"acc_norm": 0.7018867924528301,
"acc_norm_stderr": 0.028152837942493864
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7847222222222222,
"acc_stderr": 0.03437079344106135,
"acc_norm": 0.7847222222222222,
"acc_norm_stderr": 0.03437079344106135
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.47,
"acc_stderr": 0.050161355804659205,
"acc_norm": 0.47,
"acc_norm_stderr": 0.050161355804659205
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.56,
"acc_stderr": 0.04988876515698589,
"acc_norm": 0.56,
"acc_norm_stderr": 0.04988876515698589
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6763005780346821,
"acc_stderr": 0.0356760379963917,
"acc_norm": 0.6763005780346821,
"acc_norm_stderr": 0.0356760379963917
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.4411764705882353,
"acc_stderr": 0.049406356306056595,
"acc_norm": 0.4411764705882353,
"acc_norm_stderr": 0.049406356306056595
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.77,
"acc_stderr": 0.04229525846816506,
"acc_norm": 0.77,
"acc_norm_stderr": 0.04229525846816506
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5787234042553191,
"acc_stderr": 0.03227834510146268,
"acc_norm": 0.5787234042553191,
"acc_norm_stderr": 0.03227834510146268
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.4824561403508772,
"acc_stderr": 0.04700708033551038,
"acc_norm": 0.4824561403508772,
"acc_norm_stderr": 0.04700708033551038
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5241379310344828,
"acc_stderr": 0.0416180850350153,
"acc_norm": 0.5241379310344828,
"acc_norm_stderr": 0.0416180850350153
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.4126984126984127,
"acc_stderr": 0.02535574126305527,
"acc_norm": 0.4126984126984127,
"acc_norm_stderr": 0.02535574126305527
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.4523809523809524,
"acc_stderr": 0.04451807959055328,
"acc_norm": 0.4523809523809524,
"acc_norm_stderr": 0.04451807959055328
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.35,
"acc_stderr": 0.047937248544110196,
"acc_norm": 0.35,
"acc_norm_stderr": 0.047937248544110196
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7870967741935484,
"acc_stderr": 0.023287665127268542,
"acc_norm": 0.7870967741935484,
"acc_norm_stderr": 0.023287665127268542
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.4975369458128079,
"acc_stderr": 0.03517945038691063,
"acc_norm": 0.4975369458128079,
"acc_norm_stderr": 0.03517945038691063
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.7,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.7,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7696969696969697,
"acc_stderr": 0.0328766675860349,
"acc_norm": 0.7696969696969697,
"acc_norm_stderr": 0.0328766675860349
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.8080808080808081,
"acc_stderr": 0.028057791672989017,
"acc_norm": 0.8080808080808081,
"acc_norm_stderr": 0.028057791672989017
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.9067357512953368,
"acc_stderr": 0.02098685459328973,
"acc_norm": 0.9067357512953368,
"acc_norm_stderr": 0.02098685459328973
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6692307692307692,
"acc_stderr": 0.02385479568097112,
"acc_norm": 0.6692307692307692,
"acc_norm_stderr": 0.02385479568097112
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.32592592592592595,
"acc_stderr": 0.028578348365473082,
"acc_norm": 0.32592592592592595,
"acc_norm_stderr": 0.028578348365473082
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6764705882352942,
"acc_stderr": 0.030388353551886793,
"acc_norm": 0.6764705882352942,
"acc_norm_stderr": 0.030388353551886793
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.3841059602649007,
"acc_stderr": 0.03971301814719197,
"acc_norm": 0.3841059602649007,
"acc_norm_stderr": 0.03971301814719197
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8568807339449541,
"acc_stderr": 0.01501446249716859,
"acc_norm": 0.8568807339449541,
"acc_norm_stderr": 0.01501446249716859
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5231481481481481,
"acc_stderr": 0.03406315360711507,
"acc_norm": 0.5231481481481481,
"acc_norm_stderr": 0.03406315360711507
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8529411764705882,
"acc_stderr": 0.024857478080250447,
"acc_norm": 0.8529411764705882,
"acc_norm_stderr": 0.024857478080250447
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.8059071729957806,
"acc_stderr": 0.02574490253229092,
"acc_norm": 0.8059071729957806,
"acc_norm_stderr": 0.02574490253229092
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6860986547085202,
"acc_stderr": 0.031146796482972465,
"acc_norm": 0.6860986547085202,
"acc_norm_stderr": 0.031146796482972465
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.8015267175572519,
"acc_stderr": 0.03498149385462472,
"acc_norm": 0.8015267175572519,
"acc_norm_stderr": 0.03498149385462472
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7933884297520661,
"acc_stderr": 0.03695980128098824,
"acc_norm": 0.7933884297520661,
"acc_norm_stderr": 0.03695980128098824
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7685185185185185,
"acc_stderr": 0.04077494709252627,
"acc_norm": 0.7685185185185185,
"acc_norm_stderr": 0.04077494709252627
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7607361963190185,
"acc_stderr": 0.0335195387952127,
"acc_norm": 0.7607361963190185,
"acc_norm_stderr": 0.0335195387952127
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.42857142857142855,
"acc_stderr": 0.04697113923010212,
"acc_norm": 0.42857142857142855,
"acc_norm_stderr": 0.04697113923010212
},
"harness|hendrycksTest-management|5": {
"acc": 0.7766990291262136,
"acc_stderr": 0.04123553189891431,
"acc_norm": 0.7766990291262136,
"acc_norm_stderr": 0.04123553189891431
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8846153846153846,
"acc_stderr": 0.02093019318517933,
"acc_norm": 0.8846153846153846,
"acc_norm_stderr": 0.02093019318517933
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.71,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.71,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8263090676883781,
"acc_stderr": 0.01354741565866226,
"acc_norm": 0.8263090676883781,
"acc_norm_stderr": 0.01354741565866226
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7514450867052023,
"acc_stderr": 0.023267528432100174,
"acc_norm": 0.7514450867052023,
"acc_norm_stderr": 0.023267528432100174
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.4558659217877095,
"acc_stderr": 0.016657229424586313,
"acc_norm": 0.4558659217877095,
"acc_norm_stderr": 0.016657229424586313
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7352941176470589,
"acc_stderr": 0.025261691219729484,
"acc_norm": 0.7352941176470589,
"acc_norm_stderr": 0.025261691219729484
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7138263665594855,
"acc_stderr": 0.02567025924218893,
"acc_norm": 0.7138263665594855,
"acc_norm_stderr": 0.02567025924218893
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7530864197530864,
"acc_stderr": 0.02399350170904211,
"acc_norm": 0.7530864197530864,
"acc_norm_stderr": 0.02399350170904211
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.5,
"acc_stderr": 0.029827499313594685,
"acc_norm": 0.5,
"acc_norm_stderr": 0.029827499313594685
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.47522816166883963,
"acc_stderr": 0.012754553719781753,
"acc_norm": 0.47522816166883963,
"acc_norm_stderr": 0.012754553719781753
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6875,
"acc_stderr": 0.02815637344037142,
"acc_norm": 0.6875,
"acc_norm_stderr": 0.02815637344037142
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6781045751633987,
"acc_stderr": 0.018901015322093092,
"acc_norm": 0.6781045751633987,
"acc_norm_stderr": 0.018901015322093092
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6818181818181818,
"acc_stderr": 0.044612721759105085,
"acc_norm": 0.6818181818181818,
"acc_norm_stderr": 0.044612721759105085
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7428571428571429,
"acc_stderr": 0.02797982353874455,
"acc_norm": 0.7428571428571429,
"acc_norm_stderr": 0.02797982353874455
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8407960199004975,
"acc_stderr": 0.025870646766169136,
"acc_norm": 0.8407960199004975,
"acc_norm_stderr": 0.025870646766169136
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.85,
"acc_stderr": 0.0358870281282637,
"acc_norm": 0.85,
"acc_norm_stderr": 0.0358870281282637
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5542168674698795,
"acc_stderr": 0.03869543323472101,
"acc_norm": 0.5542168674698795,
"acc_norm_stderr": 0.03869543323472101
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.847953216374269,
"acc_stderr": 0.027539122889061456,
"acc_norm": 0.847953216374269,
"acc_norm_stderr": 0.027539122889061456
},
"harness|truthfulqa:mc|0": {
"mc1": 0.5703794369645043,
"mc1_stderr": 0.017329234580409095,
"mc2": 0.7194654727455502,
"mc2_stderr": 0.01461143462108703
},
"harness|winogrande|5": {
"acc": 0.8232044198895028,
"acc_stderr": 0.010721923287918742
},
"harness|gsm8k|5": {
"acc": 0.7012888551933283,
"acc_stderr": 0.012607137125693639
}
}
```
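The per-task scores above follow the lm-evaluation-harness JSON layout, with one entry per `harness|<task>|<n_shot>` key. As a hedged illustration of how such a results dict can be post-processed (the excerpt below copies a few values from the JSON above; the aggregation itself is our own sketch, not part of the harness output):

```python
# A small excerpt shaped like the results JSON above (values copied from it).
results = {
    "harness|hendrycksTest-astronomy|5": {"acc": 0.7039473684210527},
    "harness|hendrycksTest-college_mathematics|5": {"acc": 0.31},
    "harness|gsm8k|5": {"acc": 0.7012888551933283},
}

# Average "acc" over the MMLU (hendrycksTest) sub-tasks only,
# ignoring non-MMLU entries such as gsm8k.
mmlu_accs = [
    scores["acc"]
    for task, scores in results.items()
    if task.startswith("harness|hendrycksTest-")
]
mmlu_avg = sum(mmlu_accs) / len(mmlu_accs)
print(f"MMLU average over {len(mmlu_accs)} sub-tasks: {mmlu_avg:.4f}")
```

Run over the full results dict, the same loop would average all 57 MMLU sub-tasks; here it covers only the two excerpted entries.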
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed]

# infCapital/vietllama-tiny-envi
---
license: apache-2.0
task_categories:
- question-answering
language:
- vi
- en
---
+ Instruction dataset for fine-tuning
+ Contains the original datasets (lima, orca-mini, alpaca, alpaca finance, GPTeacher) and their Vietnamese translations
+ Suggested use case: fine-tuning Vietnamese LLMs
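For fine-tuning, each instruction/response pair is typically flattened into a single prompt string. A minimal sketch of such a formatter (the field names `instruction`, `input`, and `output` are assumptions based on the Alpaca-style sources listed above, not confirmed by this card):

```python
def format_example(ex: dict) -> str:
    """Flatten an Alpaca-style record into one training prompt.

    Field names (instruction/input/output) are assumed, not documented.
    """
    prompt = f"### Instruction:\n{ex['instruction']}\n"
    if ex.get("input"):  # the input field is optional in Alpaca-style data
        prompt += f"### Input:\n{ex['input']}\n"
    prompt += f"### Response:\n{ex['output']}"
    return prompt

# Hypothetical bilingual record, illustrating the en/vi pairing.
sample = {
    "instruction": "Dịch sang tiếng Anh.",  # "Translate to English."
    "input": "Xin chào",
    "output": "Hello",
}
print(format_example(sample))
```

The formatted strings can then be tokenized and fed to a causal-LM trainer as usual.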