waleedfarooq51/my_dataset | ---
language:
- en
--- |
CyberHarem/friedrich_eckoldt_azurlane | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of friedrich_eckoldt/Z16 (Azur Lane)
This dataset contains 11 images of friedrich_eckoldt/Z16 (Azur Lane), together with their tags.
The core tags of this character are `black_hair, multicolored_hair, red_eyes, streaked_hair, bangs, breasts, long_hair, white_hair, horns, x-shaped_pupils, symbol-shaped_pupils, two-tone_hair, hair_between_eyes, v-shaped_eyebrows`, which are pruned in this dataset.
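As an illustrative sketch (this helper is ours, not part of the crawling pipeline), "pruning" the core tags simply means dropping the character-defining tags from each image's tag list, since tags shared by nearly every image carry no per-image signal:

```python
# Core tags of this character, as listed above; they appear on (almost) every
# image, so they are removed from the per-image tag lists in this dataset.
CORE_TAGS = {
    'black_hair', 'multicolored_hair', 'red_eyes', 'streaked_hair', 'bangs',
    'breasts', 'long_hair', 'white_hair', 'horns', 'x-shaped_pupils',
    'symbol-shaped_pupils', 'two-tone_hair', 'hair_between_eyes',
    'v-shaped_eyebrows',
}

def prune_core_tags(tags):
    """Return the tag list with the character's core tags removed, order preserved."""
    return [t for t in tags if t not in CORE_TAGS]
```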
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:----------|:----------------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 11 | 15.98 MiB | [Download](https://huggingface.co/datasets/CyberHarem/friedrich_eckoldt_azurlane/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 11 | 8.13 MiB | [Download](https://huggingface.co/datasets/CyberHarem/friedrich_eckoldt_azurlane/resolve/main/dataset-800.zip) | IMG+TXT | Dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 26 | 17.25 MiB | [Download](https://huggingface.co/datasets/CyberHarem/friedrich_eckoldt_azurlane/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 11 | 12.94 MiB | [Download](https://huggingface.co/datasets/CyberHarem/friedrich_eckoldt_azurlane/resolve/main/dataset-1200.zip) | IMG+TXT | Dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 26 | 26.02 MiB | [Download](https://huggingface.co/datasets/CyberHarem/friedrich_eckoldt_azurlane/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
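In the IMG+TXT packages, each image is assumed to ship with a same-stem `.txt` file holding its comma-separated tags. A minimal sketch (the helper name and the extension list are our assumptions, not part of the packaging spec) of pairing them after extraction:

```python
import os

def pair_image_tag_files(filenames):
    """Group an extracted IMG+TXT package's files into (image, tag-file) pairs.

    An image such as '7.png' is matched with its tag file '7.txt'; files
    without a matching counterpart are ignored.
    """
    stems = {}
    for name in filenames:
        stem, ext = os.path.splitext(name)
        stems.setdefault(stem, {})[ext.lower()] = name
    return [
        (files[img_ext], files['.txt'])
        for stem, files in sorted(stems.items())
        for img_ext in ('.png', '.jpg', '.jpeg', '.webp')
        if img_ext in files and '.txt' in files
    ]
```

In practice you would feed this the result of `os.listdir()` on the extraction directory.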
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need it, run the following code:
```python
import os
import zipfile

from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource

# download the raw archive file
zip_file = hf_hub_download(
    repo_id='CyberHarem/friedrich_eckoldt_azurlane',
    repo_type='dataset',
    filename='dataset-raw.zip',
)

# extract the files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
    zf.extractall(dataset_dir)

# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
    print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering results; some recurring outfits may be mined from them.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:-------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 11 |  |  |  |  |  | 1girl, looking_at_viewer, solo, navel, black_jacket, bare_shoulders, crop_top, long_sleeves, midriff, stomach, black_thighhighs, iron_cross, off_shoulder, standing, thigh_strap, white_panties, cowboy_shot, open_mouth, red_gloves, simple_background, skindentation, thighs, white_shirt, :d, black_footwear, black_gloves, full_body, open_jacket, sharp_teeth, white_background |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | looking_at_viewer | solo | navel | black_jacket | bare_shoulders | crop_top | long_sleeves | midriff | stomach | black_thighhighs | iron_cross | off_shoulder | standing | thigh_strap | white_panties | cowboy_shot | open_mouth | red_gloves | simple_background | skindentation | thighs | white_shirt | :d | black_footwear | black_gloves | full_body | open_jacket | sharp_teeth | white_background |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:--------------------|:-------|:--------|:---------------|:-----------------|:-----------|:---------------|:----------|:----------|:-------------------|:-------------|:---------------|:-----------|:--------------|:----------------|:--------------|:-------------|:-------------|:--------------------|:----------------|:---------|:--------------|:-----|:-----------------|:---------------|:------------|:--------------|:--------------|:-------------------|
| 0 | 11 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X |
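The raw-text tag cell for each cluster is just a comma-separated string; a minimal helper (the function name is ours, not part of the dataset tooling) to turn it back into a tag list:

```python
def parse_cluster_tags(tag_cell):
    """Split a cluster's raw comma-separated tag cell into a clean tag list."""
    return [t.strip() for t in tag_cell.split(',') if t.strip()]
```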
open-llm-leaderboard/details_juhwanlee__gemma-7B-alpaca-case-3-3 | ---
pretty_name: Evaluation run of juhwanlee/gemma-7B-alpaca-case-3-3
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [juhwanlee/gemma-7B-alpaca-case-3-3](https://huggingface.co/juhwanlee/gemma-7B-alpaca-case-3-3)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configuration, each one coresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_juhwanlee__gemma-7B-alpaca-case-3-3\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-03-27T18:28:45.461246](https://huggingface.co/datasets/open-llm-leaderboard/details_juhwanlee__gemma-7B-alpaca-case-3-3/blob/main/results_2024-03-27T18-28-45.461246.json) (note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.2468629397241803,\n\
\ \"acc_stderr\": 0.03046150741368878,\n \"acc_norm\": 0.24759994193885335,\n\
\ \"acc_norm_stderr\": 0.03127181998068886,\n \"mc1\": 0.2215422276621787,\n\
\ \"mc1_stderr\": 0.01453786760130114,\n \"mc2\": NaN,\n \"\
mc2_stderr\": NaN\n },\n \"harness|arc:challenge|25\": {\n \"acc\"\
: 0.2030716723549488,\n \"acc_stderr\": 0.011755899303705582,\n \"\
acc_norm\": 0.25,\n \"acc_norm_stderr\": 0.012653835621466646\n },\n \
\ \"harness|hellaswag|10\": {\n \"acc\": 0.2550288787094204,\n \"\
acc_stderr\": 0.0043498663760689815,\n \"acc_norm\": 0.2621987651862179,\n\
\ \"acc_norm_stderr\": 0.00438931274801215\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \
\ \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.3333333333333333,\n\
\ \"acc_stderr\": 0.04072314811876837,\n \"acc_norm\": 0.3333333333333333,\n\
\ \"acc_norm_stderr\": 0.04072314811876837\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.3026315789473684,\n \"acc_stderr\": 0.037385206761196665,\n\
\ \"acc_norm\": 0.3026315789473684,\n \"acc_norm_stderr\": 0.037385206761196665\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.23,\n\
\ \"acc_stderr\": 0.04229525846816506,\n \"acc_norm\": 0.23,\n \
\ \"acc_norm_stderr\": 0.04229525846816506\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.2188679245283019,\n \"acc_stderr\": 0.02544786382510861,\n\
\ \"acc_norm\": 0.2188679245283019,\n \"acc_norm_stderr\": 0.02544786382510861\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.2569444444444444,\n\
\ \"acc_stderr\": 0.03653946969442099,\n \"acc_norm\": 0.2569444444444444,\n\
\ \"acc_norm_stderr\": 0.03653946969442099\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.21,\n \"acc_stderr\": 0.040936018074033256,\n \
\ \"acc_norm\": 0.21,\n \"acc_norm_stderr\": 0.040936018074033256\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"\
acc\": 0.16,\n \"acc_stderr\": 0.036845294917747094,\n \"acc_norm\"\
: 0.16,\n \"acc_norm_stderr\": 0.036845294917747094\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.26,\n \"acc_stderr\": 0.044084400227680794,\n \
\ \"acc_norm\": 0.26,\n \"acc_norm_stderr\": 0.044084400227680794\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.24855491329479767,\n\
\ \"acc_stderr\": 0.03295304696818318,\n \"acc_norm\": 0.24855491329479767,\n\
\ \"acc_norm_stderr\": 0.03295304696818318\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.18627450980392157,\n \"acc_stderr\": 0.038739587141493524,\n\
\ \"acc_norm\": 0.18627450980392157,\n \"acc_norm_stderr\": 0.038739587141493524\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.3,\n\
\ \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.20425531914893616,\n \"acc_stderr\": 0.026355158413349424,\n\
\ \"acc_norm\": 0.20425531914893616,\n \"acc_norm_stderr\": 0.026355158413349424\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.21929824561403508,\n\
\ \"acc_stderr\": 0.03892431106518752,\n \"acc_norm\": 0.21929824561403508,\n\
\ \"acc_norm_stderr\": 0.03892431106518752\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.2206896551724138,\n \"acc_stderr\": 0.03455930201924811,\n\
\ \"acc_norm\": 0.2206896551724138,\n \"acc_norm_stderr\": 0.03455930201924811\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.25925925925925924,\n \"acc_stderr\": 0.022569897074918424,\n \"\
acc_norm\": 0.25925925925925924,\n \"acc_norm_stderr\": 0.022569897074918424\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.15079365079365079,\n\
\ \"acc_stderr\": 0.03200686497287392,\n \"acc_norm\": 0.15079365079365079,\n\
\ \"acc_norm_stderr\": 0.03200686497287392\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \
\ \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.25161290322580643,\n\
\ \"acc_stderr\": 0.024685979286239956,\n \"acc_norm\": 0.25161290322580643,\n\
\ \"acc_norm_stderr\": 0.024685979286239956\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.2955665024630542,\n \"acc_stderr\": 0.03210494433751458,\n\
\ \"acc_norm\": 0.2955665024630542,\n \"acc_norm_stderr\": 0.03210494433751458\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.29,\n \"acc_stderr\": 0.045604802157206845,\n \"acc_norm\"\
: 0.29,\n \"acc_norm_stderr\": 0.045604802157206845\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.28484848484848485,\n \"acc_stderr\": 0.035243908445117836,\n\
\ \"acc_norm\": 0.28484848484848485,\n \"acc_norm_stderr\": 0.035243908445117836\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.25252525252525254,\n \"acc_stderr\": 0.030954055470365897,\n \"\
acc_norm\": 0.25252525252525254,\n \"acc_norm_stderr\": 0.030954055470365897\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.22797927461139897,\n \"acc_stderr\": 0.030276909945178256,\n\
\ \"acc_norm\": 0.22797927461139897,\n \"acc_norm_stderr\": 0.030276909945178256\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.2128205128205128,\n \"acc_stderr\": 0.020752423722128013,\n\
\ \"acc_norm\": 0.2128205128205128,\n \"acc_norm_stderr\": 0.020752423722128013\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.26296296296296295,\n \"acc_stderr\": 0.026842057873833706,\n \
\ \"acc_norm\": 0.26296296296296295,\n \"acc_norm_stderr\": 0.026842057873833706\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.21008403361344538,\n \"acc_stderr\": 0.026461398717471874,\n\
\ \"acc_norm\": 0.21008403361344538,\n \"acc_norm_stderr\": 0.026461398717471874\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.24503311258278146,\n \"acc_stderr\": 0.035118075718047245,\n \"\
acc_norm\": 0.24503311258278146,\n \"acc_norm_stderr\": 0.035118075718047245\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.22201834862385322,\n \"acc_stderr\": 0.01781884956479663,\n \"\
acc_norm\": 0.22201834862385322,\n \"acc_norm_stderr\": 0.01781884956479663\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.17592592592592593,\n \"acc_stderr\": 0.02596742095825853,\n \"\
acc_norm\": 0.17592592592592593,\n \"acc_norm_stderr\": 0.02596742095825853\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.25980392156862747,\n \"acc_stderr\": 0.030778554678693264,\n \"\
acc_norm\": 0.25980392156862747,\n \"acc_norm_stderr\": 0.030778554678693264\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.26582278481012656,\n \"acc_stderr\": 0.028756799629658335,\n \
\ \"acc_norm\": 0.26582278481012656,\n \"acc_norm_stderr\": 0.028756799629658335\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.20179372197309417,\n\
\ \"acc_stderr\": 0.026936111912802273,\n \"acc_norm\": 0.20179372197309417,\n\
\ \"acc_norm_stderr\": 0.026936111912802273\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.22900763358778625,\n \"acc_stderr\": 0.036853466317118506,\n\
\ \"acc_norm\": 0.22900763358778625,\n \"acc_norm_stderr\": 0.036853466317118506\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.371900826446281,\n \"acc_stderr\": 0.044120158066245044,\n \"\
acc_norm\": 0.371900826446281,\n \"acc_norm_stderr\": 0.044120158066245044\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.23148148148148148,\n\
\ \"acc_stderr\": 0.04077494709252626,\n \"acc_norm\": 0.23148148148148148,\n\
\ \"acc_norm_stderr\": 0.04077494709252626\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.3006134969325153,\n \"acc_stderr\": 0.03602511318806771,\n\
\ \"acc_norm\": 0.3006134969325153,\n \"acc_norm_stderr\": 0.03602511318806771\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.22321428571428573,\n\
\ \"acc_stderr\": 0.039523019677025116,\n \"acc_norm\": 0.22321428571428573,\n\
\ \"acc_norm_stderr\": 0.039523019677025116\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.1941747572815534,\n \"acc_stderr\": 0.039166677628225836,\n\
\ \"acc_norm\": 0.1941747572815534,\n \"acc_norm_stderr\": 0.039166677628225836\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.2564102564102564,\n\
\ \"acc_stderr\": 0.02860595370200425,\n \"acc_norm\": 0.2564102564102564,\n\
\ \"acc_norm_stderr\": 0.02860595370200425\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.2,\n \"acc_stderr\": 0.040201512610368445,\n \
\ \"acc_norm\": 0.2,\n \"acc_norm_stderr\": 0.040201512610368445\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.2707535121328225,\n\
\ \"acc_stderr\": 0.015889888362560486,\n \"acc_norm\": 0.2707535121328225,\n\
\ \"acc_norm_stderr\": 0.015889888362560486\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.29190751445086704,\n \"acc_stderr\": 0.02447699407624734,\n\
\ \"acc_norm\": 0.29190751445086704,\n \"acc_norm_stderr\": 0.02447699407624734\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.24692737430167597,\n\
\ \"acc_stderr\": 0.014422292204808835,\n \"acc_norm\": 0.24692737430167597,\n\
\ \"acc_norm_stderr\": 0.014422292204808835\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.25163398692810457,\n \"acc_stderr\": 0.024848018263875195,\n\
\ \"acc_norm\": 0.25163398692810457,\n \"acc_norm_stderr\": 0.024848018263875195\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.2990353697749196,\n\
\ \"acc_stderr\": 0.026003301117885135,\n \"acc_norm\": 0.2990353697749196,\n\
\ \"acc_norm_stderr\": 0.026003301117885135\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.2932098765432099,\n \"acc_stderr\": 0.02532988817190092,\n\
\ \"acc_norm\": 0.2932098765432099,\n \"acc_norm_stderr\": 0.02532988817190092\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.24822695035460993,\n \"acc_stderr\": 0.0257700156442904,\n \
\ \"acc_norm\": 0.24822695035460993,\n \"acc_norm_stderr\": 0.0257700156442904\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.27053455019556716,\n\
\ \"acc_stderr\": 0.011345996743539264,\n \"acc_norm\": 0.27053455019556716,\n\
\ \"acc_norm_stderr\": 0.011345996743539264\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.16544117647058823,\n \"acc_stderr\": 0.022571771025494767,\n\
\ \"acc_norm\": 0.16544117647058823,\n \"acc_norm_stderr\": 0.022571771025494767\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.2761437908496732,\n \"acc_stderr\": 0.018087276935663137,\n \
\ \"acc_norm\": 0.2761437908496732,\n \"acc_norm_stderr\": 0.018087276935663137\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.20909090909090908,\n\
\ \"acc_stderr\": 0.038950910157241364,\n \"acc_norm\": 0.20909090909090908,\n\
\ \"acc_norm_stderr\": 0.038950910157241364\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.24081632653061225,\n \"acc_stderr\": 0.027372942201788163,\n\
\ \"acc_norm\": 0.24081632653061225,\n \"acc_norm_stderr\": 0.027372942201788163\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.24875621890547264,\n\
\ \"acc_stderr\": 0.030567675938916707,\n \"acc_norm\": 0.24875621890547264,\n\
\ \"acc_norm_stderr\": 0.030567675938916707\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.25,\n \"acc_stderr\": 0.04351941398892446,\n \
\ \"acc_norm\": 0.25,\n \"acc_norm_stderr\": 0.04351941398892446\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.20481927710843373,\n\
\ \"acc_stderr\": 0.03141784291663926,\n \"acc_norm\": 0.20481927710843373,\n\
\ \"acc_norm_stderr\": 0.03141784291663926\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.29239766081871343,\n \"acc_stderr\": 0.034886477134579215,\n\
\ \"acc_norm\": 0.29239766081871343,\n \"acc_norm_stderr\": 0.034886477134579215\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.2215422276621787,\n\
\ \"mc1_stderr\": 0.01453786760130114,\n \"mc2\": NaN,\n \"\
mc2_stderr\": NaN\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.5043409629044988,\n\
\ \"acc_stderr\": 0.014051956064076906\n },\n \"harness|gsm8k|5\":\
\ {\n \"acc\": 0.0,\n \"acc_stderr\": 0.0\n }\n}\n```"
repo_url: https://huggingface.co/juhwanlee/gemma-7B-alpaca-case-3-3
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_03_27T18_28_45.461246
path:
- '**/details_harness|arc:challenge|25_2024-03-27T18-28-45.461246.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-03-27T18-28-45.461246.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_03_27T18_28_45.461246
path:
- '**/details_harness|gsm8k|5_2024-03-27T18-28-45.461246.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-03-27T18-28-45.461246.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_03_27T18_28_45.461246
path:
- '**/details_harness|hellaswag|10_2024-03-27T18-28-45.461246.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-03-27T18-28-45.461246.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_03_27T18_28_45.461246
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-27T18-28-45.461246.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-27T18-28-45.461246.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-27T18-28-45.461246.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-27T18-28-45.461246.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-27T18-28-45.461246.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-27T18-28-45.461246.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-27T18-28-45.461246.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-27T18-28-45.461246.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-27T18-28-45.461246.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-27T18-28-45.461246.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-27T18-28-45.461246.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-27T18-28-45.461246.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-27T18-28-45.461246.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-27T18-28-45.461246.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-27T18-28-45.461246.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-27T18-28-45.461246.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-27T18-28-45.461246.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-27T18-28-45.461246.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-27T18-28-45.461246.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-27T18-28-45.461246.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-27T18-28-45.461246.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-27T18-28-45.461246.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-27T18-28-45.461246.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-27T18-28-45.461246.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-27T18-28-45.461246.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-27T18-28-45.461246.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-27T18-28-45.461246.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-27T18-28-45.461246.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-27T18-28-45.461246.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-27T18-28-45.461246.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-27T18-28-45.461246.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-27T18-28-45.461246.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-27T18-28-45.461246.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-27T18-28-45.461246.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-27T18-28-45.461246.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-27T18-28-45.461246.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-27T18-28-45.461246.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-27T18-28-45.461246.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-27T18-28-45.461246.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-27T18-28-45.461246.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-27T18-28-45.461246.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-27T18-28-45.461246.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-27T18-28-45.461246.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-27T18-28-45.461246.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-27T18-28-45.461246.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-27T18-28-45.461246.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-27T18-28-45.461246.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-27T18-28-45.461246.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-27T18-28-45.461246.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-27T18-28-45.461246.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-27T18-28-45.461246.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-27T18-28-45.461246.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-27T18-28-45.461246.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-27T18-28-45.461246.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-27T18-28-45.461246.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-27T18-28-45.461246.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-27T18-28-45.461246.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-27T18-28-45.461246.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-27T18-28-45.461246.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-27T18-28-45.461246.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-27T18-28-45.461246.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-27T18-28-45.461246.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-27T18-28-45.461246.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-27T18-28-45.461246.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-27T18-28-45.461246.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-27T18-28-45.461246.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-27T18-28-45.461246.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-27T18-28-45.461246.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-27T18-28-45.461246.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-27T18-28-45.461246.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-27T18-28-45.461246.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-27T18-28-45.461246.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-27T18-28-45.461246.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-27T18-28-45.461246.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-27T18-28-45.461246.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-27T18-28-45.461246.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-27T18-28-45.461246.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-27T18-28-45.461246.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-27T18-28-45.461246.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-27T18-28-45.461246.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-27T18-28-45.461246.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-27T18-28-45.461246.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-27T18-28-45.461246.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-27T18-28-45.461246.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-27T18-28-45.461246.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-27T18-28-45.461246.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-27T18-28-45.461246.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-27T18-28-45.461246.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-27T18-28-45.461246.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-27T18-28-45.461246.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-27T18-28-45.461246.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-27T18-28-45.461246.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-27T18-28-45.461246.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-27T18-28-45.461246.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-27T18-28-45.461246.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-27T18-28-45.461246.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-27T18-28-45.461246.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-27T18-28-45.461246.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-27T18-28-45.461246.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-27T18-28-45.461246.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-27T18-28-45.461246.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-27T18-28-45.461246.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-27T18-28-45.461246.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-27T18-28-45.461246.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-27T18-28-45.461246.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-27T18-28-45.461246.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-27T18-28-45.461246.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-27T18-28-45.461246.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-27T18-28-45.461246.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-27T18-28-45.461246.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-27T18-28-45.461246.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-27T18-28-45.461246.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-27T18-28-45.461246.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-27T18-28-45.461246.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_03_27T18_28_45.461246
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-27T18-28-45.461246.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-27T18-28-45.461246.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_03_27T18_28_45.461246
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-27T18-28-45.461246.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-27T18-28-45.461246.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_03_27T18_28_45.461246
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-27T18-28-45.461246.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-27T18-28-45.461246.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_03_27T18_28_45.461246
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-27T18-28-45.461246.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-27T18-28-45.461246.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_03_27T18_28_45.461246
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-27T18-28-45.461246.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-27T18-28-45.461246.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_03_27T18_28_45.461246
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-27T18-28-45.461246.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-27T18-28-45.461246.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_03_27T18_28_45.461246
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-27T18-28-45.461246.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-27T18-28-45.461246.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_03_27T18_28_45.461246
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-27T18-28-45.461246.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-27T18-28-45.461246.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_03_27T18_28_45.461246
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-27T18-28-45.461246.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-27T18-28-45.461246.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_03_27T18_28_45.461246
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-27T18-28-45.461246.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-27T18-28-45.461246.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_03_27T18_28_45.461246
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-27T18-28-45.461246.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-27T18-28-45.461246.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_03_27T18_28_45.461246
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-27T18-28-45.461246.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-27T18-28-45.461246.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_03_27T18_28_45.461246
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-27T18-28-45.461246.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-27T18-28-45.461246.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_03_27T18_28_45.461246
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-27T18-28-45.461246.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-27T18-28-45.461246.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_03_27T18_28_45.461246
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-27T18-28-45.461246.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-27T18-28-45.461246.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_03_27T18_28_45.461246
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-27T18-28-45.461246.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-27T18-28-45.461246.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_03_27T18_28_45.461246
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-27T18-28-45.461246.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-27T18-28-45.461246.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_03_27T18_28_45.461246
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-27T18-28-45.461246.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-27T18-28-45.461246.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_03_27T18_28_45.461246
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-27T18-28-45.461246.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-27T18-28-45.461246.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_03_27T18_28_45.461246
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-27T18-28-45.461246.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-27T18-28-45.461246.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_03_27T18_28_45.461246
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-27T18-28-45.461246.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-27T18-28-45.461246.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_03_27T18_28_45.461246
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-27T18-28-45.461246.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-27T18-28-45.461246.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_03_27T18_28_45.461246
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-27T18-28-45.461246.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-27T18-28-45.461246.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_03_27T18_28_45.461246
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-27T18-28-45.461246.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-27T18-28-45.461246.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_03_27T18_28_45.461246
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-27T18-28-45.461246.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-27T18-28-45.461246.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_03_27T18_28_45.461246
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-27T18-28-45.461246.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-27T18-28-45.461246.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_03_27T18_28_45.461246
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-27T18-28-45.461246.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-27T18-28-45.461246.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_03_27T18_28_45.461246
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-27T18-28-45.461246.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-27T18-28-45.461246.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_03_27T18_28_45.461246
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-27T18-28-45.461246.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-27T18-28-45.461246.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_03_27T18_28_45.461246
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-27T18-28-45.461246.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-27T18-28-45.461246.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_03_27T18_28_45.461246
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-27T18-28-45.461246.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-27T18-28-45.461246.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_03_27T18_28_45.461246
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-27T18-28-45.461246.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-27T18-28-45.461246.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_03_27T18_28_45.461246
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-27T18-28-45.461246.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-27T18-28-45.461246.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_03_27T18_28_45.461246
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-27T18-28-45.461246.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-27T18-28-45.461246.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_03_27T18_28_45.461246
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-27T18-28-45.461246.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-27T18-28-45.461246.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_03_27T18_28_45.461246
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-27T18-28-45.461246.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-27T18-28-45.461246.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_03_27T18_28_45.461246
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-27T18-28-45.461246.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-27T18-28-45.461246.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_03_27T18_28_45.461246
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-27T18-28-45.461246.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-27T18-28-45.461246.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_03_27T18_28_45.461246
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-27T18-28-45.461246.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-27T18-28-45.461246.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_03_27T18_28_45.461246
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-27T18-28-45.461246.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-27T18-28-45.461246.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_03_27T18_28_45.461246
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-27T18-28-45.461246.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-27T18-28-45.461246.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_03_27T18_28_45.461246
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-27T18-28-45.461246.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-27T18-28-45.461246.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_03_27T18_28_45.461246
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-27T18-28-45.461246.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-27T18-28-45.461246.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_03_27T18_28_45.461246
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-27T18-28-45.461246.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-27T18-28-45.461246.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_03_27T18_28_45.461246
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-27T18-28-45.461246.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-27T18-28-45.461246.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_03_27T18_28_45.461246
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-27T18-28-45.461246.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-27T18-28-45.461246.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_03_27T18_28_45.461246
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-27T18-28-45.461246.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-27T18-28-45.461246.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_03_27T18_28_45.461246
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-27T18-28-45.461246.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-27T18-28-45.461246.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_03_27T18_28_45.461246
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-27T18-28-45.461246.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-27T18-28-45.461246.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_03_27T18_28_45.461246
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-27T18-28-45.461246.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-27T18-28-45.461246.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_03_27T18_28_45.461246
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-27T18-28-45.461246.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-27T18-28-45.461246.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_03_27T18_28_45.461246
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-27T18-28-45.461246.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-27T18-28-45.461246.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_03_27T18_28_45.461246
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-27T18-28-45.461246.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-27T18-28-45.461246.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_03_27T18_28_45.461246
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-27T18-28-45.461246.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-27T18-28-45.461246.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_03_27T18_28_45.461246
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-27T18-28-45.461246.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-27T18-28-45.461246.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_03_27T18_28_45.461246
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-27T18-28-45.461246.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-27T18-28-45.461246.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_03_27T18_28_45.461246
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-27T18-28-45.461246.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-27T18-28-45.461246.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_03_27T18_28_45.461246
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-27T18-28-45.461246.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-27T18-28-45.461246.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_03_27T18_28_45.461246
path:
- '**/details_harness|winogrande|5_2024-03-27T18-28-45.461246.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-03-27T18-28-45.461246.parquet'
- config_name: results
data_files:
- split: 2024_03_27T18_28_45.461246
path:
- results_2024-03-27T18-28-45.461246.parquet
- split: latest
path:
- results_2024-03-27T18-28-45.461246.parquet
---
# Dataset Card for Evaluation run of juhwanlee/gemma-7B-alpaca-case-3-3
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [juhwanlee/gemma-7B-alpaca-case-3-3](https://huggingface.co/juhwanlee/gemma-7B-alpaca-case-3-3) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the most recent results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_juhwanlee__gemma-7B-alpaca-case-3-3",
"harness_winogrande_5",
	split="latest")
```
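As the YAML above shows, each per-run split name is derived from the timestamp embedded in the parquet file names by replacing hyphens with underscores (e.g. file timestamp `2024-03-27T18-28-45.461246` becomes split `2024_03_27T18_28_45.461246`). A minimal helper making that convention explicit (the function name is my own, not part of the `datasets` API):

```python
def timestamp_to_split(ts: str) -> str:
    """Convert a run timestamp as it appears in parquet file names
    (e.g. '2024-03-27T18-28-45.461246') into the corresponding
    split name (e.g. '2024_03_27T18_28_45.461246')."""
    return ts.replace("-", "_")

print(timestamp_to_split("2024-03-27T18-28-45.461246"))
# 2024_03_27T18_28_45.461246
```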
## Latest results
These are the [latest results from run 2024-03-27T18:28:45.461246](https://huggingface.co/datasets/open-llm-leaderboard/details_juhwanlee__gemma-7B-alpaca-case-3-3/blob/main/results_2024-03-27T18-28-45.461246.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each of them in its own results file and in the "latest" split of each eval):
```python
{
"all": {
"acc": 0.2468629397241803,
"acc_stderr": 0.03046150741368878,
"acc_norm": 0.24759994193885335,
"acc_norm_stderr": 0.03127181998068886,
"mc1": 0.2215422276621787,
"mc1_stderr": 0.01453786760130114,
"mc2": NaN,
"mc2_stderr": NaN
},
"harness|arc:challenge|25": {
"acc": 0.2030716723549488,
"acc_stderr": 0.011755899303705582,
"acc_norm": 0.25,
"acc_norm_stderr": 0.012653835621466646
},
"harness|hellaswag|10": {
"acc": 0.2550288787094204,
"acc_stderr": 0.0043498663760689815,
"acc_norm": 0.2621987651862179,
"acc_norm_stderr": 0.00438931274801215
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.3333333333333333,
"acc_stderr": 0.04072314811876837,
"acc_norm": 0.3333333333333333,
"acc_norm_stderr": 0.04072314811876837
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.3026315789473684,
"acc_stderr": 0.037385206761196665,
"acc_norm": 0.3026315789473684,
"acc_norm_stderr": 0.037385206761196665
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.23,
"acc_stderr": 0.04229525846816506,
"acc_norm": 0.23,
"acc_norm_stderr": 0.04229525846816506
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.2188679245283019,
"acc_stderr": 0.02544786382510861,
"acc_norm": 0.2188679245283019,
"acc_norm_stderr": 0.02544786382510861
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.2569444444444444,
"acc_stderr": 0.03653946969442099,
"acc_norm": 0.2569444444444444,
"acc_norm_stderr": 0.03653946969442099
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.21,
"acc_stderr": 0.040936018074033256,
"acc_norm": 0.21,
"acc_norm_stderr": 0.040936018074033256
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.16,
"acc_stderr": 0.036845294917747094,
"acc_norm": 0.16,
"acc_norm_stderr": 0.036845294917747094
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.26,
"acc_stderr": 0.044084400227680794,
"acc_norm": 0.26,
"acc_norm_stderr": 0.044084400227680794
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.24855491329479767,
"acc_stderr": 0.03295304696818318,
"acc_norm": 0.24855491329479767,
"acc_norm_stderr": 0.03295304696818318
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.18627450980392157,
"acc_stderr": 0.038739587141493524,
"acc_norm": 0.18627450980392157,
"acc_norm_stderr": 0.038739587141493524
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.20425531914893616,
"acc_stderr": 0.026355158413349424,
"acc_norm": 0.20425531914893616,
"acc_norm_stderr": 0.026355158413349424
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.21929824561403508,
"acc_stderr": 0.03892431106518752,
"acc_norm": 0.21929824561403508,
"acc_norm_stderr": 0.03892431106518752
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.2206896551724138,
"acc_stderr": 0.03455930201924811,
"acc_norm": 0.2206896551724138,
"acc_norm_stderr": 0.03455930201924811
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.25925925925925924,
"acc_stderr": 0.022569897074918424,
"acc_norm": 0.25925925925925924,
"acc_norm_stderr": 0.022569897074918424
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.15079365079365079,
"acc_stderr": 0.03200686497287392,
"acc_norm": 0.15079365079365079,
"acc_norm_stderr": 0.03200686497287392
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.25161290322580643,
"acc_stderr": 0.024685979286239956,
"acc_norm": 0.25161290322580643,
"acc_norm_stderr": 0.024685979286239956
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.2955665024630542,
"acc_stderr": 0.03210494433751458,
"acc_norm": 0.2955665024630542,
"acc_norm_stderr": 0.03210494433751458
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.29,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.29,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.28484848484848485,
"acc_stderr": 0.035243908445117836,
"acc_norm": 0.28484848484848485,
"acc_norm_stderr": 0.035243908445117836
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.25252525252525254,
"acc_stderr": 0.030954055470365897,
"acc_norm": 0.25252525252525254,
"acc_norm_stderr": 0.030954055470365897
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.22797927461139897,
"acc_stderr": 0.030276909945178256,
"acc_norm": 0.22797927461139897,
"acc_norm_stderr": 0.030276909945178256
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.2128205128205128,
"acc_stderr": 0.020752423722128013,
"acc_norm": 0.2128205128205128,
"acc_norm_stderr": 0.020752423722128013
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.26296296296296295,
"acc_stderr": 0.026842057873833706,
"acc_norm": 0.26296296296296295,
"acc_norm_stderr": 0.026842057873833706
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.21008403361344538,
"acc_stderr": 0.026461398717471874,
"acc_norm": 0.21008403361344538,
"acc_norm_stderr": 0.026461398717471874
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.24503311258278146,
"acc_stderr": 0.035118075718047245,
"acc_norm": 0.24503311258278146,
"acc_norm_stderr": 0.035118075718047245
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.22201834862385322,
"acc_stderr": 0.01781884956479663,
"acc_norm": 0.22201834862385322,
"acc_norm_stderr": 0.01781884956479663
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.17592592592592593,
"acc_stderr": 0.02596742095825853,
"acc_norm": 0.17592592592592593,
"acc_norm_stderr": 0.02596742095825853
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.25980392156862747,
"acc_stderr": 0.030778554678693264,
"acc_norm": 0.25980392156862747,
"acc_norm_stderr": 0.030778554678693264
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.26582278481012656,
"acc_stderr": 0.028756799629658335,
"acc_norm": 0.26582278481012656,
"acc_norm_stderr": 0.028756799629658335
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.20179372197309417,
"acc_stderr": 0.026936111912802273,
"acc_norm": 0.20179372197309417,
"acc_norm_stderr": 0.026936111912802273
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.22900763358778625,
"acc_stderr": 0.036853466317118506,
"acc_norm": 0.22900763358778625,
"acc_norm_stderr": 0.036853466317118506
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.371900826446281,
"acc_stderr": 0.044120158066245044,
"acc_norm": 0.371900826446281,
"acc_norm_stderr": 0.044120158066245044
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.23148148148148148,
"acc_stderr": 0.04077494709252626,
"acc_norm": 0.23148148148148148,
"acc_norm_stderr": 0.04077494709252626
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.3006134969325153,
"acc_stderr": 0.03602511318806771,
"acc_norm": 0.3006134969325153,
"acc_norm_stderr": 0.03602511318806771
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.22321428571428573,
"acc_stderr": 0.039523019677025116,
"acc_norm": 0.22321428571428573,
"acc_norm_stderr": 0.039523019677025116
},
"harness|hendrycksTest-management|5": {
"acc": 0.1941747572815534,
"acc_stderr": 0.039166677628225836,
"acc_norm": 0.1941747572815534,
"acc_norm_stderr": 0.039166677628225836
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.2564102564102564,
"acc_stderr": 0.02860595370200425,
"acc_norm": 0.2564102564102564,
"acc_norm_stderr": 0.02860595370200425
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.2,
"acc_stderr": 0.040201512610368445,
"acc_norm": 0.2,
"acc_norm_stderr": 0.040201512610368445
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.2707535121328225,
"acc_stderr": 0.015889888362560486,
"acc_norm": 0.2707535121328225,
"acc_norm_stderr": 0.015889888362560486
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.29190751445086704,
"acc_stderr": 0.02447699407624734,
"acc_norm": 0.29190751445086704,
"acc_norm_stderr": 0.02447699407624734
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.24692737430167597,
"acc_stderr": 0.014422292204808835,
"acc_norm": 0.24692737430167597,
"acc_norm_stderr": 0.014422292204808835
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.25163398692810457,
"acc_stderr": 0.024848018263875195,
"acc_norm": 0.25163398692810457,
"acc_norm_stderr": 0.024848018263875195
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.2990353697749196,
"acc_stderr": 0.026003301117885135,
"acc_norm": 0.2990353697749196,
"acc_norm_stderr": 0.026003301117885135
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.2932098765432099,
"acc_stderr": 0.02532988817190092,
"acc_norm": 0.2932098765432099,
"acc_norm_stderr": 0.02532988817190092
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.24822695035460993,
"acc_stderr": 0.0257700156442904,
"acc_norm": 0.24822695035460993,
"acc_norm_stderr": 0.0257700156442904
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.27053455019556716,
"acc_stderr": 0.011345996743539264,
"acc_norm": 0.27053455019556716,
"acc_norm_stderr": 0.011345996743539264
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.16544117647058823,
"acc_stderr": 0.022571771025494767,
"acc_norm": 0.16544117647058823,
"acc_norm_stderr": 0.022571771025494767
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.2761437908496732,
"acc_stderr": 0.018087276935663137,
"acc_norm": 0.2761437908496732,
"acc_norm_stderr": 0.018087276935663137
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.20909090909090908,
"acc_stderr": 0.038950910157241364,
"acc_norm": 0.20909090909090908,
"acc_norm_stderr": 0.038950910157241364
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.24081632653061225,
"acc_stderr": 0.027372942201788163,
"acc_norm": 0.24081632653061225,
"acc_norm_stderr": 0.027372942201788163
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.24875621890547264,
"acc_stderr": 0.030567675938916707,
"acc_norm": 0.24875621890547264,
"acc_norm_stderr": 0.030567675938916707
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.25,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.25,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-virology|5": {
"acc": 0.20481927710843373,
"acc_stderr": 0.03141784291663926,
"acc_norm": 0.20481927710843373,
"acc_norm_stderr": 0.03141784291663926
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.29239766081871343,
"acc_stderr": 0.034886477134579215,
"acc_norm": 0.29239766081871343,
"acc_norm_stderr": 0.034886477134579215
},
"harness|truthfulqa:mc|0": {
"mc1": 0.2215422276621787,
"mc1_stderr": 0.01453786760130114,
"mc2": NaN,
"mc2_stderr": NaN
},
"harness|winogrande|5": {
"acc": 0.5043409629044988,
"acc_stderr": 0.014051956064076906
},
"harness|gsm8k|5": {
"acc": 0.0,
"acc_stderr": 0.0
}
}
```
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases, and limitations of the dataset. More information is needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
Repton/testing_embeddings | ---
license: mit
---
|
RITESHRAJ/Ring-FRC | ---
license: apache-2.0
---
|
htdung167/common-voice-15-preprocessed | ---
dataset_info:
features:
- name: audio
dtype:
audio:
sampling_rate: 16000
- name: original_sentence
dtype: string
- name: preprocessed_sentence
dtype: string
splits:
- name: train
num_bytes: 93566852.04
num_examples: 2835
- name: test
num_bytes: 34744852.9
num_examples: 1290
download_size: 112056156
dataset_size: 128311704.94
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
---
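The split statistics above imply a rough per-example footprint. A minimal stdlib sketch (byte and example counts copied from the `dataset_info` block):

```python
# Rough average size per audio example, from the split stats above.
splits = {
    "train": (93566852.04, 2835),  # (num_bytes, num_examples)
    "test": (34744852.9, 1290),
}
for name, (num_bytes, num_examples) in splits.items():
    kib = num_bytes / num_examples / 1024
    print(f"{name}: {num_examples} examples, ~{kib:.0f} KiB each")
```

Train and test examples average roughly 32 KiB and 26 KiB respectively, consistent with short 16 kHz audio clips.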
|
AdapterOcean/code_instructions_standardized_cluster_15_alpaca | ---
dataset_info:
features:
- name: input
dtype: string
- name: output
dtype: string
splits:
- name: train
num_bytes: 6283037
num_examples: 3567
download_size: 2846492
dataset_size: 6283037
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "code_instructions_standardized_cluster_15_alpaca"
[More Information Needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
iulusoy/test-datasetdict-2 | ---
dataset_info:
features:
- name: text
dtype: string
- name: label
dtype:
class_label:
names:
'0': neg
'1': pos
splits:
- name: train
num_bytes: 1074806
num_examples: 8530
- name: validation
num_bytes: 134675
num_examples: 1066
- name: test
num_bytes: 135968
num_examples: 1066
download_size: 881052
dataset_size: 1345449
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
- split: test
path: data/test-*
---
# Dataset Card for "test-datasetdict-2"
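The `label` column above is a `class_label` feature, which stores integer ids and maps them to the names `neg`/`pos`. A minimal plain-Python sketch of the id-to-name mapping such a feature encodes (no `datasets` dependency needed for the illustration):

```python
# id <-> name mapping implied by the class_label names above:
# the stored integer is an index into the names list.
names = ["neg", "pos"]
label2id = {name: i for i, name in enumerate(names)}
print(label2id)  # -> {'neg': 0, 'pos': 1}
```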
[More Information Needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
open-llm-leaderboard/details_eldogbbhed__NeuralMonarchCoderPearlBeagle-T3Q-Mistral-Orca-Math-DPO-7b | ---
pretty_name: Evaluation run of eldogbbhed/NeuralMonarchCoderPearlBeagle-T3Q-Mistral-Orca-Math-DPO-7b
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [eldogbbhed/NeuralMonarchCoderPearlBeagle-T3Q-Mistral-Orca-Math-DPO-7b](https://huggingface.co/eldogbbhed/NeuralMonarchCoderPearlBeagle-T3Q-Mistral-Orca-Math-DPO-7b)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_eldogbbhed__NeuralMonarchCoderPearlBeagle-T3Q-Mistral-Orca-Math-DPO-7b\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-03-22T00:53:52.237535](https://huggingface.co/datasets/open-llm-leaderboard/details_eldogbbhed__NeuralMonarchCoderPearlBeagle-T3Q-Mistral-Orca-Math-DPO-7b/blob/main/results_2024-03-22T00-53-52.237535.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6549872015506653,\n\
\ \"acc_stderr\": 0.03208964399399072,\n \"acc_norm\": 0.6547126636066226,\n\
\ \"acc_norm_stderr\": 0.03275569554392501,\n \"mc1\": 0.5556915544675642,\n\
\ \"mc1_stderr\": 0.01739458625074318,\n \"mc2\": 0.7145466676124104,\n\
\ \"mc2_stderr\": 0.014492892683432654\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.6834470989761092,\n \"acc_stderr\": 0.013592431519068079,\n\
\ \"acc_norm\": 0.71160409556314,\n \"acc_norm_stderr\": 0.013238394422428171\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.7013543118900617,\n\
\ \"acc_stderr\": 0.004567287775700566,\n \"acc_norm\": 0.882194781915953,\n\
\ \"acc_norm_stderr\": 0.003217184906847944\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695235,\n \
\ \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695235\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6444444444444445,\n\
\ \"acc_stderr\": 0.04135176749720385,\n \"acc_norm\": 0.6444444444444445,\n\
\ \"acc_norm_stderr\": 0.04135176749720385\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.6842105263157895,\n \"acc_stderr\": 0.0378272898086547,\n\
\ \"acc_norm\": 0.6842105263157895,\n \"acc_norm_stderr\": 0.0378272898086547\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.64,\n\
\ \"acc_stderr\": 0.04824181513244218,\n \"acc_norm\": 0.64,\n \
\ \"acc_norm_stderr\": 0.04824181513244218\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.7056603773584905,\n \"acc_stderr\": 0.02804918631569525,\n\
\ \"acc_norm\": 0.7056603773584905,\n \"acc_norm_stderr\": 0.02804918631569525\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7777777777777778,\n\
\ \"acc_stderr\": 0.03476590104304134,\n \"acc_norm\": 0.7777777777777778,\n\
\ \"acc_norm_stderr\": 0.03476590104304134\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.48,\n \"acc_stderr\": 0.050211673156867795,\n \
\ \"acc_norm\": 0.48,\n \"acc_norm_stderr\": 0.050211673156867795\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"\
acc\": 0.58,\n \"acc_stderr\": 0.049604496374885836,\n \"acc_norm\"\
: 0.58,\n \"acc_norm_stderr\": 0.049604496374885836\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \
\ \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6936416184971098,\n\
\ \"acc_stderr\": 0.03514942551267439,\n \"acc_norm\": 0.6936416184971098,\n\
\ \"acc_norm_stderr\": 0.03514942551267439\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.43137254901960786,\n \"acc_stderr\": 0.04928099597287534,\n\
\ \"acc_norm\": 0.43137254901960786,\n \"acc_norm_stderr\": 0.04928099597287534\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.77,\n \"acc_stderr\": 0.04229525846816508,\n \"acc_norm\": 0.77,\n\
\ \"acc_norm_stderr\": 0.04229525846816508\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.5872340425531914,\n \"acc_stderr\": 0.03218471141400352,\n\
\ \"acc_norm\": 0.5872340425531914,\n \"acc_norm_stderr\": 0.03218471141400352\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.47368421052631576,\n\
\ \"acc_stderr\": 0.046970851366478626,\n \"acc_norm\": 0.47368421052631576,\n\
\ \"acc_norm_stderr\": 0.046970851366478626\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5862068965517241,\n \"acc_stderr\": 0.04104269211806232,\n\
\ \"acc_norm\": 0.5862068965517241,\n \"acc_norm_stderr\": 0.04104269211806232\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.42328042328042326,\n \"acc_stderr\": 0.025446365634406783,\n \"\
acc_norm\": 0.42328042328042326,\n \"acc_norm_stderr\": 0.025446365634406783\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.47619047619047616,\n\
\ \"acc_stderr\": 0.04467062628403273,\n \"acc_norm\": 0.47619047619047616,\n\
\ \"acc_norm_stderr\": 0.04467062628403273\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.37,\n \"acc_stderr\": 0.04852365870939099,\n \
\ \"acc_norm\": 0.37,\n \"acc_norm_stderr\": 0.04852365870939099\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7774193548387097,\n\
\ \"acc_stderr\": 0.02366421667164251,\n \"acc_norm\": 0.7774193548387097,\n\
\ \"acc_norm_stderr\": 0.02366421667164251\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.5369458128078818,\n \"acc_stderr\": 0.035083705204426656,\n\
\ \"acc_norm\": 0.5369458128078818,\n \"acc_norm_stderr\": 0.035083705204426656\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.7,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\"\
: 0.7,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7757575757575758,\n \"acc_stderr\": 0.032568666616811015,\n\
\ \"acc_norm\": 0.7757575757575758,\n \"acc_norm_stderr\": 0.032568666616811015\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.797979797979798,\n \"acc_stderr\": 0.028606204289229872,\n \"\
acc_norm\": 0.797979797979798,\n \"acc_norm_stderr\": 0.028606204289229872\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.8963730569948186,\n \"acc_stderr\": 0.02199531196364424,\n\
\ \"acc_norm\": 0.8963730569948186,\n \"acc_norm_stderr\": 0.02199531196364424\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.6820512820512821,\n \"acc_stderr\": 0.023610884308927865,\n\
\ \"acc_norm\": 0.6820512820512821,\n \"acc_norm_stderr\": 0.023610884308927865\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.3148148148148148,\n \"acc_stderr\": 0.02831753349606649,\n \
\ \"acc_norm\": 0.3148148148148148,\n \"acc_norm_stderr\": 0.02831753349606649\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.6890756302521008,\n \"acc_stderr\": 0.030066761582977927,\n\
\ \"acc_norm\": 0.6890756302521008,\n \"acc_norm_stderr\": 0.030066761582977927\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.3708609271523179,\n \"acc_stderr\": 0.03943966699183629,\n \"\
acc_norm\": 0.3708609271523179,\n \"acc_norm_stderr\": 0.03943966699183629\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8422018348623853,\n \"acc_stderr\": 0.015630022970092434,\n \"\
acc_norm\": 0.8422018348623853,\n \"acc_norm_stderr\": 0.015630022970092434\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.5185185185185185,\n \"acc_stderr\": 0.03407632093854051,\n \"\
acc_norm\": 0.5185185185185185,\n \"acc_norm_stderr\": 0.03407632093854051\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.8235294117647058,\n \"acc_stderr\": 0.026756401538078962,\n \"\
acc_norm\": 0.8235294117647058,\n \"acc_norm_stderr\": 0.026756401538078962\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.8143459915611815,\n \"acc_stderr\": 0.025310495376944856,\n \
\ \"acc_norm\": 0.8143459915611815,\n \"acc_norm_stderr\": 0.025310495376944856\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6681614349775785,\n\
\ \"acc_stderr\": 0.03160295143776679,\n \"acc_norm\": 0.6681614349775785,\n\
\ \"acc_norm_stderr\": 0.03160295143776679\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.8015267175572519,\n \"acc_stderr\": 0.034981493854624714,\n\
\ \"acc_norm\": 0.8015267175572519,\n \"acc_norm_stderr\": 0.034981493854624714\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.7520661157024794,\n \"acc_stderr\": 0.03941897526516302,\n \"\
acc_norm\": 0.7520661157024794,\n \"acc_norm_stderr\": 0.03941897526516302\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7685185185185185,\n\
\ \"acc_stderr\": 0.04077494709252627,\n \"acc_norm\": 0.7685185185185185,\n\
\ \"acc_norm_stderr\": 0.04077494709252627\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7607361963190185,\n \"acc_stderr\": 0.033519538795212696,\n\
\ \"acc_norm\": 0.7607361963190185,\n \"acc_norm_stderr\": 0.033519538795212696\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.45535714285714285,\n\
\ \"acc_stderr\": 0.047268355537191,\n \"acc_norm\": 0.45535714285714285,\n\
\ \"acc_norm_stderr\": 0.047268355537191\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7766990291262136,\n \"acc_stderr\": 0.04123553189891431,\n\
\ \"acc_norm\": 0.7766990291262136,\n \"acc_norm_stderr\": 0.04123553189891431\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8717948717948718,\n\
\ \"acc_stderr\": 0.02190190511507333,\n \"acc_norm\": 0.8717948717948718,\n\
\ \"acc_norm_stderr\": 0.02190190511507333\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.7,\n \"acc_stderr\": 0.046056618647183814,\n \
\ \"acc_norm\": 0.7,\n \"acc_norm_stderr\": 0.046056618647183814\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8237547892720306,\n\
\ \"acc_stderr\": 0.013625556907993464,\n \"acc_norm\": 0.8237547892720306,\n\
\ \"acc_norm_stderr\": 0.013625556907993464\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.7398843930635838,\n \"acc_stderr\": 0.023618678310069367,\n\
\ \"acc_norm\": 0.7398843930635838,\n \"acc_norm_stderr\": 0.023618678310069367\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.4033519553072626,\n\
\ \"acc_stderr\": 0.016407123032195253,\n \"acc_norm\": 0.4033519553072626,\n\
\ \"acc_norm_stderr\": 0.016407123032195253\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.7156862745098039,\n \"acc_stderr\": 0.025829163272757482,\n\
\ \"acc_norm\": 0.7156862745098039,\n \"acc_norm_stderr\": 0.025829163272757482\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7202572347266881,\n\
\ \"acc_stderr\": 0.025494259350694912,\n \"acc_norm\": 0.7202572347266881,\n\
\ \"acc_norm_stderr\": 0.025494259350694912\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.7376543209876543,\n \"acc_stderr\": 0.024477222856135114,\n\
\ \"acc_norm\": 0.7376543209876543,\n \"acc_norm_stderr\": 0.024477222856135114\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.46808510638297873,\n \"acc_stderr\": 0.029766675075873866,\n \
\ \"acc_norm\": 0.46808510638297873,\n \"acc_norm_stderr\": 0.029766675075873866\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.46870925684485004,\n\
\ \"acc_stderr\": 0.012745204626083133,\n \"acc_norm\": 0.46870925684485004,\n\
\ \"acc_norm_stderr\": 0.012745204626083133\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.6985294117647058,\n \"acc_stderr\": 0.027875982114273168,\n\
\ \"acc_norm\": 0.6985294117647058,\n \"acc_norm_stderr\": 0.027875982114273168\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6830065359477124,\n \"acc_stderr\": 0.01882421951270621,\n \
\ \"acc_norm\": 0.6830065359477124,\n \"acc_norm_stderr\": 0.01882421951270621\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.7090909090909091,\n\
\ \"acc_stderr\": 0.04350271442923243,\n \"acc_norm\": 0.7090909090909091,\n\
\ \"acc_norm_stderr\": 0.04350271442923243\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.7428571428571429,\n \"acc_stderr\": 0.02797982353874455,\n\
\ \"acc_norm\": 0.7428571428571429,\n \"acc_norm_stderr\": 0.02797982353874455\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8308457711442786,\n\
\ \"acc_stderr\": 0.02650859065623327,\n \"acc_norm\": 0.8308457711442786,\n\
\ \"acc_norm_stderr\": 0.02650859065623327\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.85,\n \"acc_stderr\": 0.0358870281282637,\n \
\ \"acc_norm\": 0.85,\n \"acc_norm_stderr\": 0.0358870281282637\n },\n\
\ \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.536144578313253,\n\
\ \"acc_stderr\": 0.038823108508905954,\n \"acc_norm\": 0.536144578313253,\n\
\ \"acc_norm_stderr\": 0.038823108508905954\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.847953216374269,\n \"acc_stderr\": 0.027539122889061456,\n\
\ \"acc_norm\": 0.847953216374269,\n \"acc_norm_stderr\": 0.027539122889061456\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.5556915544675642,\n\
\ \"mc1_stderr\": 0.01739458625074318,\n \"mc2\": 0.7145466676124104,\n\
\ \"mc2_stderr\": 0.014492892683432654\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.824782951854775,\n \"acc_stderr\": 0.010684179227706167\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.7103866565579985,\n \
\ \"acc_stderr\": 0.012493927348659629\n }\n}\n```"
repo_url: https://huggingface.co/eldogbbhed/NeuralMonarchCoderPearlBeagle-T3Q-Mistral-Orca-Math-DPO-7b
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_03_22T00_53_52.237535
path:
- '**/details_harness|arc:challenge|25_2024-03-22T00-53-52.237535.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-03-22T00-53-52.237535.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_03_22T00_53_52.237535
path:
- '**/details_harness|gsm8k|5_2024-03-22T00-53-52.237535.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-03-22T00-53-52.237535.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_03_22T00_53_52.237535
path:
- '**/details_harness|hellaswag|10_2024-03-22T00-53-52.237535.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-03-22T00-53-52.237535.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_03_22T00_53_52.237535
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-22T00-53-52.237535.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-22T00-53-52.237535.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-22T00-53-52.237535.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-22T00-53-52.237535.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-22T00-53-52.237535.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-22T00-53-52.237535.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-22T00-53-52.237535.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-22T00-53-52.237535.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-22T00-53-52.237535.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-22T00-53-52.237535.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-22T00-53-52.237535.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-22T00-53-52.237535.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-22T00-53-52.237535.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-22T00-53-52.237535.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-22T00-53-52.237535.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-22T00-53-52.237535.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-22T00-53-52.237535.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-22T00-53-52.237535.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-22T00-53-52.237535.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-22T00-53-52.237535.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-22T00-53-52.237535.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-22T00-53-52.237535.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-22T00-53-52.237535.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-22T00-53-52.237535.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-22T00-53-52.237535.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-22T00-53-52.237535.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-22T00-53-52.237535.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-22T00-53-52.237535.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-22T00-53-52.237535.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-22T00-53-52.237535.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-22T00-53-52.237535.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-22T00-53-52.237535.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-22T00-53-52.237535.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-22T00-53-52.237535.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-22T00-53-52.237535.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-22T00-53-52.237535.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-22T00-53-52.237535.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-22T00-53-52.237535.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-22T00-53-52.237535.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-22T00-53-52.237535.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-22T00-53-52.237535.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-22T00-53-52.237535.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-22T00-53-52.237535.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-22T00-53-52.237535.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-22T00-53-52.237535.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-22T00-53-52.237535.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-22T00-53-52.237535.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-22T00-53-52.237535.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-22T00-53-52.237535.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-22T00-53-52.237535.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-22T00-53-52.237535.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-22T00-53-52.237535.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-22T00-53-52.237535.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-22T00-53-52.237535.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-22T00-53-52.237535.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-22T00-53-52.237535.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-22T00-53-52.237535.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-22T00-53-52.237535.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-22T00-53-52.237535.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-22T00-53-52.237535.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-22T00-53-52.237535.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-22T00-53-52.237535.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-22T00-53-52.237535.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-22T00-53-52.237535.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-22T00-53-52.237535.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-22T00-53-52.237535.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-22T00-53-52.237535.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-22T00-53-52.237535.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-22T00-53-52.237535.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-22T00-53-52.237535.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-22T00-53-52.237535.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-22T00-53-52.237535.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-22T00-53-52.237535.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-22T00-53-52.237535.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-22T00-53-52.237535.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-22T00-53-52.237535.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-22T00-53-52.237535.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-22T00-53-52.237535.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-22T00-53-52.237535.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-22T00-53-52.237535.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-22T00-53-52.237535.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-22T00-53-52.237535.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-22T00-53-52.237535.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-22T00-53-52.237535.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-22T00-53-52.237535.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-22T00-53-52.237535.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-22T00-53-52.237535.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-22T00-53-52.237535.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-22T00-53-52.237535.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-22T00-53-52.237535.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-22T00-53-52.237535.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-22T00-53-52.237535.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-22T00-53-52.237535.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-22T00-53-52.237535.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-22T00-53-52.237535.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-22T00-53-52.237535.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-22T00-53-52.237535.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-22T00-53-52.237535.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-22T00-53-52.237535.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-22T00-53-52.237535.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-22T00-53-52.237535.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-22T00-53-52.237535.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-22T00-53-52.237535.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-22T00-53-52.237535.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-22T00-53-52.237535.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-22T00-53-52.237535.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-22T00-53-52.237535.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-22T00-53-52.237535.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-22T00-53-52.237535.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-22T00-53-52.237535.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-22T00-53-52.237535.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-22T00-53-52.237535.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-22T00-53-52.237535.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-22T00-53-52.237535.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_03_22T00_53_52.237535
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-22T00-53-52.237535.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-22T00-53-52.237535.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_03_22T00_53_52.237535
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-22T00-53-52.237535.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-22T00-53-52.237535.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_03_22T00_53_52.237535
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-22T00-53-52.237535.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-22T00-53-52.237535.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_03_22T00_53_52.237535
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-22T00-53-52.237535.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-22T00-53-52.237535.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_03_22T00_53_52.237535
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-22T00-53-52.237535.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-22T00-53-52.237535.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_03_22T00_53_52.237535
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-22T00-53-52.237535.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-22T00-53-52.237535.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_03_22T00_53_52.237535
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-22T00-53-52.237535.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-22T00-53-52.237535.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_03_22T00_53_52.237535
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-22T00-53-52.237535.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-22T00-53-52.237535.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_03_22T00_53_52.237535
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-22T00-53-52.237535.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-22T00-53-52.237535.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_03_22T00_53_52.237535
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-22T00-53-52.237535.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-22T00-53-52.237535.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_03_22T00_53_52.237535
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-22T00-53-52.237535.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-22T00-53-52.237535.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_03_22T00_53_52.237535
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-22T00-53-52.237535.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-22T00-53-52.237535.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_03_22T00_53_52.237535
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-22T00-53-52.237535.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-22T00-53-52.237535.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_03_22T00_53_52.237535
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-22T00-53-52.237535.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-22T00-53-52.237535.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_03_22T00_53_52.237535
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-22T00-53-52.237535.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-22T00-53-52.237535.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_03_22T00_53_52.237535
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-22T00-53-52.237535.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-22T00-53-52.237535.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_03_22T00_53_52.237535
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-22T00-53-52.237535.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-22T00-53-52.237535.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_03_22T00_53_52.237535
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-22T00-53-52.237535.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-22T00-53-52.237535.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_03_22T00_53_52.237535
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-22T00-53-52.237535.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-22T00-53-52.237535.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_03_22T00_53_52.237535
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-22T00-53-52.237535.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-22T00-53-52.237535.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_03_22T00_53_52.237535
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-22T00-53-52.237535.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-22T00-53-52.237535.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_03_22T00_53_52.237535
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-22T00-53-52.237535.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-22T00-53-52.237535.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_03_22T00_53_52.237535
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-22T00-53-52.237535.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-22T00-53-52.237535.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_03_22T00_53_52.237535
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-22T00-53-52.237535.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-22T00-53-52.237535.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_03_22T00_53_52.237535
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-22T00-53-52.237535.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-22T00-53-52.237535.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_03_22T00_53_52.237535
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-22T00-53-52.237535.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-22T00-53-52.237535.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_03_22T00_53_52.237535
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-22T00-53-52.237535.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-22T00-53-52.237535.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_03_22T00_53_52.237535
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-22T00-53-52.237535.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-22T00-53-52.237535.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_03_22T00_53_52.237535
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-22T00-53-52.237535.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-22T00-53-52.237535.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_03_22T00_53_52.237535
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-22T00-53-52.237535.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-22T00-53-52.237535.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_03_22T00_53_52.237535
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-22T00-53-52.237535.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-22T00-53-52.237535.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_03_22T00_53_52.237535
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-22T00-53-52.237535.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-22T00-53-52.237535.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_03_22T00_53_52.237535
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-22T00-53-52.237535.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-22T00-53-52.237535.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_03_22T00_53_52.237535
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-22T00-53-52.237535.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-22T00-53-52.237535.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_03_22T00_53_52.237535
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-22T00-53-52.237535.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-22T00-53-52.237535.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_03_22T00_53_52.237535
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-22T00-53-52.237535.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-22T00-53-52.237535.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_03_22T00_53_52.237535
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-22T00-53-52.237535.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-22T00-53-52.237535.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_03_22T00_53_52.237535
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-22T00-53-52.237535.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-22T00-53-52.237535.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_03_22T00_53_52.237535
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-22T00-53-52.237535.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-22T00-53-52.237535.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_03_22T00_53_52.237535
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-22T00-53-52.237535.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-22T00-53-52.237535.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_03_22T00_53_52.237535
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-22T00-53-52.237535.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-22T00-53-52.237535.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_03_22T00_53_52.237535
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-22T00-53-52.237535.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-22T00-53-52.237535.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_03_22T00_53_52.237535
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-22T00-53-52.237535.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-22T00-53-52.237535.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_03_22T00_53_52.237535
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-22T00-53-52.237535.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-22T00-53-52.237535.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_03_22T00_53_52.237535
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-22T00-53-52.237535.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-22T00-53-52.237535.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_03_22T00_53_52.237535
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-22T00-53-52.237535.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-22T00-53-52.237535.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_03_22T00_53_52.237535
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-22T00-53-52.237535.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-22T00-53-52.237535.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_03_22T00_53_52.237535
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-22T00-53-52.237535.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-22T00-53-52.237535.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_03_22T00_53_52.237535
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-22T00-53-52.237535.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-22T00-53-52.237535.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_03_22T00_53_52.237535
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-22T00-53-52.237535.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-22T00-53-52.237535.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_03_22T00_53_52.237535
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-22T00-53-52.237535.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-22T00-53-52.237535.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_03_22T00_53_52.237535
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-22T00-53-52.237535.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-22T00-53-52.237535.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_03_22T00_53_52.237535
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-22T00-53-52.237535.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-22T00-53-52.237535.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_03_22T00_53_52.237535
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-22T00-53-52.237535.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-22T00-53-52.237535.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_03_22T00_53_52.237535
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-22T00-53-52.237535.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-22T00-53-52.237535.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_03_22T00_53_52.237535
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-22T00-53-52.237535.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-22T00-53-52.237535.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_03_22T00_53_52.237535
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-22T00-53-52.237535.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-22T00-53-52.237535.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_03_22T00_53_52.237535
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-22T00-53-52.237535.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-22T00-53-52.237535.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_03_22T00_53_52.237535
path:
- '**/details_harness|winogrande|5_2024-03-22T00-53-52.237535.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-03-22T00-53-52.237535.parquet'
- config_name: results
data_files:
- split: 2024_03_22T00_53_52.237535
path:
- results_2024-03-22T00-53-52.237535.parquet
- split: latest
path:
- results_2024-03-22T00-53-52.237535.parquet
---
# Dataset Card for Evaluation run of eldogbbhed/NeuralMonarchCoderPearlBeagle-T3Q-Mistral-Orca-Math-DPO-7b
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [eldogbbhed/NeuralMonarchCoderPearlBeagle-T3Q-Mistral-Orca-Math-DPO-7b](https://huggingface.co/eldogbbhed/NeuralMonarchCoderPearlBeagle-T3Q-Mistral-Orca-Math-DPO-7b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_eldogbbhed__NeuralMonarchCoderPearlBeagle-T3Q-Mistral-Orca-Math-DPO-7b",
"harness_winogrande_5",
	split="latest")
```
## Latest results
These are the [latest results from run 2024-03-22T00:53:52.237535](https://huggingface.co/datasets/open-llm-leaderboard/details_eldogbbhed__NeuralMonarchCoderPearlBeagle-T3Q-Mistral-Orca-Math-DPO-7b/blob/main/results_2024-03-22T00-53-52.237535.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; each task's results are available in its own configuration, under the timestamped and "latest" splits):
```json
{
"all": {
"acc": 0.6549872015506653,
"acc_stderr": 0.03208964399399072,
"acc_norm": 0.6547126636066226,
"acc_norm_stderr": 0.03275569554392501,
"mc1": 0.5556915544675642,
"mc1_stderr": 0.01739458625074318,
"mc2": 0.7145466676124104,
"mc2_stderr": 0.014492892683432654
},
"harness|arc:challenge|25": {
"acc": 0.6834470989761092,
"acc_stderr": 0.013592431519068079,
"acc_norm": 0.71160409556314,
"acc_norm_stderr": 0.013238394422428171
},
"harness|hellaswag|10": {
"acc": 0.7013543118900617,
"acc_stderr": 0.004567287775700566,
"acc_norm": 0.882194781915953,
"acc_norm_stderr": 0.003217184906847944
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695235,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695235
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6444444444444445,
"acc_stderr": 0.04135176749720385,
"acc_norm": 0.6444444444444445,
"acc_norm_stderr": 0.04135176749720385
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6842105263157895,
"acc_stderr": 0.0378272898086547,
"acc_norm": 0.6842105263157895,
"acc_norm_stderr": 0.0378272898086547
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.64,
"acc_stderr": 0.04824181513244218,
"acc_norm": 0.64,
"acc_norm_stderr": 0.04824181513244218
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.7056603773584905,
"acc_stderr": 0.02804918631569525,
"acc_norm": 0.7056603773584905,
"acc_norm_stderr": 0.02804918631569525
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7777777777777778,
"acc_stderr": 0.03476590104304134,
"acc_norm": 0.7777777777777778,
"acc_norm_stderr": 0.03476590104304134
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.48,
"acc_stderr": 0.050211673156867795,
"acc_norm": 0.48,
"acc_norm_stderr": 0.050211673156867795
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.58,
"acc_stderr": 0.049604496374885836,
"acc_norm": 0.58,
"acc_norm_stderr": 0.049604496374885836
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6936416184971098,
"acc_stderr": 0.03514942551267439,
"acc_norm": 0.6936416184971098,
"acc_norm_stderr": 0.03514942551267439
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.43137254901960786,
"acc_stderr": 0.04928099597287534,
"acc_norm": 0.43137254901960786,
"acc_norm_stderr": 0.04928099597287534
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.77,
"acc_stderr": 0.04229525846816508,
"acc_norm": 0.77,
"acc_norm_stderr": 0.04229525846816508
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5872340425531914,
"acc_stderr": 0.03218471141400352,
"acc_norm": 0.5872340425531914,
"acc_norm_stderr": 0.03218471141400352
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.47368421052631576,
"acc_stderr": 0.046970851366478626,
"acc_norm": 0.47368421052631576,
"acc_norm_stderr": 0.046970851366478626
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5862068965517241,
"acc_stderr": 0.04104269211806232,
"acc_norm": 0.5862068965517241,
"acc_norm_stderr": 0.04104269211806232
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.42328042328042326,
"acc_stderr": 0.025446365634406783,
"acc_norm": 0.42328042328042326,
"acc_norm_stderr": 0.025446365634406783
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.47619047619047616,
"acc_stderr": 0.04467062628403273,
"acc_norm": 0.47619047619047616,
"acc_norm_stderr": 0.04467062628403273
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.37,
"acc_stderr": 0.04852365870939099,
"acc_norm": 0.37,
"acc_norm_stderr": 0.04852365870939099
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7774193548387097,
"acc_stderr": 0.02366421667164251,
"acc_norm": 0.7774193548387097,
"acc_norm_stderr": 0.02366421667164251
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5369458128078818,
"acc_stderr": 0.035083705204426656,
"acc_norm": 0.5369458128078818,
"acc_norm_stderr": 0.035083705204426656
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.7,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.7,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7757575757575758,
"acc_stderr": 0.032568666616811015,
"acc_norm": 0.7757575757575758,
"acc_norm_stderr": 0.032568666616811015
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.797979797979798,
"acc_stderr": 0.028606204289229872,
"acc_norm": 0.797979797979798,
"acc_norm_stderr": 0.028606204289229872
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8963730569948186,
"acc_stderr": 0.02199531196364424,
"acc_norm": 0.8963730569948186,
"acc_norm_stderr": 0.02199531196364424
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6820512820512821,
"acc_stderr": 0.023610884308927865,
"acc_norm": 0.6820512820512821,
"acc_norm_stderr": 0.023610884308927865
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.3148148148148148,
"acc_stderr": 0.02831753349606649,
"acc_norm": 0.3148148148148148,
"acc_norm_stderr": 0.02831753349606649
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6890756302521008,
"acc_stderr": 0.030066761582977927,
"acc_norm": 0.6890756302521008,
"acc_norm_stderr": 0.030066761582977927
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.3708609271523179,
"acc_stderr": 0.03943966699183629,
"acc_norm": 0.3708609271523179,
"acc_norm_stderr": 0.03943966699183629
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8422018348623853,
"acc_stderr": 0.015630022970092434,
"acc_norm": 0.8422018348623853,
"acc_norm_stderr": 0.015630022970092434
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5185185185185185,
"acc_stderr": 0.03407632093854051,
"acc_norm": 0.5185185185185185,
"acc_norm_stderr": 0.03407632093854051
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8235294117647058,
"acc_stderr": 0.026756401538078962,
"acc_norm": 0.8235294117647058,
"acc_norm_stderr": 0.026756401538078962
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.8143459915611815,
"acc_stderr": 0.025310495376944856,
"acc_norm": 0.8143459915611815,
"acc_norm_stderr": 0.025310495376944856
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6681614349775785,
"acc_stderr": 0.03160295143776679,
"acc_norm": 0.6681614349775785,
"acc_norm_stderr": 0.03160295143776679
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.8015267175572519,
"acc_stderr": 0.034981493854624714,
"acc_norm": 0.8015267175572519,
"acc_norm_stderr": 0.034981493854624714
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7520661157024794,
"acc_stderr": 0.03941897526516302,
"acc_norm": 0.7520661157024794,
"acc_norm_stderr": 0.03941897526516302
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7685185185185185,
"acc_stderr": 0.04077494709252627,
"acc_norm": 0.7685185185185185,
"acc_norm_stderr": 0.04077494709252627
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7607361963190185,
"acc_stderr": 0.033519538795212696,
"acc_norm": 0.7607361963190185,
"acc_norm_stderr": 0.033519538795212696
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.45535714285714285,
"acc_stderr": 0.047268355537191,
"acc_norm": 0.45535714285714285,
"acc_norm_stderr": 0.047268355537191
},
"harness|hendrycksTest-management|5": {
"acc": 0.7766990291262136,
"acc_stderr": 0.04123553189891431,
"acc_norm": 0.7766990291262136,
"acc_norm_stderr": 0.04123553189891431
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8717948717948718,
"acc_stderr": 0.02190190511507333,
"acc_norm": 0.8717948717948718,
"acc_norm_stderr": 0.02190190511507333
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.7,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.7,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8237547892720306,
"acc_stderr": 0.013625556907993464,
"acc_norm": 0.8237547892720306,
"acc_norm_stderr": 0.013625556907993464
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7398843930635838,
"acc_stderr": 0.023618678310069367,
"acc_norm": 0.7398843930635838,
"acc_norm_stderr": 0.023618678310069367
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.4033519553072626,
"acc_stderr": 0.016407123032195253,
"acc_norm": 0.4033519553072626,
"acc_norm_stderr": 0.016407123032195253
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7156862745098039,
"acc_stderr": 0.025829163272757482,
"acc_norm": 0.7156862745098039,
"acc_norm_stderr": 0.025829163272757482
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7202572347266881,
"acc_stderr": 0.025494259350694912,
"acc_norm": 0.7202572347266881,
"acc_norm_stderr": 0.025494259350694912
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7376543209876543,
"acc_stderr": 0.024477222856135114,
"acc_norm": 0.7376543209876543,
"acc_norm_stderr": 0.024477222856135114
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.46808510638297873,
"acc_stderr": 0.029766675075873866,
"acc_norm": 0.46808510638297873,
"acc_norm_stderr": 0.029766675075873866
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.46870925684485004,
"acc_stderr": 0.012745204626083133,
"acc_norm": 0.46870925684485004,
"acc_norm_stderr": 0.012745204626083133
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6985294117647058,
"acc_stderr": 0.027875982114273168,
"acc_norm": 0.6985294117647058,
"acc_norm_stderr": 0.027875982114273168
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6830065359477124,
"acc_stderr": 0.01882421951270621,
"acc_norm": 0.6830065359477124,
"acc_norm_stderr": 0.01882421951270621
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.7090909090909091,
"acc_stderr": 0.04350271442923243,
"acc_norm": 0.7090909090909091,
"acc_norm_stderr": 0.04350271442923243
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7428571428571429,
"acc_stderr": 0.02797982353874455,
"acc_norm": 0.7428571428571429,
"acc_norm_stderr": 0.02797982353874455
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8308457711442786,
"acc_stderr": 0.02650859065623327,
"acc_norm": 0.8308457711442786,
"acc_norm_stderr": 0.02650859065623327
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.85,
"acc_stderr": 0.0358870281282637,
"acc_norm": 0.85,
"acc_norm_stderr": 0.0358870281282637
},
"harness|hendrycksTest-virology|5": {
"acc": 0.536144578313253,
"acc_stderr": 0.038823108508905954,
"acc_norm": 0.536144578313253,
"acc_norm_stderr": 0.038823108508905954
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.847953216374269,
"acc_stderr": 0.027539122889061456,
"acc_norm": 0.847953216374269,
"acc_norm_stderr": 0.027539122889061456
},
"harness|truthfulqa:mc|0": {
"mc1": 0.5556915544675642,
"mc1_stderr": 0.01739458625074318,
"mc2": 0.7145466676124104,
"mc2_stderr": 0.014492892683432654
},
"harness|winogrande|5": {
"acc": 0.824782951854775,
"acc_stderr": 0.010684179227706167
},
"harness|gsm8k|5": {
"acc": 0.7103866565579985,
"acc_stderr": 0.012493927348659629
}
}
```
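Once loaded, the results above are just a nested dict keyed by `harness|<task>|<n_shots>`, with an `"all"` entry holding the aggregates. As a minimal sketch (using a hand-copied excerpt of the values printed above, rather than re-downloading the dataset), the headline per-benchmark scores can be pulled out like this:

```python
# Excerpt of the results dict shown above (assumption: in practice you would
# obtain the full dict by loading the "results" config of this dataset).
results = {
    "all": {"acc": 0.6549872015506653, "mc2": 0.7145466676124104},
    "harness|arc:challenge|25": {"acc_norm": 0.71160409556314},
    "harness|hellaswag|10": {"acc_norm": 0.882194781915953},
    "harness|winogrande|5": {"acc": 0.824782951854775},
    "harness|gsm8k|5": {"acc": 0.7103866565579985},
}

# Map each benchmark to the metric the leaderboard reports for it.
headline = {
    "ARC (acc_norm)": results["harness|arc:challenge|25"]["acc_norm"],
    "HellaSwag (acc_norm)": results["harness|hellaswag|10"]["acc_norm"],
    "TruthfulQA (mc2)": results["all"]["mc2"],
    "Winogrande (acc)": results["harness|winogrande|5"]["acc"],
    "GSM8K (acc)": results["harness|gsm8k|5"]["acc"],
}

for task, score in headline.items():
    print(f"{task}: {score:.4f}")
```

The same key pattern (`harness|hendrycksTest-<subject>|5`) applies to each of the 57 MMLU subjects listed above.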
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
MicPie/unpredictable_msdn-microsoft-com | ---
annotations_creators:
- no-annotation
language_creators:
- found
language:
- en
license:
- apache-2.0
multilinguality:
- monolingual
pretty_name: UnpredicTable-msdn-microsoft-com
size_categories:
- 100K<n<1M
source_datasets: []
task_categories:
- multiple-choice
- question-answering
- zero-shot-classification
- text2text-generation
- table-question-answering
- text-generation
- text-classification
- tabular-classification
task_ids:
- multiple-choice-qa
- extractive-qa
- open-domain-qa
- closed-domain-qa
- closed-book-qa
- open-book-qa
- language-modeling
- multi-class-classification
- natural-language-inference
- topic-classification
- multi-label-classification
- tabular-multi-class-classification
- tabular-multi-label-classification
---
# Dataset Card for "UnpredicTable-msdn-microsoft-com" - Dataset of Few-shot Tasks from Tables
## Table of Contents
- [Dataset Description](#dataset-description)
- [Dataset Summary](#dataset-summary)
- [Supported Tasks](#supported-tasks-and-leaderboards)
- [Languages](#languages)
- [Dataset Structure](#dataset-structure)
- [Data Instances](#data-instances)
- [Data Fields](#data-fields)
- [Data Splits](#data-splits)
- [Dataset Creation](#dataset-creation)
- [Curation Rationale](#curation-rationale)
- [Source Data](#source-data)
- [Annotations](#annotations)
- [Personal and Sensitive Information](#personal-and-sensitive-information)
- [Considerations for Using the Data](#considerations-for-using-the-data)
- [Social Impact of Dataset](#social-impact-of-dataset)
- [Discussion of Biases](#discussion-of-biases)
- [Other Known Limitations](#other-known-limitations)
- [Additional Information](#additional-information)
- [Dataset Curators](#dataset-curators)
- [Licensing Information](#licensing-information)
- [Citation Information](#citation-information)
## Dataset Description
- **Homepage:** https://ethanperez.net/unpredictable
- **Repository:** https://github.com/JunShern/few-shot-adaptation
- **Paper:** Few-shot Adaptation Works with UnpredicTable Data
- **Point of Contact:** junshern@nyu.edu, perez@nyu.edu
### Dataset Summary
The UnpredicTable dataset consists of web tables formatted as few-shot tasks for fine-tuning language models to improve their few-shot performance.
There are several dataset versions available:
* [UnpredicTable-full](https://huggingface.co/datasets/MicPie/unpredictable_full): Starting from the initial WTC corpus of 50M tables, we apply our tables-to-tasks procedure to produce our resulting dataset, [UnpredicTable-full](https://huggingface.co/datasets/MicPie/unpredictable_full), which comprises 413,299 tasks from 23,744 unique websites.
* [UnpredicTable-unique](https://huggingface.co/datasets/MicPie/unpredictable_unique): This is the same as [UnpredicTable-full](https://huggingface.co/datasets/MicPie/unpredictable_full) but filtered to have a maximum of one task per website. [UnpredicTable-unique](https://huggingface.co/datasets/MicPie/unpredictable_unique) contains exactly 23,744 tasks from 23,744 websites.
* [UnpredicTable-5k](https://huggingface.co/datasets/MicPie/unpredictable_5k): This dataset contains 5k random tables from the full dataset.
* UnpredicTable data subsets based on a manual human quality rating (please see our publication for details of the ratings):
* [UnpredicTable-rated-low](https://huggingface.co/datasets/MicPie/unpredictable_rated-low)
* [UnpredicTable-rated-medium](https://huggingface.co/datasets/MicPie/unpredictable_rated-medium)
* [UnpredicTable-rated-high](https://huggingface.co/datasets/MicPie/unpredictable_rated-high)
* UnpredicTable data subsets based on the website of origin:
* [UnpredicTable-baseball-fantasysports-yahoo-com](https://huggingface.co/datasets/MicPie/unpredictable_baseball-fantasysports-yahoo-com)
* [UnpredicTable-bulbapedia-bulbagarden-net](https://huggingface.co/datasets/MicPie/unpredictable_bulbapedia-bulbagarden-net)
* [UnpredicTable-cappex-com](https://huggingface.co/datasets/MicPie/unpredictable_cappex-com)
* [UnpredicTable-cram-com](https://huggingface.co/datasets/MicPie/unpredictable_cram-com)
* [UnpredicTable-dividend-com](https://huggingface.co/datasets/MicPie/unpredictable_dividend-com)
* [UnpredicTable-dummies-com](https://huggingface.co/datasets/MicPie/unpredictable_dummies-com)
* [UnpredicTable-en-wikipedia-org](https://huggingface.co/datasets/MicPie/unpredictable_en-wikipedia-org)
* [UnpredicTable-ensembl-org](https://huggingface.co/datasets/MicPie/unpredictable_ensembl-org)
* [UnpredicTable-gamefaqs-com](https://huggingface.co/datasets/MicPie/unpredictable_gamefaqs-com)
* [UnpredicTable-mgoblog-com](https://huggingface.co/datasets/MicPie/unpredictable_mgoblog-com)
* [UnpredicTable-mmo-champion-com](https://huggingface.co/datasets/MicPie/unpredictable_mmo-champion-com)
* [UnpredicTable-msdn-microsoft-com](https://huggingface.co/datasets/MicPie/unpredictable_msdn-microsoft-com)
* [UnpredicTable-phonearena-com](https://huggingface.co/datasets/MicPie/unpredictable_phonearena-com)
* [UnpredicTable-sittercity-com](https://huggingface.co/datasets/MicPie/unpredictable_sittercity-com)
* [UnpredicTable-sporcle-com](https://huggingface.co/datasets/MicPie/unpredictable_sporcle-com)
* [UnpredicTable-studystack-com](https://huggingface.co/datasets/MicPie/unpredictable_studystack-com)
* [UnpredicTable-support-google-com](https://huggingface.co/datasets/MicPie/unpredictable_support-google-com)
* [UnpredicTable-w3-org](https://huggingface.co/datasets/MicPie/unpredictable_w3-org)
* [UnpredicTable-wiki-openmoko-org](https://huggingface.co/datasets/MicPie/unpredictable_wiki-openmoko-org)
* [UnpredicTable-wkdu-org](https://huggingface.co/datasets/MicPie/unpredictable_wkdu-org)
* UnpredicTable data subsets based on clustering (for the clustering details please see our publication):
* [UnpredicTable-cluster00](https://huggingface.co/datasets/MicPie/unpredictable_cluster00)
* [UnpredicTable-cluster01](https://huggingface.co/datasets/MicPie/unpredictable_cluster01)
* [UnpredicTable-cluster02](https://huggingface.co/datasets/MicPie/unpredictable_cluster02)
* [UnpredicTable-cluster03](https://huggingface.co/datasets/MicPie/unpredictable_cluster03)
* [UnpredicTable-cluster04](https://huggingface.co/datasets/MicPie/unpredictable_cluster04)
* [UnpredicTable-cluster05](https://huggingface.co/datasets/MicPie/unpredictable_cluster05)
* [UnpredicTable-cluster06](https://huggingface.co/datasets/MicPie/unpredictable_cluster06)
* [UnpredicTable-cluster07](https://huggingface.co/datasets/MicPie/unpredictable_cluster07)
* [UnpredicTable-cluster08](https://huggingface.co/datasets/MicPie/unpredictable_cluster08)
* [UnpredicTable-cluster09](https://huggingface.co/datasets/MicPie/unpredictable_cluster09)
* [UnpredicTable-cluster10](https://huggingface.co/datasets/MicPie/unpredictable_cluster10)
* [UnpredicTable-cluster11](https://huggingface.co/datasets/MicPie/unpredictable_cluster11)
* [UnpredicTable-cluster12](https://huggingface.co/datasets/MicPie/unpredictable_cluster12)
* [UnpredicTable-cluster13](https://huggingface.co/datasets/MicPie/unpredictable_cluster13)
* [UnpredicTable-cluster14](https://huggingface.co/datasets/MicPie/unpredictable_cluster14)
* [UnpredicTable-cluster15](https://huggingface.co/datasets/MicPie/unpredictable_cluster15)
* [UnpredicTable-cluster16](https://huggingface.co/datasets/MicPie/unpredictable_cluster16)
* [UnpredicTable-cluster17](https://huggingface.co/datasets/MicPie/unpredictable_cluster17)
* [UnpredicTable-cluster18](https://huggingface.co/datasets/MicPie/unpredictable_cluster18)
* [UnpredicTable-cluster19](https://huggingface.co/datasets/MicPie/unpredictable_cluster19)
* [UnpredicTable-cluster20](https://huggingface.co/datasets/MicPie/unpredictable_cluster20)
* [UnpredicTable-cluster21](https://huggingface.co/datasets/MicPie/unpredictable_cluster21)
* [UnpredicTable-cluster22](https://huggingface.co/datasets/MicPie/unpredictable_cluster22)
* [UnpredicTable-cluster23](https://huggingface.co/datasets/MicPie/unpredictable_cluster23)
* [UnpredicTable-cluster24](https://huggingface.co/datasets/MicPie/unpredictable_cluster24)
* [UnpredicTable-cluster25](https://huggingface.co/datasets/MicPie/unpredictable_cluster25)
* [UnpredicTable-cluster26](https://huggingface.co/datasets/MicPie/unpredictable_cluster26)
* [UnpredicTable-cluster27](https://huggingface.co/datasets/MicPie/unpredictable_cluster27)
* [UnpredicTable-cluster28](https://huggingface.co/datasets/MicPie/unpredictable_cluster28)
* [UnpredicTable-cluster29](https://huggingface.co/datasets/MicPie/unpredictable_cluster29)
* [UnpredicTable-cluster-noise](https://huggingface.co/datasets/MicPie/unpredictable_cluster-noise)
### Supported Tasks and Leaderboards
Since the tables come from the web, the distribution of tasks and topics is very broad. Our dataset is very wide: it contains thousands of tasks, each with only a few examples, in contrast to most current NLP datasets, which are deep: tens of tasks, each with many examples. This means our dataset covers a broad range of potential tasks, e.g., multiple-choice, question-answering, table-question-answering, text-classification, etc.
The intended use of this dataset is to improve few-shot performance by fine-tuning/pre-training on our dataset.
### Languages
English
## Dataset Structure
### Data Instances
Each task is represented as a JSON Lines file and consists of several few-shot examples. Each example is a dictionary containing a field `task`, which identifies the task, followed by `input`, `options`, and `output` fields. The `input` field contains several column elements of the same row in the table, while the `output` field is a target that represents an individual column of the same row. Each task contains several such examples, which can be concatenated as a few-shot task. In the case of multiple-choice classification, the `options` field contains the possible classes that a model needs to choose from.
There are also additional metadata fields such as `pageTitle`, `title`, `outputColName`, `url`, and `wdcFile`.
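As a rough illustration of how a task's examples can be concatenated into a few-shot prompt, here is a minimal sketch. The example records below follow the schema described above but are invented for demonstration; the prompt template is an assumption, not the one used in the paper.

```python
import json

# Two illustrative examples following the described schema
# (task / input / options / output); the values are made up.
task_jsonl = "\n".join([
    '{"task": "demo", "input": "Product: Widget | Price: $5", '
    '"options": ["cheap", "expensive"], "output": "cheap"}',
    '{"task": "demo", "input": "Product: Gadget | Price: $500", '
    '"options": ["cheap", "expensive"], "output": "expensive"}',
])

examples = [json.loads(line) for line in task_jsonl.splitlines()]

def build_few_shot_prompt(examples, query_input):
    """Concatenate a task's examples into a single few-shot prompt string."""
    parts = [f"Input: {ex['input']}\nOutput: {ex['output']}" for ex in examples]
    parts.append(f"Input: {query_input}\nOutput:")
    return "\n\n".join(parts)

prompt = build_few_shot_prompt(examples, "Product: Trinket | Price: $2")
print(prompt)
```

Each labeled example contributes an input/output pair, and the unlabeled query is appended last so a language model can complete the final `Output:`.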
### Data Fields
- `task`: task identifier
- `input`: column elements of a specific row in the table
- `options`: for multiple-choice classification, the options to choose from
- `output`: target column element of the same row as the input
- `pageTitle`: the title of the page containing the table
- `outputColName`: output column name
- `url`: URL of the website containing the table
- `wdcFile`: WDC Web Table Corpus file
### Data Splits
The UnpredicTable datasets do not come with additional data splits.
## Dataset Creation
### Curation Rationale
Few-shot training on multi-task datasets has been demonstrated to improve language models' few-shot learning (FSL) performance on new tasks, but it is unclear which training tasks lead to effective downstream task adaptation. Few-shot learning datasets are typically produced with expensive human curation, limiting the scale and diversity of the training tasks available to study. As an alternative source of few-shot data, we automatically extract 413,299 tasks from diverse internet tables. We provide this as a research resource to investigate the relationship between training data and few-shot learning.
### Source Data
#### Initial Data Collection and Normalization
We use internet tables from the English-language Relational Subset of the WDC Web Table Corpus 2015 (WTC). The WTC dataset tables were extracted from the July 2015 Common Crawl web corpus (http://webdatacommons.org/webtables/2015/EnglishStatistics.html). The dataset contains 50,820,165 tables from 323,160 web domains. We then convert the tables into few-shot learning tasks. Please see our publication for more details on the data collection and conversion pipeline.
#### Who are the source language producers?
The dataset is extracted from [WDC Web Table Corpora](http://webdatacommons.org/webtables/).
### Annotations
#### Annotation process
Manual annotation was only carried out for the [UnpredicTable-rated-low](https://huggingface.co/datasets/MicPie/unpredictable_rated-low),
[UnpredicTable-rated-medium](https://huggingface.co/datasets/MicPie/unpredictable_rated-medium), and [UnpredicTable-rated-high](https://huggingface.co/datasets/MicPie/unpredictable_rated-high) data subsets to rate task quality. Detailed annotation instructions can be found in our publication.
#### Who are the annotators?
Annotations were carried out by a lab assistant.
### Personal and Sensitive Information
The data was extracted from [WDC Web Table Corpora](http://webdatacommons.org/webtables/), which in turn extracted tables from the [Common Crawl](https://commoncrawl.org/). We did not filter the data in any way. Thus any user identities or otherwise sensitive information (e.g., data that reveals racial or ethnic origins, sexual orientations, religious beliefs, political opinions or union memberships, or locations; financial or health data; biometric or genetic data; forms of government identification, such as social security numbers; criminal history, etc.) might be contained in our dataset.
## Considerations for Using the Data
### Social Impact of Dataset
This dataset is intended for use as a research resource to investigate the relationship between training data and few-shot learning. As such, it contains high- and low-quality data, as well as diverse content that may be untruthful or inappropriate. Without careful investigation, it should not be used for training models that will be deployed for use in decision-critical or user-facing situations.
### Discussion of Biases
Since our dataset contains tables that are scraped from the web, it will also contain many toxic, racist, sexist, and otherwise harmful biases and texts. We have not run any analysis on the biases prevalent in our datasets. Neither have we explicitly filtered the content. This implies that a model trained on our dataset may potentially reflect harmful biases and toxic text that exist in our dataset.
### Other Known Limitations
No additional known limitations.
## Additional Information
### Dataset Curators
Jun Shern Chan, Michael Pieler, Jonathan Jao, Jérémy Scheurer, Ethan Perez
### Licensing Information
Apache 2.0
### Citation Information
```
@misc{chan2022few,
author = {Chan, Jun Shern and Pieler, Michael and Jao, Jonathan and Scheurer, Jérémy and Perez, Ethan},
title = {Few-shot Adaptation Works with UnpredicTable Data},
publisher={arXiv},
year = {2022},
url = {https://arxiv.org/abs/2208.01009}
}
```
|
liuyanchen1015/MULTI_VALUE_cola_participle_past_tense | ---
dataset_info:
features:
- name: sentence
dtype: string
- name: label
dtype: int64
- name: idx
dtype: int64
- name: value_score
dtype: int64
splits:
- name: dev
num_bytes: 7457
num_examples: 110
- name: test
num_bytes: 6825
num_examples: 102
- name: train
num_bytes: 63782
num_examples: 915
download_size: 40480
dataset_size: 78064
---
# Dataset Card for "MULTI_VALUE_cola_participle_past_tense"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
micsell/mixed_he_en_processed | ---
dataset_info:
features:
- name: input_features
sequence:
sequence: float32
- name: input_length
dtype: int64
- name: labels
sequence: int64
splits:
- name: train
num_bytes: 15371503808
num_examples: 16000
- name: test
num_bytes: 3842860400
num_examples: 4000
download_size: 3049600172
dataset_size: 19214364208
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
---
|
bigscience-catalogue-data/lm_fr_wikihow_human_instructions | Invalid username or password. |
htdung167/vivos-preprocessed-v2 | ---
dataset_info:
features:
- name: path
dtype: string
- name: audio
dtype:
audio:
sampling_rate: 16000
- name: original_sentence
dtype: string
- name: preprocessed_sentence
dtype: string
- name: preprocessed_sentence_v2
dtype: string
splits:
- name: train
num_bytes: 1723109394.5
num_examples: 11660
- name: test
num_bytes: 86165311.0
num_examples: 760
download_size: 1773442225
dataset_size: 1809274705.5
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
---
|
vigneshgs7/Boundary_detection_Doc | ---
dataset_info:
features:
- name: name
dtype: string
- name: uuid
dtype: string
- name: status
dtype: string
- name: image
dtype: image
- name: label.annotations
list:
- name: id
dtype: int32
- name: category_id
dtype: int32
- name: label.segmentation_bitmap
dtype: image
splits:
- name: train
num_bytes: 4375553760.0
num_examples: 88
download_size: 286343850
dataset_size: 4375553760.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
YassineBenlaria/en_ar_transliterated | ---
dataset_info:
features:
- name: path
dtype: string
- name: audio
dtype:
audio:
sampling_rate: 48000
- name: sentence_lat
dtype: string
- name: sentence
dtype: string
splits:
- name: train
num_bytes: 299803083.0
num_examples: 500
download_size: 262146455
dataset_size: 299803083.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "en_ar_transliterated"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Mitsua/vroid-image-dataset-lite | ---
license: openrail++
task_categories:
- text-to-image
language:
- en
- ja
size_categories:
- 1K<n<10K
---
# VRoid Image Dataset Lite
This is a dataset for training text-to-image or other models without copyright issues.
All materials used in this dataset are CC0 or properly licensed.
This dataset was also used to train [Mitsua Diffusion One](https://huggingface.co/Mitsua/mitsua-diffusion-one), a latent text-to-image diffusion model whose VAE and U-Net are trained from scratch using only public domain/CC0 images or copyrighted images with permission for use.
Various parameters, such as camera angle, pose, skin color, and facial expression, were randomized for each rendered image.
## Dataset License
[Creative Open-Rail++-M License](https://huggingface.co/stabilityai/stable-diffusion-2/blob/main/LICENSE-MODEL)
This model is open access and available to all, with a CreativeML OpenRAIL++-M license further specifying rights and usage. The CreativeML OpenRAIL++-M License specifies:
1. You can't use the model to deliberately produce nor share illegal or harmful outputs or content
2. The authors claim no rights on the outputs you generate; you are free to use them and are accountable for their use, which must not go against the provisions set in the license
3. You may re-distribute the weights and use the model commercially and/or as a service. If you do, please be aware you have to include the same use restrictions as the ones in the license and share a copy of the CreativeML OpenRAIL++-M to all your users (please read the license entirely and carefully) [Please read the full license here](https://huggingface.co/stabilityai/stable-diffusion-2/blob/main/LICENSE-MODEL)
## Materials used in this dataset and their licenses
### VRoid Models
- VRM models used in this dataset are all CC0.
- These models are made by VRoid Project
- [HairSample_Male](https://vroid.pixiv.help/hc/en-us/articles/4402614652569-Do-VRoid-Studio-s-sample-models-come-with-conditions-of-use-)
- [HairSample_Female](https://vroid.pixiv.help/hc/en-us/articles/4402614652569-Do-VRoid-Studio-s-sample-models-come-with-conditions-of-use-)
- [AvatarSample-D](https://vroid.pixiv.help/hc/en-us/articles/360012381793-AvatarSample-D)
- [AvatarSample-E](https://vroid.pixiv.help/hc/en-us/articles/360014900273-AvatarSample-E)
- [AvatarSample-F](https://vroid.pixiv.help/hc/en-us/articles/360014900113-AvatarSample-F)
- [AvatarSample-G](https://vroid.pixiv.help/hc/en-us/articles/360014900233-AvatarSample-G)
- [Sakurada Fumiriya](https://vroid.pixiv.help/hc/en-us/articles/360014788554-Sakurada-Fumiriya)
- [Sendagaya Shino](https://vroid.pixiv.help/hc/en-us/articles/360013482714-Sendagaya-Shino)
- These models are made by pastelskies
- [015](https://hub.vroid.com/characters/1636202188966335207/models/6893459099891579554)
- [009](https://hub.vroid.com/characters/2472286065213980612/models/9151142999439416702)
- [008](https://hub.vroid.com/characters/601931587119584437/models/3857812504036458003)
- These models are made by yomox9
- [Qi](https://hub.vroid.com/characters/2048759159111415425/models/6905433332368675090)
- These models are made by くつした
- [【CC0】オリジナルアバター「少女A」【Cluster想定】](https://hub.vroid.com/characters/5271108759876567944/models/9069514665234246177)
- These models are made by ろーてく
- [【CC0】オリジナルアバター「シャペル」【VRChat想定】](https://lowteq.booth.pm/items/1349366)
### Pose and motions
- Our original poses.
- Free edition pose subset in [Unity Humanoid AnimationClip - PoseCollection](https://necocoya.booth.pm/items/1634088) made by かんな久@ねここや様 (❗❗**NOT CC0**❗❗)
- We have obtained permission directly from the author for training or distributing the AI model.
- This dataset uses only a subset of the "Free edition (ポーズ詰め合わせ(無料版)in Japanese)", which is allowed to use for AI training.
- We have confirmed directly from the author that an exact equivalent license is not necessarily needed to distribute the trained model or to generate images.
- Therefore, to avoid harmful content generation, the Creative Open Rail++-M license is applied to this dataset, and an equivalent or more restrictive license must be applied to its derivatives.
### Shader
- MToon (MIT) with some modification by dev team.
### Other Textures for Skybox / Ground
- [Poly Haven](https://polyhaven.com/) (CC0)
- [ambientCG](https://ambientcg.com/) (CC0)
## Metadata Description
The final caption is not provided in this dataset, but you can create a complete caption from the metadata.
### Color Shifting
Color shift is used to create more diverse images. It is applied to skin/hair/eye/cloth/accessories independently.
- Parameter xyz = (H_Shift, S_Factor, V_Factor)
- New Color HSV = (H + H_Shift, S * S_Factor, V * V_Factor)
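As a rough sketch, the shift above can be implemented with the standard `colorsys` module. This is illustrative only; the dataset's actual rendering pipeline may differ (e.g., in clamping behavior), and the clamping to [0, 1] here is an assumption.

```python
import colorsys

def shift_color(rgb, h_shift, s_factor, v_factor):
    """Apply the HSV shift described above to an RGB color in [0, 1].

    h_shift is in degrees; S and V are scaled and clamped to [0, 1].
    """
    h, s, v = colorsys.rgb_to_hsv(*rgb)
    h = (h + h_shift / 360.0) % 1.0           # hue wraps around
    s = min(max(s * s_factor, 0.0), 1.0)      # clamp saturation
    v = min(max(v * v_factor, 0.0), 1.0)      # clamp value
    return colorsys.hsv_to_rgb(h, s, v)

# Example: shifting pure red by 120 degrees yields pure green.
r, g, b = shift_color((1.0, 0.0, 0.0), h_shift=120, s_factor=1.0, v_factor=1.0)
```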
### Metadata Items
- vrm_name : VRoid model name
- clip_name : Pose Clip Number
- camera_profile
- facial_expression
- lighting
- lighting_color
- outline
- shade_toony
- skin_profile
- looking_label
- camera_position : 3D position in meter
- camera_rotation : Pitch/Yaw/Roll in degree
- camera_fov : in degree
- hair_color_shift : HSV color shift of hair
- eye_color_shift : HSV color shift of eye
- color_shift : HSV color shift of cloth and accessories
- ground_plane_material
- left_hand_sign
- right_hand_sign
- skybox
## Full Dataset
This is a subset of the full dataset, which consists of approx. 600k images.
The full dataset is available upon request, for non-commercial research purposes only.
You will need to provide 1 TB of online storage so that we can upload the dataset, or send us an empty 1 TB physical hard drive to our office in Tokyo, Japan.
Contact : info [at] elanmitsua.com
## Developed by
- Abstract Engine dev team
- Special Thanks to Mitsua Contributors
- VRoid is a trademark or registered trademark of Pixiv inc. in Japan and other regions. |
ibranze/araproje_hellaswag_en_f5 | ---
dataset_info:
features:
- name: ind
dtype: int32
- name: activity_label
dtype: string
- name: ctx_a
dtype: string
- name: ctx_b
dtype: string
- name: ctx
dtype: string
- name: endings
sequence: string
- name: source_id
dtype: string
- name: split
dtype: string
- name: split_type
dtype: string
- name: label
dtype: string
splits:
- name: validation
num_bytes: 149738.0
num_examples: 250
download_size: 0
dataset_size: 149738.0
configs:
- config_name: default
data_files:
- split: validation
path: data/validation-*
---
# Dataset Card for "araproje_hellaswag_en_f5"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
MLCommons/peoples_speech | ---
annotations_creators:
- crowdsourced
- machine-generated
language_creators:
- crowdsourced
- machine-generated
language:
- en
license:
- cc-by-2.0
- cc-by-2.5
- cc-by-3.0
- cc-by-4.0
- cc-by-sa-3.0
- cc-by-sa-4.0
multilinguality:
- monolingual
size_categories:
- 1T<n
source_datasets:
- original
task_categories:
- automatic-speech-recognition
task_ids: []
pretty_name: People's Speech
tags:
- robust-speech-recognition
- noisy-speech-recognition
- speech-recognition
---
# Dataset Card for People's Speech
## Table of Contents
- [Dataset Description](#dataset-description)
- [Dataset Summary](#dataset-summary)
- [Supported Tasks](#supported-tasks-and-leaderboards)
- [Languages](#languages)
- [Dataset Structure](#dataset-structure)
- [Data Instances](#data-instances)
- [Data Fields](#data-fields)
- [Data Splits](#data-splits)
- [Dataset Creation](#dataset-creation)
- [Curation Rationale](#curation-rationale)
- [Source Data](#source-data)
- [Annotations](#annotations)
- [Personal and Sensitive Information](#personal-and-sensitive-information)
- [Considerations for Using the Data](#considerations-for-using-the-data)
- [Social Impact of Dataset](#social-impact-of-dataset)
- [Discussion of Biases](#discussion-of-biases)
- [Other Known Limitations](#other-known-limitations)
- [Additional Information](#additional-information)
- [Dataset Curators](#dataset-curators)
- [Licensing Information](#licensing-information)
- [Citation Information](#citation-information)
## Dataset Description
- **Homepage:** https://mlcommons.org/en/peoples-speech/
- **Repository:** https://github.com/mlcommons/peoples-speech
- **Paper:** https://arxiv.org/abs/2111.09344
- **Leaderboard:** [Needs More Information]
- **Point of Contact:** [datasets@mlcommons.org](mailto:datasets@mlcommons.org)
### Dataset Summary
The People's Speech Dataset is among the world's largest English speech recognition corpora licensed for academic and commercial usage under CC-BY-SA and CC-BY 4.0. It includes 30,000+ hours of transcribed English speech from a diverse set of speakers. This open dataset is large enough to train speech-to-text systems and, crucially, is available with a permissive license.
### Supported Tasks and Leaderboards
[Needs More Information]
### Languages
English
## Dataset Structure
### Data Instances
```
{
  "id": "gov_DOT_uscourts_DOT_scotus_DOT_19-161/gov_DOT_uscourts_DOT_scotus_DOT_19-161_DOT_2020-03-02_DOT_mp3_00002.flac",
  "audio": {
    "path": "gov_DOT_uscourts_DOT_scotus_DOT_19-161/gov_DOT_uscourts_DOT_scotus_DOT_19-161_DOT_2020-03-02_DOT_mp3_00002.flac",
    "array": array([-6.10351562e-05, ...]),
    "sampling_rate": 16000
  },
  "duration_ms": 14490,
  "text": "contends that the suspension clause requires a [...]"
}
```
### Data Fields
```python
{
    "id": datasets.Value("string"),
    "audio": datasets.Audio(sampling_rate=16_000),
    "duration_ms": datasets.Value("int32"),
    "text": datasets.Value("string"),
}
```
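As a sanity check on these fields, the length of a decoded `audio["array"]` should roughly match `duration_ms` at the stated 16 kHz sampling rate. A small illustrative sketch (the helper name is ours, not part of the dataset API):

```python
def expected_num_samples(duration_ms, sampling_rate=16_000):
    """Approximate length of the decoded audio array for an instance."""
    return round(duration_ms / 1000 * sampling_rate)

# For the example instance above (duration_ms = 14490 at 16 kHz):
n = expected_num_samples(14_490)
print(n)  # number of samples expected in audio["array"]
```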
### Data Splits
We provide the following configurations for the dataset: `cc-by-clean`, `cc-by-dirty`, `cc-by-sa-clean`, `cc-by-sa-dirty`, and `microset`. We don't provide splits for any of the configurations.
## Dataset Creation
### Curation Rationale
See our [paper](https://arxiv.org/abs/2111.09344).
### Source Data
#### Initial Data Collection and Normalization
Data was downloaded via the archive.org API. No data inference was done.
#### Who are the source language producers?
[Needs More Information]
### Annotations
#### Annotation process
No manual annotation is done. We download only source audio with already existing transcripts.
#### Who are the annotators?
For the test and dev sets, we paid native speakers of American English to do transcriptions. We do not know the identities of the transcriptionists for data in the training set. For the training set, we have noticed that some transcriptions are likely to be the output of automatic speech recognition systems.
### Personal and Sensitive Information
Several of our sources are legal and government proceedings, spoken histories, speeches, and so on. Given that these were intended as public documents and licensed as such, it is natural that the involved individuals are aware of this.
## Considerations for Using the Data
### Social Impact of Dataset
The dataset could be used for speech synthesis. However, this requires careful cleaning of the dataset, as background noise is not tolerable for speech synthesis.
The dataset could be used for keyword spotting tasks as well. In particular, this is a good use case for the non-English audio in the dataset.
Our sincere hope is that the large breadth of sources our dataset incorporates reduces existing quality-of-service issues, like speech recognition systems' poor understanding of non-native English accents. We cannot think of any unfair treatment that could come from using this dataset at this time.
### Discussion of Biases
Our data is downloaded from archive.org. As such, the data is biased towards whatever users decide to upload there.
Almost all of our data is American accented English.
### Other Known Limitations
As of version 1.0, a portion of the data in the training, test, and dev sets is poorly aligned: some words appear in the transcript but not in the audio, and some appear in the audio but not in the transcript. We are working on improving these alignments.
## Additional Information
### Dataset Curators
[Needs More Information]
### Licensing Information
We provide CC-BY and CC-BY-SA subsets of the dataset.
### Citation Information
Please cite:
```
@article{DBLP:journals/corr/abs-2111-09344,
author = {Daniel Galvez and
Greg Diamos and
Juan Ciro and
Juan Felipe Cer{\'{o}}n and
Keith Achorn and
Anjali Gopi and
David Kanter and
Maximilian Lam and
Mark Mazumder and
Vijay Janapa Reddi},
title = {The People's Speech: {A} Large-Scale Diverse English Speech Recognition
Dataset for Commercial Usage},
journal = {CoRR},
volume = {abs/2111.09344},
year = {2021},
url = {https://arxiv.org/abs/2111.09344},
eprinttype = {arXiv},
eprint = {2111.09344},
timestamp = {Mon, 22 Nov 2021 16:44:07 +0100},
biburl = {https://dblp.org/rec/journals/corr/abs-2111-09344.bib},
bibsource = {dblp computer science bibliography, https://dblp.org}
}
``` |
P-I2/RESULTS_LLAMA13b_SPV_MIA_Wikitext_eval | ---
dataset_info:
features:
- name: text
dtype: string
- name: label
dtype: string
- name: p_theta_bar_x
dtype: float64
- name: p_phi_bar_x
dtype: float64
- name: change_in_p
dtype: float64
- name: is_member
dtype: bool
splits:
- name: train
num_bytes: 539391
num_examples: 1000
download_size: 369450
dataset_size: 539391
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
TeeA/Pokemon-Captioning-Classification | ---
dataset_info:
features:
- name: image_file_path
dtype: string
- name: image
dtype: image
- name: labels
dtype:
class_label:
names:
'0': Porygon
'1': Goldeen
'2': Hitmonlee
'3': Hitmonchan
'4': Gloom
'5': Aerodactyl
'6': Mankey
'7': Seadra
'8': Gengar
'9': Venonat
'10': Articuno
'11': Seaking
'12': Dugtrio
'13': Machop
'14': Jynx
'15': Oddish
'16': Dodrio
'17': Dragonair
'18': Weedle
'19': Golduck
'20': Flareon
'21': Krabby
'22': Parasect
'23': Ninetales
'24': Nidoqueen
'25': Kabutops
'26': Drowzee
'27': Caterpie
'28': Jigglypuff
'29': Machamp
'30': Clefairy
'31': Kangaskhan
'32': Dragonite
'33': Weepinbell
'34': Fearow
'35': Bellsprout
'36': Grimer
'37': Nidorina
'38': Staryu
'39': Horsea
'40': Electabuzz
'41': Dratini
'42': Machoke
'43': Magnemite
'44': Squirtle
'45': Gyarados
'46': Pidgeot
'47': Bulbasaur
'48': Nidoking
'49': Golem
'50': Dewgong
'51': Moltres
'52': Zapdos
'53': Poliwrath
'54': Vulpix
'55': Beedrill
'56': Charmander
'57': Abra
'58': Zubat
'59': Golbat
'60': Wigglytuff
'61': Charizard
'62': Slowpoke
'63': Poliwag
'64': Tentacruel
'65': Rhyhorn
'66': Onix
'67': Butterfree
'68': Exeggcute
'69': Sandslash
'70': Pinsir
'71': Rattata
'72': Growlithe
'73': Haunter
'74': Pidgey
'75': Ditto
'76': Farfetchd
'77': Pikachu
'78': Raticate
'79': Wartortle
'80': Vaporeon
'81': Cloyster
'82': Hypno
'83': Arbok
'84': Metapod
'85': Tangela
'86': Kingler
'87': Exeggutor
'88': Kadabra
'89': Seel
'90': Voltorb
'91': Chansey
'92': Venomoth
'93': Ponyta
'94': Vileplume
'95': Koffing
'96': Blastoise
'97': Tentacool
'98': Lickitung
'99': Paras
'100': Clefable
'101': Cubone
'102': Marowak
'103': Nidorino
'104': Jolteon
'105': Muk
'106': Magikarp
'107': Slowbro
'108': Tauros
'109': Kabuto
'110': Spearow
'111': Sandshrew
'112': Eevee
'113': Kakuna
'114': Omastar
'115': Ekans
'116': Geodude
'117': Magmar
'118': Snorlax
'119': Meowth
'120': Pidgeotto
'121': Venusaur
'122': Persian
'123': Rhydon
'124': Starmie
'125': Charmeleon
'126': Lapras
'127': Alakazam
'128': Graveler
'129': Psyduck
'130': Rapidash
'131': Doduo
'132': Magneton
'133': Arcanine
'134': Electrode
'135': Omanyte
'136': Poliwhirl
'137': Mew
'138': Alolan Sandslash
'139': Mewtwo
'140': Weezing
'141': Gastly
'142': Victreebel
'143': Ivysaur
'144': MrMime
'145': Shellder
'146': Scyther
'147': Diglett
'148': Primeape
'149': Raichu
- name: caption
dtype: string
splits:
- name: train
num_bytes: 48180458.375
num_examples: 4869
- name: validation
num_bytes: 14034552.25
num_examples: 1390
- name: test
num_bytes: 7398994.0
num_examples: 732
download_size: 66724969
dataset_size: 69614004.625
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
- split: test
path: data/test-*
---
|
open-llm-leaderboard/details_l3utterfly__mistral-7b-v0.1-layla-v3 | ---
pretty_name: Evaluation run of l3utterfly/mistral-7b-v0.1-layla-v3
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [l3utterfly/mistral-7b-v0.1-layla-v3](https://huggingface.co/l3utterfly/mistral-7b-v0.1-layla-v3)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_l3utterfly__mistral-7b-v0.1-layla-v3\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-02-24T14:34:21.726636](https://huggingface.co/datasets/open-llm-leaderboard/details_l3utterfly__mistral-7b-v0.1-layla-v3/blob/main/results_2024-02-24T14-34-21.726636.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6426216932029092,\n\
\ \"acc_stderr\": 0.03225359388063724,\n \"acc_norm\": 0.6454578910530434,\n\
\ \"acc_norm_stderr\": 0.0329013445844273,\n \"mc1\": 0.29008567931456547,\n\
\ \"mc1_stderr\": 0.01588623687420952,\n \"mc2\": 0.4319642043238644,\n\
\ \"mc2_stderr\": 0.014312295538866832\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.5827645051194539,\n \"acc_stderr\": 0.01440982551840308,\n\
\ \"acc_norm\": 0.6177474402730375,\n \"acc_norm_stderr\": 0.014200454049979275\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.629555865365465,\n\
\ \"acc_stderr\": 0.004819367172685961,\n \"acc_norm\": 0.8340967934674368,\n\
\ \"acc_norm_stderr\": 0.003712334763856881\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.32,\n \"acc_stderr\": 0.046882617226215034,\n \
\ \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.046882617226215034\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6222222222222222,\n\
\ \"acc_stderr\": 0.04188307537595852,\n \"acc_norm\": 0.6222222222222222,\n\
\ \"acc_norm_stderr\": 0.04188307537595852\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.6973684210526315,\n \"acc_stderr\": 0.03738520676119668,\n\
\ \"acc_norm\": 0.6973684210526315,\n \"acc_norm_stderr\": 0.03738520676119668\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.61,\n\
\ \"acc_stderr\": 0.04902071300001975,\n \"acc_norm\": 0.61,\n \
\ \"acc_norm_stderr\": 0.04902071300001975\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.6981132075471698,\n \"acc_stderr\": 0.02825420034443866,\n\
\ \"acc_norm\": 0.6981132075471698,\n \"acc_norm_stderr\": 0.02825420034443866\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7430555555555556,\n\
\ \"acc_stderr\": 0.03653946969442099,\n \"acc_norm\": 0.7430555555555556,\n\
\ \"acc_norm_stderr\": 0.03653946969442099\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.48,\n \"acc_stderr\": 0.05021167315686779,\n \
\ \"acc_norm\": 0.48,\n \"acc_norm_stderr\": 0.05021167315686779\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.53,\n \"acc_stderr\": 0.05016135580465919,\n \"acc_norm\": 0.53,\n\
\ \"acc_norm_stderr\": 0.05016135580465919\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.38,\n \"acc_stderr\": 0.04878317312145633,\n \
\ \"acc_norm\": 0.38,\n \"acc_norm_stderr\": 0.04878317312145633\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6416184971098265,\n\
\ \"acc_stderr\": 0.036563436533531585,\n \"acc_norm\": 0.6416184971098265,\n\
\ \"acc_norm_stderr\": 0.036563436533531585\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.3627450980392157,\n \"acc_stderr\": 0.04784060704105652,\n\
\ \"acc_norm\": 0.3627450980392157,\n \"acc_norm_stderr\": 0.04784060704105652\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.78,\n \"acc_stderr\": 0.04163331998932261,\n \"acc_norm\": 0.78,\n\
\ \"acc_norm_stderr\": 0.04163331998932261\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.5829787234042553,\n \"acc_stderr\": 0.03223276266711712,\n\
\ \"acc_norm\": 0.5829787234042553,\n \"acc_norm_stderr\": 0.03223276266711712\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.4473684210526316,\n\
\ \"acc_stderr\": 0.04677473004491199,\n \"acc_norm\": 0.4473684210526316,\n\
\ \"acc_norm_stderr\": 0.04677473004491199\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5517241379310345,\n \"acc_stderr\": 0.04144311810878152,\n\
\ \"acc_norm\": 0.5517241379310345,\n \"acc_norm_stderr\": 0.04144311810878152\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.38095238095238093,\n \"acc_stderr\": 0.025010749116137602,\n \"\
acc_norm\": 0.38095238095238093,\n \"acc_norm_stderr\": 0.025010749116137602\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.47619047619047616,\n\
\ \"acc_stderr\": 0.04467062628403273,\n \"acc_norm\": 0.47619047619047616,\n\
\ \"acc_norm_stderr\": 0.04467062628403273\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.37,\n \"acc_stderr\": 0.04852365870939099,\n \
\ \"acc_norm\": 0.37,\n \"acc_norm_stderr\": 0.04852365870939099\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7580645161290323,\n\
\ \"acc_stderr\": 0.024362599693031096,\n \"acc_norm\": 0.7580645161290323,\n\
\ \"acc_norm_stderr\": 0.024362599693031096\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.5123152709359606,\n \"acc_stderr\": 0.035169204442208966,\n\
\ \"acc_norm\": 0.5123152709359606,\n \"acc_norm_stderr\": 0.035169204442208966\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.67,\n \"acc_stderr\": 0.04725815626252607,\n \"acc_norm\"\
: 0.67,\n \"acc_norm_stderr\": 0.04725815626252607\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7696969696969697,\n \"acc_stderr\": 0.0328766675860349,\n\
\ \"acc_norm\": 0.7696969696969697,\n \"acc_norm_stderr\": 0.0328766675860349\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.7929292929292929,\n \"acc_stderr\": 0.02886977846026705,\n \"\
acc_norm\": 0.7929292929292929,\n \"acc_norm_stderr\": 0.02886977846026705\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.8756476683937824,\n \"acc_stderr\": 0.02381447708659355,\n\
\ \"acc_norm\": 0.8756476683937824,\n \"acc_norm_stderr\": 0.02381447708659355\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.6333333333333333,\n \"acc_stderr\": 0.02443301646605246,\n \
\ \"acc_norm\": 0.6333333333333333,\n \"acc_norm_stderr\": 0.02443301646605246\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.3888888888888889,\n \"acc_stderr\": 0.029723278961476664,\n \
\ \"acc_norm\": 0.3888888888888889,\n \"acc_norm_stderr\": 0.029723278961476664\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.680672268907563,\n \"acc_stderr\": 0.0302839955258844,\n \
\ \"acc_norm\": 0.680672268907563,\n \"acc_norm_stderr\": 0.0302839955258844\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.32450331125827814,\n \"acc_stderr\": 0.038227469376587525,\n \"\
acc_norm\": 0.32450331125827814,\n \"acc_norm_stderr\": 0.038227469376587525\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8275229357798165,\n \"acc_stderr\": 0.016197807956848047,\n \"\
acc_norm\": 0.8275229357798165,\n \"acc_norm_stderr\": 0.016197807956848047\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.5231481481481481,\n \"acc_stderr\": 0.034063153607115086,\n \"\
acc_norm\": 0.5231481481481481,\n \"acc_norm_stderr\": 0.034063153607115086\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.8284313725490197,\n \"acc_stderr\": 0.02646056956124064,\n \"\
acc_norm\": 0.8284313725490197,\n \"acc_norm_stderr\": 0.02646056956124064\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.7763713080168776,\n \"acc_stderr\": 0.027123298205229966,\n \
\ \"acc_norm\": 0.7763713080168776,\n \"acc_norm_stderr\": 0.027123298205229966\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6771300448430493,\n\
\ \"acc_stderr\": 0.03138147637575499,\n \"acc_norm\": 0.6771300448430493,\n\
\ \"acc_norm_stderr\": 0.03138147637575499\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.7862595419847328,\n \"acc_stderr\": 0.0359546161177469,\n\
\ \"acc_norm\": 0.7862595419847328,\n \"acc_norm_stderr\": 0.0359546161177469\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.8099173553719008,\n \"acc_stderr\": 0.03581796951709282,\n \"\
acc_norm\": 0.8099173553719008,\n \"acc_norm_stderr\": 0.03581796951709282\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7777777777777778,\n\
\ \"acc_stderr\": 0.0401910747255735,\n \"acc_norm\": 0.7777777777777778,\n\
\ \"acc_norm_stderr\": 0.0401910747255735\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7852760736196319,\n \"acc_stderr\": 0.03226219377286775,\n\
\ \"acc_norm\": 0.7852760736196319,\n \"acc_norm_stderr\": 0.03226219377286775\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.5178571428571429,\n\
\ \"acc_stderr\": 0.047427623612430116,\n \"acc_norm\": 0.5178571428571429,\n\
\ \"acc_norm_stderr\": 0.047427623612430116\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.8155339805825242,\n \"acc_stderr\": 0.03840423627288276,\n\
\ \"acc_norm\": 0.8155339805825242,\n \"acc_norm_stderr\": 0.03840423627288276\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8632478632478633,\n\
\ \"acc_stderr\": 0.022509033937077805,\n \"acc_norm\": 0.8632478632478633,\n\
\ \"acc_norm_stderr\": 0.022509033937077805\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.75,\n \"acc_stderr\": 0.04351941398892446,\n \
\ \"acc_norm\": 0.75,\n \"acc_norm_stderr\": 0.04351941398892446\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8173690932311622,\n\
\ \"acc_stderr\": 0.013816335389973143,\n \"acc_norm\": 0.8173690932311622,\n\
\ \"acc_norm_stderr\": 0.013816335389973143\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.708092485549133,\n \"acc_stderr\": 0.024476994076247326,\n\
\ \"acc_norm\": 0.708092485549133,\n \"acc_norm_stderr\": 0.024476994076247326\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.36089385474860336,\n\
\ \"acc_stderr\": 0.016062290671110466,\n \"acc_norm\": 0.36089385474860336,\n\
\ \"acc_norm_stderr\": 0.016062290671110466\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.7549019607843137,\n \"acc_stderr\": 0.02463004897982478,\n\
\ \"acc_norm\": 0.7549019607843137,\n \"acc_norm_stderr\": 0.02463004897982478\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7041800643086816,\n\
\ \"acc_stderr\": 0.02592237178881877,\n \"acc_norm\": 0.7041800643086816,\n\
\ \"acc_norm_stderr\": 0.02592237178881877\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.7098765432098766,\n \"acc_stderr\": 0.025251173936495026,\n\
\ \"acc_norm\": 0.7098765432098766,\n \"acc_norm_stderr\": 0.025251173936495026\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.475177304964539,\n \"acc_stderr\": 0.02979071924382972,\n \
\ \"acc_norm\": 0.475177304964539,\n \"acc_norm_stderr\": 0.02979071924382972\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.45371577574967403,\n\
\ \"acc_stderr\": 0.01271540484127774,\n \"acc_norm\": 0.45371577574967403,\n\
\ \"acc_norm_stderr\": 0.01271540484127774\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.6875,\n \"acc_stderr\": 0.02815637344037142,\n \
\ \"acc_norm\": 0.6875,\n \"acc_norm_stderr\": 0.02815637344037142\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6830065359477124,\n \"acc_stderr\": 0.01882421951270621,\n \
\ \"acc_norm\": 0.6830065359477124,\n \"acc_norm_stderr\": 0.01882421951270621\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6545454545454545,\n\
\ \"acc_stderr\": 0.04554619617541054,\n \"acc_norm\": 0.6545454545454545,\n\
\ \"acc_norm_stderr\": 0.04554619617541054\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.7387755102040816,\n \"acc_stderr\": 0.028123429335142783,\n\
\ \"acc_norm\": 0.7387755102040816,\n \"acc_norm_stderr\": 0.028123429335142783\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8606965174129353,\n\
\ \"acc_stderr\": 0.024484487162913973,\n \"acc_norm\": 0.8606965174129353,\n\
\ \"acc_norm_stderr\": 0.024484487162913973\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.83,\n \"acc_stderr\": 0.0377525168068637,\n \
\ \"acc_norm\": 0.83,\n \"acc_norm_stderr\": 0.0377525168068637\n },\n\
\ \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5421686746987951,\n\
\ \"acc_stderr\": 0.0387862677100236,\n \"acc_norm\": 0.5421686746987951,\n\
\ \"acc_norm_stderr\": 0.0387862677100236\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8304093567251462,\n \"acc_stderr\": 0.02878210810540171,\n\
\ \"acc_norm\": 0.8304093567251462,\n \"acc_norm_stderr\": 0.02878210810540171\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.29008567931456547,\n\
\ \"mc1_stderr\": 0.01588623687420952,\n \"mc2\": 0.4319642043238644,\n\
\ \"mc2_stderr\": 0.014312295538866832\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7971586424625099,\n \"acc_stderr\": 0.011301439925936654\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.5602729340409401,\n \
\ \"acc_stderr\": 0.013672052434471576\n }\n}\n```"
repo_url: https://huggingface.co/l3utterfly/mistral-7b-v0.1-layla-v3
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_02_24T14_34_21.726636
path:
- '**/details_harness|arc:challenge|25_2024-02-24T14-34-21.726636.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-02-24T14-34-21.726636.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_02_24T14_34_21.726636
path:
- '**/details_harness|gsm8k|5_2024-02-24T14-34-21.726636.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-02-24T14-34-21.726636.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_02_24T14_34_21.726636
path:
- '**/details_harness|hellaswag|10_2024-02-24T14-34-21.726636.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-02-24T14-34-21.726636.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_02_24T14_34_21.726636
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-24T14-34-21.726636.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-24T14-34-21.726636.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-24T14-34-21.726636.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-24T14-34-21.726636.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-24T14-34-21.726636.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-24T14-34-21.726636.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-24T14-34-21.726636.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-24T14-34-21.726636.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-24T14-34-21.726636.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-24T14-34-21.726636.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-24T14-34-21.726636.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-24T14-34-21.726636.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-24T14-34-21.726636.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-24T14-34-21.726636.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-24T14-34-21.726636.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-24T14-34-21.726636.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-24T14-34-21.726636.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-24T14-34-21.726636.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-24T14-34-21.726636.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-24T14-34-21.726636.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-24T14-34-21.726636.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-24T14-34-21.726636.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-24T14-34-21.726636.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-24T14-34-21.726636.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-24T14-34-21.726636.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-24T14-34-21.726636.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-24T14-34-21.726636.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-24T14-34-21.726636.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-24T14-34-21.726636.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-24T14-34-21.726636.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-24T14-34-21.726636.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-24T14-34-21.726636.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-24T14-34-21.726636.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-24T14-34-21.726636.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-02-24T14-34-21.726636.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-24T14-34-21.726636.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-24T14-34-21.726636.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-24T14-34-21.726636.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-02-24T14-34-21.726636.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-02-24T14-34-21.726636.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-24T14-34-21.726636.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-24T14-34-21.726636.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-24T14-34-21.726636.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-24T14-34-21.726636.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-24T14-34-21.726636.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-24T14-34-21.726636.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-24T14-34-21.726636.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-24T14-34-21.726636.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-24T14-34-21.726636.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-24T14-34-21.726636.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-24T14-34-21.726636.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-24T14-34-21.726636.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-24T14-34-21.726636.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-02-24T14-34-21.726636.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-24T14-34-21.726636.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-02-24T14-34-21.726636.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-24T14-34-21.726636.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-24T14-34-21.726636.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-24T14-34-21.726636.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-24T14-34-21.726636.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-24T14-34-21.726636.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-24T14-34-21.726636.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-24T14-34-21.726636.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-24T14-34-21.726636.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-24T14-34-21.726636.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-24T14-34-21.726636.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-24T14-34-21.726636.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-24T14-34-21.726636.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-24T14-34-21.726636.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-24T14-34-21.726636.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-24T14-34-21.726636.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-24T14-34-21.726636.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-24T14-34-21.726636.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-24T14-34-21.726636.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-24T14-34-21.726636.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-24T14-34-21.726636.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-24T14-34-21.726636.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-24T14-34-21.726636.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-24T14-34-21.726636.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-24T14-34-21.726636.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-24T14-34-21.726636.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-24T14-34-21.726636.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-24T14-34-21.726636.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-24T14-34-21.726636.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-24T14-34-21.726636.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-24T14-34-21.726636.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-24T14-34-21.726636.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-24T14-34-21.726636.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-24T14-34-21.726636.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-24T14-34-21.726636.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-24T14-34-21.726636.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-02-24T14-34-21.726636.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-24T14-34-21.726636.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-24T14-34-21.726636.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-24T14-34-21.726636.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-02-24T14-34-21.726636.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-02-24T14-34-21.726636.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-24T14-34-21.726636.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-24T14-34-21.726636.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-24T14-34-21.726636.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-24T14-34-21.726636.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-24T14-34-21.726636.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-24T14-34-21.726636.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-24T14-34-21.726636.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-24T14-34-21.726636.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-24T14-34-21.726636.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-24T14-34-21.726636.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-24T14-34-21.726636.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-24T14-34-21.726636.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-24T14-34-21.726636.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-02-24T14-34-21.726636.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-24T14-34-21.726636.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-02-24T14-34-21.726636.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-24T14-34-21.726636.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_02_24T14_34_21.726636
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-24T14-34-21.726636.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-24T14-34-21.726636.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_02_24T14_34_21.726636
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-24T14-34-21.726636.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-24T14-34-21.726636.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_02_24T14_34_21.726636
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-24T14-34-21.726636.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-24T14-34-21.726636.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_02_24T14_34_21.726636
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-24T14-34-21.726636.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-24T14-34-21.726636.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_02_24T14_34_21.726636
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-24T14-34-21.726636.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-24T14-34-21.726636.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_02_24T14_34_21.726636
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-24T14-34-21.726636.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-24T14-34-21.726636.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_02_24T14_34_21.726636
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-24T14-34-21.726636.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-24T14-34-21.726636.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_02_24T14_34_21.726636
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-24T14-34-21.726636.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-24T14-34-21.726636.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_02_24T14_34_21.726636
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-24T14-34-21.726636.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-24T14-34-21.726636.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_02_24T14_34_21.726636
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-24T14-34-21.726636.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-24T14-34-21.726636.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_02_24T14_34_21.726636
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-24T14-34-21.726636.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-24T14-34-21.726636.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_02_24T14_34_21.726636
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-24T14-34-21.726636.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-24T14-34-21.726636.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_02_24T14_34_21.726636
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-24T14-34-21.726636.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-24T14-34-21.726636.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_02_24T14_34_21.726636
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-24T14-34-21.726636.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-24T14-34-21.726636.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_02_24T14_34_21.726636
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-24T14-34-21.726636.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-24T14-34-21.726636.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_02_24T14_34_21.726636
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-24T14-34-21.726636.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-24T14-34-21.726636.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_02_24T14_34_21.726636
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-24T14-34-21.726636.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-24T14-34-21.726636.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_02_24T14_34_21.726636
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-24T14-34-21.726636.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-24T14-34-21.726636.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_02_24T14_34_21.726636
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-24T14-34-21.726636.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-24T14-34-21.726636.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_02_24T14_34_21.726636
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-24T14-34-21.726636.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-24T14-34-21.726636.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_02_24T14_34_21.726636
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-24T14-34-21.726636.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-24T14-34-21.726636.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_02_24T14_34_21.726636
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-24T14-34-21.726636.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-24T14-34-21.726636.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_02_24T14_34_21.726636
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-24T14-34-21.726636.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-24T14-34-21.726636.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_02_24T14_34_21.726636
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-24T14-34-21.726636.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-24T14-34-21.726636.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_02_24T14_34_21.726636
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-24T14-34-21.726636.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-24T14-34-21.726636.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_02_24T14_34_21.726636
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-24T14-34-21.726636.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-24T14-34-21.726636.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_02_24T14_34_21.726636
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-24T14-34-21.726636.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-24T14-34-21.726636.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_02_24T14_34_21.726636
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-24T14-34-21.726636.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-24T14-34-21.726636.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_02_24T14_34_21.726636
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-24T14-34-21.726636.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-24T14-34-21.726636.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_02_24T14_34_21.726636
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-24T14-34-21.726636.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-24T14-34-21.726636.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_02_24T14_34_21.726636
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-24T14-34-21.726636.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-24T14-34-21.726636.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_02_24T14_34_21.726636
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-24T14-34-21.726636.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-24T14-34-21.726636.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_02_24T14_34_21.726636
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-24T14-34-21.726636.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-24T14-34-21.726636.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_02_24T14_34_21.726636
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-24T14-34-21.726636.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-24T14-34-21.726636.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_02_24T14_34_21.726636
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-02-24T14-34-21.726636.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-02-24T14-34-21.726636.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_02_24T14_34_21.726636
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-24T14-34-21.726636.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-24T14-34-21.726636.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_02_24T14_34_21.726636
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-24T14-34-21.726636.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-24T14-34-21.726636.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_02_24T14_34_21.726636
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-24T14-34-21.726636.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-24T14-34-21.726636.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_02_24T14_34_21.726636
path:
- '**/details_harness|hendrycksTest-management|5_2024-02-24T14-34-21.726636.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-02-24T14-34-21.726636.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_02_24T14_34_21.726636
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-02-24T14-34-21.726636.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-02-24T14-34-21.726636.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_02_24T14_34_21.726636
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-24T14-34-21.726636.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-24T14-34-21.726636.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_02_24T14_34_21.726636
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-24T14-34-21.726636.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-24T14-34-21.726636.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_02_24T14_34_21.726636
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-24T14-34-21.726636.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-24T14-34-21.726636.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_02_24T14_34_21.726636
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-24T14-34-21.726636.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-24T14-34-21.726636.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_02_24T14_34_21.726636
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-24T14-34-21.726636.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-24T14-34-21.726636.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_02_24T14_34_21.726636
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-24T14-34-21.726636.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-24T14-34-21.726636.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_02_24T14_34_21.726636
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-24T14-34-21.726636.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-24T14-34-21.726636.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_02_24T14_34_21.726636
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-24T14-34-21.726636.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-24T14-34-21.726636.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_02_24T14_34_21.726636
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-24T14-34-21.726636.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-24T14-34-21.726636.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_02_24T14_34_21.726636
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-24T14-34-21.726636.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-24T14-34-21.726636.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_02_24T14_34_21.726636
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-24T14-34-21.726636.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-24T14-34-21.726636.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_02_24T14_34_21.726636
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-24T14-34-21.726636.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-24T14-34-21.726636.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_02_24T14_34_21.726636
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-24T14-34-21.726636.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-24T14-34-21.726636.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_02_24T14_34_21.726636
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-02-24T14-34-21.726636.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-02-24T14-34-21.726636.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_02_24T14_34_21.726636
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-24T14-34-21.726636.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-24T14-34-21.726636.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_02_24T14_34_21.726636
path:
- '**/details_harness|hendrycksTest-virology|5_2024-02-24T14-34-21.726636.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-02-24T14-34-21.726636.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_02_24T14_34_21.726636
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-24T14-34-21.726636.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-24T14-34-21.726636.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_02_24T14_34_21.726636
path:
- '**/details_harness|truthfulqa:mc|0_2024-02-24T14-34-21.726636.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-02-24T14-34-21.726636.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_02_24T14_34_21.726636
path:
- '**/details_harness|winogrande|5_2024-02-24T14-34-21.726636.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-02-24T14-34-21.726636.parquet'
- config_name: results
data_files:
- split: 2024_02_24T14_34_21.726636
path:
- results_2024-02-24T14-34-21.726636.parquet
- split: latest
path:
- results_2024-02-24T14-34-21.726636.parquet
---
# Dataset Card for Evaluation run of l3utterfly/mistral-7b-v0.1-layla-v3
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [l3utterfly/mistral-7b-v0.1-layla-v3](https://huggingface.co/l3utterfly/mistral-7b-v0.1-layla-v3) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_l3utterfly__mistral-7b-v0.1-layla-v3",
"harness_winogrande_5",
	split="latest")
```
## Latest results
These are the [latest results from run 2024-02-24T14:34:21.726636](https://huggingface.co/datasets/open-llm-leaderboard/details_l3utterfly__mistral-7b-v0.1-layla-v3/blob/main/results_2024-02-24T14-34-21.726636.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each one in its per-task configuration, under the "latest" split):
```python
{
"all": {
"acc": 0.6426216932029092,
"acc_stderr": 0.03225359388063724,
"acc_norm": 0.6454578910530434,
"acc_norm_stderr": 0.0329013445844273,
"mc1": 0.29008567931456547,
"mc1_stderr": 0.01588623687420952,
"mc2": 0.4319642043238644,
"mc2_stderr": 0.014312295538866832
},
"harness|arc:challenge|25": {
"acc": 0.5827645051194539,
"acc_stderr": 0.01440982551840308,
"acc_norm": 0.6177474402730375,
"acc_norm_stderr": 0.014200454049979275
},
"harness|hellaswag|10": {
"acc": 0.629555865365465,
"acc_stderr": 0.004819367172685961,
"acc_norm": 0.8340967934674368,
"acc_norm_stderr": 0.003712334763856881
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.32,
"acc_stderr": 0.046882617226215034,
"acc_norm": 0.32,
"acc_norm_stderr": 0.046882617226215034
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6222222222222222,
"acc_stderr": 0.04188307537595852,
"acc_norm": 0.6222222222222222,
"acc_norm_stderr": 0.04188307537595852
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6973684210526315,
"acc_stderr": 0.03738520676119668,
"acc_norm": 0.6973684210526315,
"acc_norm_stderr": 0.03738520676119668
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.61,
"acc_stderr": 0.04902071300001975,
"acc_norm": 0.61,
"acc_norm_stderr": 0.04902071300001975
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6981132075471698,
"acc_stderr": 0.02825420034443866,
"acc_norm": 0.6981132075471698,
"acc_norm_stderr": 0.02825420034443866
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7430555555555556,
"acc_stderr": 0.03653946969442099,
"acc_norm": 0.7430555555555556,
"acc_norm_stderr": 0.03653946969442099
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.48,
"acc_stderr": 0.05021167315686779,
"acc_norm": 0.48,
"acc_norm_stderr": 0.05021167315686779
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.53,
"acc_stderr": 0.05016135580465919,
"acc_norm": 0.53,
"acc_norm_stderr": 0.05016135580465919
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.38,
"acc_stderr": 0.04878317312145633,
"acc_norm": 0.38,
"acc_norm_stderr": 0.04878317312145633
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6416184971098265,
"acc_stderr": 0.036563436533531585,
"acc_norm": 0.6416184971098265,
"acc_norm_stderr": 0.036563436533531585
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.3627450980392157,
"acc_stderr": 0.04784060704105652,
"acc_norm": 0.3627450980392157,
"acc_norm_stderr": 0.04784060704105652
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.78,
"acc_stderr": 0.04163331998932261,
"acc_norm": 0.78,
"acc_norm_stderr": 0.04163331998932261
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5829787234042553,
"acc_stderr": 0.03223276266711712,
"acc_norm": 0.5829787234042553,
"acc_norm_stderr": 0.03223276266711712
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.4473684210526316,
"acc_stderr": 0.04677473004491199,
"acc_norm": 0.4473684210526316,
"acc_norm_stderr": 0.04677473004491199
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5517241379310345,
"acc_stderr": 0.04144311810878152,
"acc_norm": 0.5517241379310345,
"acc_norm_stderr": 0.04144311810878152
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.38095238095238093,
"acc_stderr": 0.025010749116137602,
"acc_norm": 0.38095238095238093,
"acc_norm_stderr": 0.025010749116137602
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.47619047619047616,
"acc_stderr": 0.04467062628403273,
"acc_norm": 0.47619047619047616,
"acc_norm_stderr": 0.04467062628403273
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.37,
"acc_stderr": 0.04852365870939099,
"acc_norm": 0.37,
"acc_norm_stderr": 0.04852365870939099
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7580645161290323,
"acc_stderr": 0.024362599693031096,
"acc_norm": 0.7580645161290323,
"acc_norm_stderr": 0.024362599693031096
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5123152709359606,
"acc_stderr": 0.035169204442208966,
"acc_norm": 0.5123152709359606,
"acc_norm_stderr": 0.035169204442208966
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.67,
"acc_stderr": 0.04725815626252607,
"acc_norm": 0.67,
"acc_norm_stderr": 0.04725815626252607
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7696969696969697,
"acc_stderr": 0.0328766675860349,
"acc_norm": 0.7696969696969697,
"acc_norm_stderr": 0.0328766675860349
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7929292929292929,
"acc_stderr": 0.02886977846026705,
"acc_norm": 0.7929292929292929,
"acc_norm_stderr": 0.02886977846026705
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8756476683937824,
"acc_stderr": 0.02381447708659355,
"acc_norm": 0.8756476683937824,
"acc_norm_stderr": 0.02381447708659355
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6333333333333333,
"acc_stderr": 0.02443301646605246,
"acc_norm": 0.6333333333333333,
"acc_norm_stderr": 0.02443301646605246
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.3888888888888889,
"acc_stderr": 0.029723278961476664,
"acc_norm": 0.3888888888888889,
"acc_norm_stderr": 0.029723278961476664
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.680672268907563,
"acc_stderr": 0.0302839955258844,
"acc_norm": 0.680672268907563,
"acc_norm_stderr": 0.0302839955258844
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.32450331125827814,
"acc_stderr": 0.038227469376587525,
"acc_norm": 0.32450331125827814,
"acc_norm_stderr": 0.038227469376587525
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8275229357798165,
"acc_stderr": 0.016197807956848047,
"acc_norm": 0.8275229357798165,
"acc_norm_stderr": 0.016197807956848047
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5231481481481481,
"acc_stderr": 0.034063153607115086,
"acc_norm": 0.5231481481481481,
"acc_norm_stderr": 0.034063153607115086
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8284313725490197,
"acc_stderr": 0.02646056956124064,
"acc_norm": 0.8284313725490197,
"acc_norm_stderr": 0.02646056956124064
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7763713080168776,
"acc_stderr": 0.027123298205229966,
"acc_norm": 0.7763713080168776,
"acc_norm_stderr": 0.027123298205229966
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6771300448430493,
"acc_stderr": 0.03138147637575499,
"acc_norm": 0.6771300448430493,
"acc_norm_stderr": 0.03138147637575499
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7862595419847328,
"acc_stderr": 0.0359546161177469,
"acc_norm": 0.7862595419847328,
"acc_norm_stderr": 0.0359546161177469
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.8099173553719008,
"acc_stderr": 0.03581796951709282,
"acc_norm": 0.8099173553719008,
"acc_norm_stderr": 0.03581796951709282
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7777777777777778,
"acc_stderr": 0.0401910747255735,
"acc_norm": 0.7777777777777778,
"acc_norm_stderr": 0.0401910747255735
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7852760736196319,
"acc_stderr": 0.03226219377286775,
"acc_norm": 0.7852760736196319,
"acc_norm_stderr": 0.03226219377286775
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.5178571428571429,
"acc_stderr": 0.047427623612430116,
"acc_norm": 0.5178571428571429,
"acc_norm_stderr": 0.047427623612430116
},
"harness|hendrycksTest-management|5": {
"acc": 0.8155339805825242,
"acc_stderr": 0.03840423627288276,
"acc_norm": 0.8155339805825242,
"acc_norm_stderr": 0.03840423627288276
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8632478632478633,
"acc_stderr": 0.022509033937077805,
"acc_norm": 0.8632478632478633,
"acc_norm_stderr": 0.022509033937077805
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.75,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.75,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8173690932311622,
"acc_stderr": 0.013816335389973143,
"acc_norm": 0.8173690932311622,
"acc_norm_stderr": 0.013816335389973143
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.708092485549133,
"acc_stderr": 0.024476994076247326,
"acc_norm": 0.708092485549133,
"acc_norm_stderr": 0.024476994076247326
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.36089385474860336,
"acc_stderr": 0.016062290671110466,
"acc_norm": 0.36089385474860336,
"acc_norm_stderr": 0.016062290671110466
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7549019607843137,
"acc_stderr": 0.02463004897982478,
"acc_norm": 0.7549019607843137,
"acc_norm_stderr": 0.02463004897982478
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7041800643086816,
"acc_stderr": 0.02592237178881877,
"acc_norm": 0.7041800643086816,
"acc_norm_stderr": 0.02592237178881877
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7098765432098766,
"acc_stderr": 0.025251173936495026,
"acc_norm": 0.7098765432098766,
"acc_norm_stderr": 0.025251173936495026
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.475177304964539,
"acc_stderr": 0.02979071924382972,
"acc_norm": 0.475177304964539,
"acc_norm_stderr": 0.02979071924382972
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.45371577574967403,
"acc_stderr": 0.01271540484127774,
"acc_norm": 0.45371577574967403,
"acc_norm_stderr": 0.01271540484127774
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6875,
"acc_stderr": 0.02815637344037142,
"acc_norm": 0.6875,
"acc_norm_stderr": 0.02815637344037142
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6830065359477124,
"acc_stderr": 0.01882421951270621,
"acc_norm": 0.6830065359477124,
"acc_norm_stderr": 0.01882421951270621
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6545454545454545,
"acc_stderr": 0.04554619617541054,
"acc_norm": 0.6545454545454545,
"acc_norm_stderr": 0.04554619617541054
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7387755102040816,
"acc_stderr": 0.028123429335142783,
"acc_norm": 0.7387755102040816,
"acc_norm_stderr": 0.028123429335142783
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8606965174129353,
"acc_stderr": 0.024484487162913973,
"acc_norm": 0.8606965174129353,
"acc_norm_stderr": 0.024484487162913973
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.83,
"acc_stderr": 0.0377525168068637,
"acc_norm": 0.83,
"acc_norm_stderr": 0.0377525168068637
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5421686746987951,
"acc_stderr": 0.0387862677100236,
"acc_norm": 0.5421686746987951,
"acc_norm_stderr": 0.0387862677100236
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8304093567251462,
"acc_stderr": 0.02878210810540171,
"acc_norm": 0.8304093567251462,
"acc_norm_stderr": 0.02878210810540171
},
"harness|truthfulqa:mc|0": {
"mc1": 0.29008567931456547,
"mc1_stderr": 0.01588623687420952,
"mc2": 0.4319642043238644,
"mc2_stderr": 0.014312295538866832
},
"harness|winogrande|5": {
"acc": 0.7971586424625099,
"acc_stderr": 0.011301439925936654
},
"harness|gsm8k|5": {
"acc": 0.5602729340409401,
"acc_stderr": 0.013672052434471576
}
}
```
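The aggregated `"all"` accuracy above is a macro-average over the per-task scores. A minimal sketch of that aggregation, using a hypothetical dict with only three of the tasks from the JSON above:

```python
# Macro-average the per-task "acc" values from a results dict shaped like
# the JSON above. Only three tasks are included here for illustration; the
# real run averages over all evaluated tasks.
results = {
    "harness|arc:challenge|25": {"acc": 0.5827645051194539},
    "harness|hellaswag|10": {"acc": 0.629555865365465},
    "harness|winogrande|5": {"acc": 0.7971586424625099},
}

accs = [task["acc"] for task in results.values()]
macro_avg = sum(accs) / len(accs)
print(round(macro_avg, 4))
```

The same pattern applies to `acc_norm`, `mc1`, and `mc2` where a task reports them.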
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
oozora/nisargadatta_maharaj | ---
license: mit
---
|
tyzhu/squad_qa_context_v5_full_last_permute | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
dataset_info:
features:
- name: id
dtype: string
- name: title
dtype: string
- name: context
dtype: string
- name: question
dtype: string
- name: answers
sequence:
- name: text
dtype: string
- name: answer_start
dtype: int32
- name: answer
dtype: string
- name: context_id
dtype: string
- name: inputs
dtype: string
- name: targets
dtype: string
splits:
- name: train
num_bytes: 4350151.0
num_examples: 2385
- name: validation
num_bytes: 570908
num_examples: 300
download_size: 0
dataset_size: 4921059.0
---
# Dataset Card for "squad_qa_context_v5_full_last_permute"
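As a minimal sketch of the record layout declared in the YAML header above (fields `id`, `title`, `context`, `question`, `answers`, `answer`, `context_id`, `inputs`, `targets`), here is one example built in plain Python; all field values are hypothetical placeholders, not taken from the dataset:

```python
# One record matching the declared schema. All values are hypothetical.
example = {
    "id": "example-0",
    "title": "Example_Title",
    "context": "An example context paragraph.",
    "question": "What does this paragraph contain?",
    "answers": {
        "text": ["example context"],
        "answer_start": [3],
    },
    "answer": "example context",
    "context_id": "ctx-0",
    "inputs": "question: What does this paragraph contain? context: An example context paragraph.",
    "targets": "example context",
}

# The answers sequence pairs each text span with its character offset in context.
assert example["context"][example["answers"]["answer_start"][0]:].startswith(
    example["answers"]["text"][0]
)
```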
[More Information Needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
zhan1993/ARB_transfer_matrix | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
dataset_info:
features:
- name: expert_name
dtype: string
- name: task_eval_on
dtype: string
- name: score
dtype: float64
splits:
- name: train
num_bytes: 19490
num_examples: 440
download_size: 12510
dataset_size: 19490
---
# Dataset Card for "ARB_transfer_matrix"
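The YAML header above declares flat rows of `(expert_name, task_eval_on, score)`. A minimal sketch, using hypothetical row values, of pivoting such rows into the transfer matrix the dataset name suggests:

```python
from collections import defaultdict

# Hypothetical rows in the (expert_name, task_eval_on, score) layout
# declared in the YAML header above.
rows = [
    ("expert_a", "task_1", 0.52),
    ("expert_a", "task_2", 0.47),
    ("expert_b", "task_1", 0.61),
    ("expert_b", "task_2", 0.39),
]

# Pivot the flat rows into a nested mapping: matrix[expert][task] -> score.
matrix = defaultdict(dict)
for expert, task, score in rows:
    matrix[expert][task] = score

print(matrix["expert_b"]["task_1"])  # 0.61
```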
[More Information Needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
open-llm-leaderboard/details_ehartford__Wizard-Vicuna-7B-Uncensored | ---
pretty_name: Evaluation run of ehartford/Wizard-Vicuna-7B-Uncensored
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [ehartford/Wizard-Vicuna-7B-Uncensored](https://huggingface.co/ehartford/Wizard-Vicuna-7B-Uncensored)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 64 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 2 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_ehartford__Wizard-Vicuna-7B-Uncensored\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-10-18T07:04:55.060331](https://huggingface.co/datasets/open-llm-leaderboard/details_ehartford__Wizard-Vicuna-7B-Uncensored/blob/main/results_2023-10-18T07-04-55.060331.json) (note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.18036912751677853,\n\
\ \"em_stderr\": 0.003937584689736024,\n \"f1\": 0.23801803691275183,\n\
\ \"f1_stderr\": 0.003988701736112215,\n \"acc\": 0.3838336904677134,\n\
\ \"acc_stderr\": 0.009164287920296908\n },\n \"harness|drop|3\": {\n\
\ \"em\": 0.18036912751677853,\n \"em_stderr\": 0.003937584689736024,\n\
\ \"f1\": 0.23801803691275183,\n \"f1_stderr\": 0.003988701736112215\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.045489006823351025,\n \
\ \"acc_stderr\": 0.005739657656722215\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7221783741120757,\n \"acc_stderr\": 0.012588918183871601\n\
\ }\n}\n```"
repo_url: https://huggingface.co/ehartford/Wizard-Vicuna-7B-Uncensored
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_07_19T17_04_57.410493
path:
- '**/details_harness|arc:challenge|25_2023-07-19T17:04:57.410493.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-07-19T17:04:57.410493.parquet'
- config_name: harness_drop_3
data_files:
- split: 2023_10_18T07_04_55.060331
path:
- '**/details_harness|drop|3_2023-10-18T07-04-55.060331.parquet'
- split: latest
path:
- '**/details_harness|drop|3_2023-10-18T07-04-55.060331.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_10_18T07_04_55.060331
path:
- '**/details_harness|gsm8k|5_2023-10-18T07-04-55.060331.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-10-18T07-04-55.060331.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_07_19T17_04_57.410493
path:
- '**/details_harness|hellaswag|10_2023-07-19T17:04:57.410493.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-07-19T17:04:57.410493.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_07_19T17_04_57.410493
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T17:04:57.410493.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-19T17:04:57.410493.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-19T17:04:57.410493.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T17:04:57.410493.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T17:04:57.410493.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-19T17:04:57.410493.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T17:04:57.410493.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T17:04:57.410493.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T17:04:57.410493.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T17:04:57.410493.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-19T17:04:57.410493.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-19T17:04:57.410493.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T17:04:57.410493.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-19T17:04:57.410493.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T17:04:57.410493.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T17:04:57.410493.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T17:04:57.410493.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-19T17:04:57.410493.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T17:04:57.410493.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T17:04:57.410493.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T17:04:57.410493.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T17:04:57.410493.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T17:04:57.410493.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T17:04:57.410493.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T17:04:57.410493.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T17:04:57.410493.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T17:04:57.410493.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T17:04:57.410493.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T17:04:57.410493.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T17:04:57.410493.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T17:04:57.410493.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T17:04:57.410493.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-19T17:04:57.410493.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T17:04:57.410493.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-07-19T17:04:57.410493.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T17:04:57.410493.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T17:04:57.410493.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T17:04:57.410493.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-07-19T17:04:57.410493.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-07-19T17:04:57.410493.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T17:04:57.410493.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T17:04:57.410493.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T17:04:57.410493.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T17:04:57.410493.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-19T17:04:57.410493.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-19T17:04:57.410493.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-19T17:04:57.410493.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T17:04:57.410493.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-19T17:04:57.410493.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T17:04:57.410493.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T17:04:57.410493.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-19T17:04:57.410493.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-19T17:04:57.410493.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-07-19T17:04:57.410493.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T17:04:57.410493.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-07-19T17:04:57.410493.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-19T17:04:57.410493.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T17:04:57.410493.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-19T17:04:57.410493.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-19T17:04:57.410493.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T17:04:57.410493.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T17:04:57.410493.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-19T17:04:57.410493.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T17:04:57.410493.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T17:04:57.410493.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T17:04:57.410493.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T17:04:57.410493.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-19T17:04:57.410493.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-19T17:04:57.410493.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T17:04:57.410493.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-19T17:04:57.410493.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T17:04:57.410493.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T17:04:57.410493.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T17:04:57.410493.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-19T17:04:57.410493.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T17:04:57.410493.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T17:04:57.410493.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T17:04:57.410493.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T17:04:57.410493.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T17:04:57.410493.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T17:04:57.410493.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T17:04:57.410493.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T17:04:57.410493.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T17:04:57.410493.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T17:04:57.410493.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T17:04:57.410493.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T17:04:57.410493.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T17:04:57.410493.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T17:04:57.410493.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-19T17:04:57.410493.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T17:04:57.410493.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-07-19T17:04:57.410493.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T17:04:57.410493.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T17:04:57.410493.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T17:04:57.410493.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-07-19T17:04:57.410493.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-07-19T17:04:57.410493.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T17:04:57.410493.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T17:04:57.410493.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T17:04:57.410493.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T17:04:57.410493.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-19T17:04:57.410493.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-19T17:04:57.410493.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-19T17:04:57.410493.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T17:04:57.410493.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-19T17:04:57.410493.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T17:04:57.410493.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T17:04:57.410493.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-19T17:04:57.410493.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-19T17:04:57.410493.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-07-19T17:04:57.410493.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T17:04:57.410493.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-07-19T17:04:57.410493.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-19T17:04:57.410493.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_07_19T17_04_57.410493
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T17:04:57.410493.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T17:04:57.410493.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_07_19T17_04_57.410493
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-19T17:04:57.410493.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-19T17:04:57.410493.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_07_19T17_04_57.410493
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-19T17:04:57.410493.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-19T17:04:57.410493.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_07_19T17_04_57.410493
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T17:04:57.410493.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T17:04:57.410493.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_07_19T17_04_57.410493
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T17:04:57.410493.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T17:04:57.410493.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_07_19T17_04_57.410493
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-19T17:04:57.410493.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-19T17:04:57.410493.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_07_19T17_04_57.410493
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T17:04:57.410493.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T17:04:57.410493.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_07_19T17_04_57.410493
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T17:04:57.410493.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T17:04:57.410493.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_07_19T17_04_57.410493
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T17:04:57.410493.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T17:04:57.410493.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_07_19T17_04_57.410493
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T17:04:57.410493.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T17:04:57.410493.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_07_19T17_04_57.410493
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-19T17:04:57.410493.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-19T17:04:57.410493.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_07_19T17_04_57.410493
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-19T17:04:57.410493.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-19T17:04:57.410493.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_07_19T17_04_57.410493
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T17:04:57.410493.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T17:04:57.410493.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_07_19T17_04_57.410493
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-19T17:04:57.410493.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-19T17:04:57.410493.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_07_19T17_04_57.410493
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T17:04:57.410493.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T17:04:57.410493.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_07_19T17_04_57.410493
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T17:04:57.410493.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T17:04:57.410493.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_07_19T17_04_57.410493
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T17:04:57.410493.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T17:04:57.410493.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_07_19T17_04_57.410493
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-19T17:04:57.410493.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-19T17:04:57.410493.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_07_19T17_04_57.410493
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T17:04:57.410493.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T17:04:57.410493.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_07_19T17_04_57.410493
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T17:04:57.410493.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T17:04:57.410493.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_07_19T17_04_57.410493
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T17:04:57.410493.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T17:04:57.410493.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_07_19T17_04_57.410493
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T17:04:57.410493.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T17:04:57.410493.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_07_19T17_04_57.410493
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T17:04:57.410493.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T17:04:57.410493.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_07_19T17_04_57.410493
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T17:04:57.410493.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T17:04:57.410493.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_07_19T17_04_57.410493
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T17:04:57.410493.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T17:04:57.410493.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_07_19T17_04_57.410493
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T17:04:57.410493.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T17:04:57.410493.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_07_19T17_04_57.410493
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T17:04:57.410493.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T17:04:57.410493.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_07_19T17_04_57.410493
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T17:04:57.410493.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T17:04:57.410493.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_07_19T17_04_57.410493
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T17:04:57.410493.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T17:04:57.410493.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_07_19T17_04_57.410493
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T17:04:57.410493.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T17:04:57.410493.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_07_19T17_04_57.410493
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T17:04:57.410493.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T17:04:57.410493.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_07_19T17_04_57.410493
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T17:04:57.410493.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T17:04:57.410493.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_07_19T17_04_57.410493
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-19T17:04:57.410493.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-19T17:04:57.410493.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_07_19T17_04_57.410493
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T17:04:57.410493.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T17:04:57.410493.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_07_19T17_04_57.410493
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-07-19T17:04:57.410493.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-07-19T17:04:57.410493.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_07_19T17_04_57.410493
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T17:04:57.410493.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T17:04:57.410493.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_07_19T17_04_57.410493
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T17:04:57.410493.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T17:04:57.410493.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_07_19T17_04_57.410493
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T17:04:57.410493.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T17:04:57.410493.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_07_19T17_04_57.410493
path:
- '**/details_harness|hendrycksTest-management|5_2023-07-19T17:04:57.410493.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-07-19T17:04:57.410493.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_07_19T17_04_57.410493
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-07-19T17:04:57.410493.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-07-19T17:04:57.410493.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_07_19T17_04_57.410493
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T17:04:57.410493.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T17:04:57.410493.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_07_19T17_04_57.410493
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T17:04:57.410493.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T17:04:57.410493.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_07_19T17_04_57.410493
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T17:04:57.410493.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T17:04:57.410493.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_07_19T17_04_57.410493
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T17:04:57.410493.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T17:04:57.410493.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_07_19T17_04_57.410493
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-19T17:04:57.410493.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-19T17:04:57.410493.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_07_19T17_04_57.410493
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-19T17:04:57.410493.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-19T17:04:57.410493.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_07_19T17_04_57.410493
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-19T17:04:57.410493.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-19T17:04:57.410493.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_07_19T17_04_57.410493
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T17:04:57.410493.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T17:04:57.410493.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_07_19T17_04_57.410493
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-19T17:04:57.410493.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-19T17:04:57.410493.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_07_19T17_04_57.410493
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T17:04:57.410493.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T17:04:57.410493.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_07_19T17_04_57.410493
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T17:04:57.410493.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T17:04:57.410493.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_07_19T17_04_57.410493
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-19T17:04:57.410493.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-19T17:04:57.410493.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_07_19T17_04_57.410493
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-19T17:04:57.410493.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-19T17:04:57.410493.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_07_19T17_04_57.410493
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-07-19T17:04:57.410493.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-07-19T17:04:57.410493.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_07_19T17_04_57.410493
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T17:04:57.410493.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T17:04:57.410493.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_07_19T17_04_57.410493
path:
- '**/details_harness|hendrycksTest-virology|5_2023-07-19T17:04:57.410493.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-07-19T17:04:57.410493.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_07_19T17_04_57.410493
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-19T17:04:57.410493.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-19T17:04:57.410493.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_07_19T17_04_57.410493
path:
- '**/details_harness|truthfulqa:mc|0_2023-07-19T17:04:57.410493.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-07-19T17:04:57.410493.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_10_18T07_04_55.060331
path:
- '**/details_harness|winogrande|5_2023-10-18T07-04-55.060331.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-10-18T07-04-55.060331.parquet'
- config_name: results
data_files:
- split: 2023_07_19T17_04_57.410493
path:
- results_2023-07-19T17:04:57.410493.parquet
- split: 2023_10_18T07_04_55.060331
path:
- results_2023-10-18T07-04-55.060331.parquet
- split: latest
path:
- results_2023-10-18T07-04-55.060331.parquet
---
# Dataset Card for Evaluation run of ehartford/Wizard-Vicuna-7B-Uncensored
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/ehartford/Wizard-Vicuna-7B-Uncensored
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [ehartford/Wizard-Vicuna-7B-Uncensored](https://huggingface.co/ehartford/Wizard-Vicuna-7B-Uncensored) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_ehartford__Wizard-Vicuna-7B-Uncensored",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2023-10-18T07:04:55.060331](https://huggingface.co/datasets/open-llm-leaderboard/details_ehartford__Wizard-Vicuna-7B-Uncensored/blob/main/results_2023-10-18T07-04-55.060331.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each in the results and in the "latest" split of each eval):
```python
{
"all": {
"em": 0.18036912751677853,
"em_stderr": 0.003937584689736024,
"f1": 0.23801803691275183,
"f1_stderr": 0.003988701736112215,
"acc": 0.3838336904677134,
"acc_stderr": 0.009164287920296908
},
"harness|drop|3": {
"em": 0.18036912751677853,
"em_stderr": 0.003937584689736024,
"f1": 0.23801803691275183,
"f1_stderr": 0.003988701736112215
},
"harness|gsm8k|5": {
"acc": 0.045489006823351025,
"acc_stderr": 0.005739657656722215
},
"harness|winogrande|5": {
"acc": 0.7221783741120757,
"acc_stderr": 0.012588918183871601
}
}
```
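Incidentally, the top-level `all` block is consistent with an unweighted mean of the per-task accuracies. A quick check (values copied from the JSON above; the averaging rule itself is an observation from the numbers, not documented leaderboard behavior):

```python
# Per-task accuracies copied from the "latest results" block above.
per_task_acc = {
    "harness|gsm8k|5": 0.045489006823351025,
    "harness|winogrande|5": 0.7221783741120757,
}

# The "all" entry appears to be the unweighted mean of the per-task values.
mean_acc = sum(per_task_acc.values()) / len(per_task_acc)
print(mean_acc)  # approximately 0.38383369..., the reported "all" accuracy
```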
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
susnato/csharp_PRs | ---
dataset_info:
features:
- name: repo_name
dtype: string
- name: pr_number
dtype: int64
- name: pr_title
dtype: string
- name: pr_description
dtype: string
- name: author
dtype: string
- name: date_created
dtype: timestamp[ns, tz=UTC]
- name: date_merged
dtype: timestamp[ns, tz=UTC]
- name: previous_commit
dtype: string
- name: pr_commit
dtype: string
- name: query
dtype: string
- name: filepath
dtype: string
- name: before_content
dtype: string
- name: after_content
dtype: string
- name: label
dtype: int64
splits:
- name: train
num_bytes: 117579951237
num_examples: 2074433
download_size: 58125931847
dataset_size: 117579951237
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
ESGBERT/action_500 | ---
license: apache-2.0
---
|
DylanonWic/common_voice_10_1_th_clean_split_2_old | ---
dataset_info:
features:
- name: sentence
dtype: string
- name: labels
sequence: int64
- name: input_values
sequence: float32
splits:
- name: train
num_bytes: 13050526359.943853
num_examples: 50594
download_size: 11872946207
dataset_size: 13050526359.943853
---
# Dataset Card for "common_voice_10_1_th_clean_split_2"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
umarzein/microlang | ---
license: mit
language:
- en
---
Microlang was designed to test text generation architectures.
It consists of 16 tokens:
1. special (implicit): `<>`
2. noun: bob, tom, bike, speech
3. transitive verb: take, use
4. intransitive verb: talk, go
5. adjective: good, active
6. adverb: not
7. conjunction: and, then, but
8. punctuation: `.`
The tokenizer can be found at `umarzein/microlang-utils` and can be loaded like this:
```python
import transformers
tokenizer = transformers.PreTrainedTokenizerFast.from_pretrained("umarzein/microlang-utils")
```
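The token categories above form a tiny vocabulary. As a toy illustration, one can sample grammatical-looking microlang sentences with a small generator (the sentence templates here are an assumption for illustration only; the card does not specify microlang's grammar):

```python
import random

# Token categories copied from the list above.
NOUNS = ["bob", "tom", "bike", "speech"]
TRANSITIVE = ["take", "use"]
INTRANSITIVE = ["talk", "go"]

def toy_sentence(rng: random.Random) -> str:
    """Sample either 'noun transitive-verb noun .' or 'noun intransitive-verb .'."""
    subject = rng.choice(NOUNS)
    if rng.random() < 0.5:
        return f"{subject} {rng.choice(TRANSITIVE)} {rng.choice(NOUNS)} ."
    return f"{subject} {rng.choice(INTRANSITIVE)} ."

print(toy_sentence(random.Random(0)))
```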
|
AdapterOcean/med_alpaca_standardized_cluster_19 | ---
dataset_info:
features:
- name: text
dtype: string
- name: conversation_id
dtype: int64
- name: embedding
sequence: float64
- name: cluster
dtype: int64
splits:
- name: train
num_bytes: 112124603
num_examples: 11566
download_size: 32382147
dataset_size: 112124603
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "med_alpaca_standardized_cluster_19"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
LinkSoul/instruction_merge_set | ---
dataset_info:
features:
- name: id
dtype: string
- name: conversations
list:
- name: from
dtype: string
- name: value
dtype: string
- name: instruction
dtype: string
splits:
- name: train
num_bytes: 13444870155
num_examples: 10077297
download_size: 3542585235
dataset_size: 13444870155
---
# Dataset Card for "instruction_merge_set"
## This dataset is composed of the following datasets:
| Data (id in the merged set) | Hugging Face URL | Notes |
| --- | --- | --- |
| OIG (unified-任务名称) 15k | https://huggingface.co/datasets/laion/OIG | Open Instruction Generalist Dataset |
| Dolly databricks-dolly-15k | https://huggingface.co/datasets/databricks/databricks-dolly-15k | an open-source dataset of instruction-following records generated by thousands of Databricks employees in several of the behavioral categories |
| UltraChat | https://huggingface.co/datasets/stingning/ultrachat | multi-round dialogue data |
| Camel | https://huggingface.co/datasets/camel-ai/ai_society | 25K conversations between two gpt-3.5-turbo agents. |
| camel (同上) | https://github.com/camel-ai/camel | |
| ChatDoctor icliniq-15k HealthCareMagic-200k | https://github.com/Kent0n-Li/ChatDoctor | 200k real conversations between patients and doctors from HealthCareMagic.com 15k real conversations between patients and doctors from iciniq-10k |
| Dolly | https://github.com/databrickslabs/dolly | |
| GPT4ALL | https://github.com/nomic-ai/gpt4all | |
| GPT-4-LLM comparision_data_b alpaca_gpt4_data_zh comparision_data_a alpaca_gpt4_data 5k | https://github.com/Instruction-Tuning-with-GPT-4/GPT-4-LLM | English Instruction-Following Data generated by GPT-4 using Alpaca prompts for fine-tuning LLMs. Chinese Instruction-Following Data generated by GPT-4 using Chinese prompts translated from Alpaca by ChatGPT. Comparison Data ranked by GPT-4 to train reward models. Answers on Unnatural Instructions Data from GPT-4 to quantify the gap between GPT-4 and instruction-tuned models at scale. |
| GuanacoDataset guanaco_chat_all-utf8 guanaco_non_chat-utf8 paper_answers-utf8 general_ans-utf8 general_questions-utf8 paper_questions-utf8 30k | https://huggingface.co/datasets/JosephusCheung/GuanacoDataset | The dataset for the Guanaco model is designed to enhance the multilingual capabilities and address various linguistic tasks. It builds upon the 175 tasks from the Alpaca model by providing rewrites of seed tasks in different languages and adding new tasks specifically designed for English grammar analysis, natural language understanding, cross-lingual self-awareness, and explicit content recognition. The Paper/General-QA dataset is a collection of questions and answers constructed for AI-generated papers or general texts in English, Chinese, Japanese, and German. |
| HC3 ALL | https://huggingface.co/datasets/Hello-SimpleAI/HC3 | human-ChatGPT comparison datasets |
| instinwild instinwild_en instinwild_ch 5k | https://huggingface.co/datasets/QingyiSi/Alpaca-CoT/tree/main/instinwild | Instruction-Finetuning Dataset Collection (Alpaca-CoT) |
| Instruct-to-Code | https://huggingface.co/datasets/Graverman/Instruct-to-Code | |
| ShareGPT90K sg_90k_part2 sg_90k_part1 | https://huggingface.co/datasets/RyokoAI/ShareGPT52K | 90,000 conversations scraped via the ShareGPT API before it was shut down. These conversations include both user prompts and responses from OpenAI's ChatGPT. |
| UltraChat ultrachat_material_release_230412 ultrachat_release_230407 | https://github.com/thunlp/UltraChat | |
| wealth-alpaca-lora final_dataset_clean 4.3k | https://www.kaggle.com/code/gbhacker23/wealth-alpaca-lora | combination of Stanford's Alpaca (https://github.com/tatsu-lab/stanford_alpaca) and FiQA (https://sites.google.com/view/fiqa/) with another 1.3k pairs custom generated using GPT-3.5, with instructions |
| Alpaca alpaca_data 5k | https://github.com/tatsu-lab/stanford_alpaca | instruct-tuning |
| Baize alpaca_chat_data medical_chat_data quora_chat_data stack_overflow_chat_data | https://github.com/project-baize/baize-chatbot | instruction-following data we used for fine-tuning the Alpaca model. |
| botbots Reasoning flight_bookings medical_appointments travel_agency restaurants_mixed real_estate car_dealership home_maintenance, job_interview 'insurance_consultation': 16, 'hotels': 400, 'tech_support': 32, 'car_rentals': 32, 'pet_care': 48, 'restaurants': 200, 'legal_consultation': 16, 'event_tickets': 240, 'fitness_personal_training': 16, 'scientific_problems': 100 | https://github.com/radi-cho/botbots | A dataset consisting of dialogues between two instances of ChatGPT (gpt-3.5-turbo). The CLI commands and dialogue prompts themselves have been written by GPT-4. The dataset covers a wide range of contexts (questions and answers, arguing and reasoning, task-oriented dialogues) and downstream tasks (e.g., hotel reservations, medical advice). |
| ChatAlpaca chatalpaca_data_10k | https://github.com/cascip/ChatAlpaca | a chat dataset, multi-turn instruction-following conversations. |
| DERA train | https://github.com/curai/curai-research/tree/main/DERA | The following repository contains the open-ended question-answering version of MedQA. |
| GPTeacher Toolformer-dedupe-only-dataset roleplay-simple-deduped-roleplay-dataset gpt4-instruct-dedupe-only-dataset | https://github.com/teknium1/GPTeacher | A collection of modular datasets generated by GPT-4, General-Instruct - Roleplay-Instruct - Code-Instruct - and Toolformer |
| OpenAGI | https://github.com/agiresearch/OpenAGI | |
| presto | https://github.com/google-research-datasets/presto | A Multilingual Dataset for Parsing Realistic Task-Oriented Dialogs |
|
sheik21/matheus | ---
license: openrail
---
|
cm2435cm2435/rlhf-search-rerank | ---
dataset_info:
features:
- name: chosen
dtype: string
- name: rejected
dtype: string
- name: query
dtype: string
splits:
- name: train
num_bytes: 12535183
num_examples: 8552
download_size: 7191765
dataset_size: 12535183
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
voidful/EQG-RACE-PLUS | ---
dataset_info:
features:
- name: questions
list:
- name: answer
struct:
- name: answer_index
dtype: int64
- name: answer_text
dtype: string
- name: options
sequence: string
- name: question
dtype: string
- name: question_type
dtype: string
- name: article
dtype: string
- name: id
dtype: string
splits:
- name: train_all
num_bytes: 63952721
num_examples: 25137
- name: train_middle
num_bytes: 12480455
num_examples: 6409
- name: dev_high
num_bytes: 2790766
num_examples: 1021
- name: dev_middle
num_bytes: 712198
num_examples: 368
- name: test_middle
num_bytes: 714595
num_examples: 362
- name: train_high
num_bytes: 51472267
num_examples: 18728
- name: test_high
num_bytes: 2850894
num_examples: 1045
download_size: 33312158
dataset_size: 134973896
---
# Dataset Card for "QGG-RACE Dataset"
## Table of Contents
- Dataset Description
- Dataset Summary
- Supported Tasks and Leaderboards
- Languages
- Dataset Structure
- Data Instances
- Data Fields
- Data Splits
- Dataset Creation
- Curation Rationale
- Source Data
- Annotations
- Personal and Sensitive Information
- Considerations for Using the Data
- Social Impact of Dataset
- Discussion of Biases
- Other Known Limitations
- Additional Information
- Dataset Curators
- Licensing Information
- Citation Information
- Contributions
## Dataset Description
- GitHub Repository: N/A
- Paper: N/A
- Leaderboard: N/A
- Point of Contact: N/A
## Dataset Summary
QGG-RACE Dataset is a subset of RACE, containing three types of questions: Factoid, Cloze, and Summarization.
Dataset Download: [GitHub Release](https://github.com/p208p2002/QGG-RACE-dataset/releases/download/v1.0/qgg-dataset.zip)
Data Statistics:
Types | Examples | Train | Dev | Test
------------- | ------------------------------------------ | ----- | ---- | ----
Cloze | Yingying is Wangwang's _ . | 43167 | 2405 | 2462
Factoid | What can Mimi do? | 18405 | 1030 | 944
Summarization | According to this passage we know that _ . | 3004 | 175 | 184
## Supported Tasks and Leaderboards
- Question Generation
- Reading Comprehension
- Text Summarization
## Languages
The dataset is in English.
## Dataset Structure
### Data Instances
An example data instance from the dataset is shown below:
```json
{
"answers": [
"D",
"A",
"B",
"C"
],
"options": [
[
"States",
"Doubts",
"Confirms",
"Removes"
],
[
"shows the kind of male birds females seek out.",
"indicates the wandering albatross is the most faithful.",
"is based on Professor Stutchbury's 20 years' research.",
"suggests that female birds select males near their home."
],
[
"young birds' quality depends on their feather.",
"some male birds care for others' young as their own.",
"female birds go to find males as soon as autumn comes.",
"female birds are responsible for feeding the hungry babies."
],
[
"A book about love-birds.",
"Birds' living habits and love life",
"The fact that birds don't love their mates forever.",
"The factors that influence birds to look for another mate."
]
],
"questions": [
"What does the underline word \"dispels\" mean?",
"The book The Private Lives of Birds _ .",
"According to the passage, we can infer that _ .",
"What is the passage mainly about?"
],
"article": "Birds are not as loyal to their partners as you might think ...",
"id": "high11327.txt",
"factoid_questions": [
"What does the underline word \"dispels\" mean?"
],
"cloze_questions": [
"The book The Private Lives of Birds _ ."
],
"summarization_questions": [
"According to the passage, we can infer that _ ."
]
}
```
### Data Fields
- id: Unique identifier for the example.
- article: The main text passage.
- questions: List of questions related to the passage.
- options: List of answer options for each question.
- answers: Letters ("A" to "D") identifying the correct option for each question.
- factoid_questions: List of factoid questions.
- cloze_questions: List of cloze questions.
- summarization_questions: List of summarization questions.
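Note that `answers` stores option letters in the instance shown earlier. A minimal sketch for mapping them back to option texts (the fragment below is copied from that example instance):

```python
# A fragment of the example instance shown above.
example = {
    "answers": ["D", "A"],
    "options": [
        ["States", "Doubts", "Confirms", "Removes"],
        ["shows the kind of male birds females seek out.",
         "indicates the wandering albatross is the most faithful.",
         "is based on Professor Stutchbury's 20 years' research.",
         "suggests that female birds select males near their home."],
    ],
}

# Map each answer letter ("A".."D") to the corresponding option text.
answer_texts = [
    options["ABCD".index(letter)]
    for letter, options in zip(example["answers"], example["options"])
]
print(answer_texts)  # ['Removes', 'shows the kind of male birds females seek out.']
```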
### Data Splits
- Train: Contains 65,576 examples.
- Dev: Contains 3,610 examples.
- Test: Contains 3,590 examples.
## Dataset Creation
### Curation Rationale
QGG-RACE dataset is created as a subset of RACE, focusing on three types of questions: Factoid, Cloze, and Summarization. This dataset is intended to facilitate research in question generation and reading comprehension.
### Source Data
#### Initial Data Collection and Normalization
QGG-RACE dataset is derived from RACE dataset.
#### Who are the source language producers?
The source language producers are the authors of the RACE dataset.
### Annotations
#### Annotation process
The dataset is annotated with questions and their corresponding answer options.
#### Who are the annotators?
The annotators are the authors of the RACE dataset.
### Personal and Sensitive Information
The dataset does not contain any personal or sensitive information.
## Considerations for Using the Data
### Social Impact of Dataset
The QGG-RACE dataset can be used for research in question generation and reading comprehension, leading to improvements in these fields.
### Discussion of Biases
The dataset may inherit some biases from the RACE dataset as it is a subset of it.
### Other Known Limitations
No other known limitations.
## Additional Information
### Dataset Curators
The QGG-RACE dataset is curated by the authors of the QGG-RACE dataset GitHub repository.
### Licensing Information
The dataset is released under the [CC BY 4.0 License](https://creativecommons.org/licenses/by/4.0/).
### Citation Information
No citation information is available for the QGG-RACE dataset.
### Contributions
Thanks to @p208p2002 for creating the QGG-RACE dataset. |
CJWeiss/lcr | ---
dataset_info:
features:
- name: Long Text
dtype: string
- name: Summary
dtype: string
splits:
- name: train
num_bytes: 82108819
num_examples: 2918
- name: test
num_bytes: 18916443
num_examples: 584
- name: valid
num_bytes: 12955974
num_examples: 389
download_size: 56044522
dataset_size: 113981236
---
# Dataset Card for "lcr"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
srobertsphd/mit-labnetwork | ---
license: mit
---
|
mask-distilled-one-sec-cv12/chunk_85 | ---
dataset_info:
features:
- name: logits
sequence: float32
- name: mfcc
sequence:
sequence: float64
splits:
- name: train
num_bytes: 1436427740
num_examples: 282095
download_size: 1464759870
dataset_size: 1436427740
---
# Dataset Card for "chunk_85"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
bluesky333/chemical_language_understanding_benchmark | ---
license: cc-by-4.0
task_categories:
- text-classification
- token-classification
language:
- en
tags:
- chemistry
pretty_name: CLUB
size_categories:
- 10K<n<100K
---
## Table of Contents
- [Benchmark Summary](#benchmark-summary)
- [Supported Tasks and Leaderboards](#supported-tasks-and-leaderboards)
- [Languages](#languages)
- [Dataset Structure](#dataset-structure)
- [Data Instances](#data-instances)
- [Data Fields](#data-fields)
- [Data Splits](#data-splits)
- [Dataset Creation](#dataset-creation)
- [Curation Rationale](#curation-rationale)
- [Source Data](#source-data)
- [Licensing Information](#licensing-information)
- [Citation Information](#citation-information)
<p><h1>🧪🔋 Chemical Language Understanding Benchmark 🛢️🧴</h1></p>
<a name="benchmark-summary"></a>
## Benchmark Summary
The Chemical Language Understanding Benchmark was published in the ACL 2023 industry track to facilitate NLP research in the chemical industry [ACL2023 Paper Link Not Available Yet](link).
To our knowledge, it is one of the first benchmark datasets with tasks for both patents and literature articles provided by an industrial organization.
All the datasets are annotated by professional chemists.
<a name="languages"></a>
## Languages
The language of this benchmark is English.
<a name="dataset-structure"></a>
## Data Structure
The benchmark has 4 datasets: 2 for text classification and 2 for token classification.
| Dataset | Task | # Examples | Avg. Token Length | # Classes / Entity Groups |
| ----- | ------ | ---------- | ------------ | ------------------------- |
| PETROCHEMICAL | Patent Area Classification | 2,775 | 448.19 | 7 |
| RHEOLOGY | Sentence Classification | 2,017 | 55.03 | 5 |
| CATALYST | Catalyst Entity Recognition | 4,663 | 42.07 | 5 |
| BATTERY | Battery Entity Recognition | 3,750 | 40.73 | 3 |
You can refer to the paper for a detailed description of the datasets.
<a name="data-instances"></a>
### Data Instances
Each example is a paragraph/sentence of an academic paper or patent with annotations in JSON format.
<a name="data-fields"></a>
### Data Fields
The fields for the text classification task are:
1) 'id', a unique numbered identifier sequentially assigned.
2) 'sentence', the input text.
3) 'label', the class for the text.
The fields for the token classification task are:
1) 'id', a unique numbered identifier sequentially assigned.
2) 'tokens', the input text tokenized by BPE tokenizer.
3) 'ner_tags', the entity label for the tokens.
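As an illustration of the two schemas, hypothetical instances might look like the following (the field names come from the lists above; all values are made up for illustration):

```python
# Hypothetical text-classification instance (label id is made up).
classification_example = {
    "id": 0,
    "sentence": "The polymer showed improved melt viscosity at 200 C.",
    "label": 3,
}

# Hypothetical token-classification instance: one entity tag per BPE token.
token_classification_example = {
    "id": 0,
    "tokens": ["Pt", "/", "Al2O3", "cat", "##alyst"],
    "ner_tags": [1, 0, 1, 2, 2],
}

# The two sequences must stay aligned one-to-one.
assert len(token_classification_example["tokens"]) == len(
    token_classification_example["ner_tags"]
)
```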
<a name="data-splits"></a>
### Data Splits
The data is split into 80% (train) / 20% (development).
<a name="dataset-creation"></a>
## Dataset Creation
<a name="curation-rationale"></a>
### Curation Rationale
The dataset was created to provide a benchmark for chemical language models for researchers and developers.
<a name="source-data"></a>
### Source Data
The dataset consists of open-access chemistry publications and patents annotated by professional chemists.
<a name="licensing-information"></a>
## Licensing Information
The manual annotations created for CLUB are licensed under a [Creative Commons Attribution 4.0 International License (CC-BY-4.0)](https://creativecommons.org/licenses/by/4.0/).
<a name="citation-information"></a>
## Citation Information
We will provide the citation information once the ACL 2023 industry track paper is published.
|
textminr/cmu-book-summaries | ---
language:
- en
license: cc
size_categories:
- 10K<n<100K
pretty_name: CMU Book Summary Dataset
tags:
- summary
- books
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
dataset_info:
features:
- name: title
dtype: string
- name: author
dtype: string
- name: pub_year
dtype: int64
- name: summary
dtype: string
splits:
- name: train
num_bytes: 42477672
num_examples: 16559
download_size: 26340403
dataset_size: 42477672
---
|
Xenova/MusicBenchEmbedded | ---
configs:
- config_name: default
data_files:
- split: train
path: "train.parquet"
- split: test
path: "test.parquet"
license: cc-by-sa-3.0
---
The [MusicBench](https://huggingface.co/datasets/amaai-lab/MusicBench) dataset embedded with [laion/larger_clap_music_and_speech](https://huggingface.co/laion/larger_clap_music_and_speech)
|
jainabh/LLama-2-FT | ---
dataset_info:
features:
- name: Text
dtype: string
splits:
- name: train
num_bytes: 9377
num_examples: 9
download_size: 9649
dataset_size: 9377
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "LLama-2-FT"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
akozlova/RuFacts | ---
license: cc-by-4.0
task_categories:
- text-classification
language:
- ru
tags:
- fact-checking
size_categories:
- 1K<n<10K
---
# Dataset Card for RuFacts
## Dataset Description
RuFacts is a benchmark for internal fact-checking for the Russian language. The dataset contains tagged examples labeled consistent and inconsistent.
For inconsistent examples, ranges containing violations of facts in the source text and the generated text are also collected and presented on the [Kaggle competition page](https://www.kaggle.com/competitions/internal-fact-checking-for-the-russian-language).
Various data sources and approaches to data generation were used to create the training and test datasets for the fact-checking task. We consider data at the sentence level as well as short texts. The average text length is 198 characters, the minimum is 10, and the maximum is 3,402.
The final dataset was formed using three main approaches:
* Texts generated by a [paraphrase model](https://habr.com/ru/companies/sberdevices/articles/667106/)
* Translations of the [dataset for fact-checking](https://fever.ai/dataset/fever.html)
* Text augmentation
Translations and generated data were manually labeled via the crowdsourcing platform Yandex.Toloka. We additionally manually annotated the augmented data for
the test set. The test set consists of examples from all three sources: 26% translations, 6% augmented data, and 68% generated paraphrases.
We require three criteria for the generated text to be factually consistent with the original:
1. facts are correct and not corrupted;
2. any additional facts in the generated texts are not included;
3. all the main facts are included in the generated text.
## Data Structure
### Data Fields
* `idx`: an integer
* `evidence`: a string containing the original text
* `claim`: a string containing the text generated by some generative model
* `label`: an integer, either 0 or 1, indicating whether the facts are consistent (0) or inconsistent (1)
An example of `train`/`validation` looks as follows:
```
{'idx': 1,
'evidence': 'Суд в Англии рассмотрит дело советского диссидента Буковского',
'claim': 'Суд в Великобритании рассмотрит дело советского диссидента Буковского',
'label': 0}
```
An example of `test` looks as follows:
```
{'idx': 4,
'evidence': 'Google выплатит штраф в 200 млн долларов за сбор данных детей на YouTube.',
'claim': 'Google заплатит $200 млн за нарушения конфиденциальности детей на YouTube.',
'label': -1}
```
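Putting the label conventions together, a minimal sketch of filtering by label (the `idx` 1 and 4 entries are taken from the instances above; the `idx` 7 entry is hypothetical):

```python
examples = [
    {"idx": 1, "label": 0},   # consistent pair, from the train example above
    {"idx": 4, "label": -1},  # test example with the label hidden as -1
    {"idx": 7, "label": 1},   # hypothetical inconsistent pair
]

# 0 = facts consistent, 1 = facts inconsistent, -1 = hidden (blind test split).
inconsistent = [ex["idx"] for ex in examples if ex["label"] == 1]
unlabeled = [ex["idx"] for ex in examples if ex["label"] == -1]
print(inconsistent, unlabeled)  # [7] [4]
```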
### Data Splits
| |train | validation | test|
|-----|------|------------|-----|
|rows |4677 | 1559 | 500 | |
freshpearYoon/v3_train_free_concat_37 | ---
dataset_info:
features:
- name: input_features
sequence:
sequence: float32
- name: labels
sequence: int64
splits:
- name: train
num_bytes: 3842538528
num_examples: 2500
download_size: 1870303899
dataset_size: 3842538528
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
HydraLM/corpus_1_embedded_deduplicated | ---
dataset_info:
features:
- name: text
dtype: string
- name: conversation_id
dtype: int64
- name: dataset_id
dtype: string
- name: unique_conversation_id
dtype: string
- name: embedding
sequence: float64
splits:
- name: train
num_bytes: 14843809239
num_examples: 1472917
download_size: 11121975605
dataset_size: 14843809239
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "corpus_1_embedded_deduplicated"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
allenai/cochrane_dense_max | ---
annotations_creators:
- expert-generated
language_creators:
- expert-generated
language:
- en
license:
- apache-2.0
multilinguality:
- monolingual
size_categories:
- 10K<n<100K
source_datasets:
- extended|other-MS^2
- extended|other-Cochrane
task_categories:
- summarization
- text2text-generation
paperswithcode_id: multi-document-summarization
pretty_name: MSLR Shared Task
---
This is a copy of the [Cochrane](https://huggingface.co/datasets/allenai/mslr2022) dataset, except the input source documents of its `validation` split have been replaced with documents retrieved by a __dense__ retriever. The retrieval pipeline used:
- __query__: The `target` field of each example
- __corpus__: The union of all documents in the `train`, `validation` and `test` splits. A document is the concatenation of the `title` and `abstract`.
- __retriever__: [`facebook/contriever-msmarco`](https://huggingface.co/facebook/contriever-msmarco) via [PyTerrier](https://pyterrier.readthedocs.io/en/latest/) with default settings
- __top-k strategy__: `"max"`, i.e. the number of documents retrieved, `k`, is set as the maximum number of documents seen across examples in this dataset, in this case `k==25`
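The Precision@k and Recall@k figures reported below follow the standard per-query definitions; a minimal sketch (toy document IDs, not taken from this dataset):

```python
def precision_recall_at_k(retrieved, relevant, k):
    """Precision@k and Recall@k for a single query.

    retrieved: ranked list of doc IDs returned by the retriever
    relevant:  set of doc IDs judged relevant for the query
    """
    top_k = retrieved[:k]
    hits = sum(1 for doc in top_k if doc in relevant)
    precision = hits / k
    recall = hits / len(relevant) if relevant else 0.0
    return precision, recall

# Toy example: 3 of the top-5 retrieved documents are relevant,
# out of 4 relevant documents in total.
p, r = precision_recall_at_k(["d1", "d2", "d3", "d4", "d5"],
                             {"d1", "d3", "d5", "d9"}, k=5)
# p == 0.6, r == 0.75
```

In the tables below these values are averaged over all queries in the split, with `k==25` as described above.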
Retrieval results on the `train` set:
| Recall@100 | Rprec | Precision@k | Recall@k |
| ----------- | ----------- | ----------- | ----------- |
| 0.7790 | 0.4487 | 0.1959 | 0.6268 |
Retrieval results on the `validation` set:
| Recall@100 | Rprec | Precision@k | Recall@k |
| ----------- | ----------- | ----------- | ----------- |
| 0.7856 | 0.4424 | 0.1995 | 0.6433 |
Retrieval results on the `test` set:
N/A. Test set is blind so we do not have any queries. |
jiuyuan/course-recommendations | ---
license: afl-3.0
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 47265
num_examples: 73
download_size: 9199
dataset_size: 47265
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
ssuengpp/reset1 | ---
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 8963883
num_examples: 6469
download_size: 4953550
dataset_size: 8963883
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
thealphamike/nz-law | ---
license: apache-2.0
task_categories:
- question-answering
- text-generation
language:
- en
tags:
- legal
size_categories:
- 10K<n<100K
--- |
lcm3lo/luisasonza.7z | ---
license: openrail
---
|
bigscience-data/roots_indic-mr_wiktionary | ---
language: mr
license: cc-by-sa-3.0
extra_gated_prompt: 'By accessing this dataset, you agree to abide by the BigScience
Ethical Charter. The charter can be found at:
https://hf.co/spaces/bigscience/ethical-charter'
extra_gated_fields:
I have read and agree to abide by the BigScience Ethical Charter: checkbox
---
|
carlosejimenez/seq2seq-cnndm | ---
dataset_info:
features:
- name: text
dtype: string
- name: label
dtype: string
- name: orig_id
dtype: string
splits:
- name: train
num_bytes: 1264862028
num_examples: 287113
- name: validation
num_bytes: 57879460
num_examples: 13368
- name: test
num_bytes: 50052122
num_examples: 11490
download_size: 837821604
dataset_size: 1372793610
---
# Dataset Card for "seq2seq-cnndm"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
dmayhem93/ChatCombined | ---
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 2530432677
num_examples: 1045936
download_size: 1272242079
dataset_size: 2530432677
task_categories:
- text-generation
- conversational
size_categories:
- 1M<n<10M
license: cc-by-nc-4.0
language:
- en
---
# Dataset Card for "ChatCombined"
Combines five AI conversational datasets: a `<|SYSTEM|>` prompt was added to each example, and each conversation was broken down with `<|USER|>` and `<|ASSISTANT|>` tags.
You will need to add these tokens to your tokenizer to make full use of this dataset: `<|SYSTEM|>`, `<|USER|>`, `<|ASSISTANT|>`.
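A minimal sketch of the tagged layout described above (the exact formatting, e.g. whether separators appear between tags, is an assumption; inspect the dataset's `text` field for the real format):

```python
def format_chat(system_prompt, turns):
    """Flatten a list of (user, assistant) turns into the tagged format.

    To use the tags with a Hugging Face tokenizer, register them first, e.g.:
    tokenizer.add_special_tokens(
        {"additional_special_tokens": ["<|SYSTEM|>", "<|USER|>", "<|ASSISTANT|>"]})
    """
    text = f"<|SYSTEM|>{system_prompt}"
    for user_msg, assistant_msg in turns:
        text += f"<|USER|>{user_msg}<|ASSISTANT|>{assistant_msg}"
    return text

example = format_chat("You are a helpful assistant.",
                      [("Hi!", "Hello, how can I help?")])
# '<|SYSTEM|>You are a helpful assistant.<|USER|>Hi!<|ASSISTANT|>Hello, how can I help?'
```

Remember to resize the model's embeddings after adding tokens (`model.resize_token_embeddings(len(tokenizer))`).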
Collated dataset links:
* [Alpaca GPT-4](https://huggingface.co/datasets/c-s-ale/alpaca-gpt4-data)
* [databricks-dolly-15k](https://github.com/databrickslabs/dolly)
* [Helpful and Harmless](https://huggingface.co/datasets/Dahoas/full-hh-rlhf)
* [Vicuna](https://huggingface.co/datasets/jeffwan/sharegpt_vicuna) - English subset only
* [GPT4ALL-J](https://huggingface.co/datasets/nomic-ai/gpt4all-j-prompt-generations)
## Citations
```bibtex
@misc{alpaca,
author = {Rohan Taori and Ishaan Gulrajani and Tianyi Zhang and Yann Dubois and Xuechen Li and Carlos Guestrin and Percy Liang and Tatsunori B. Hashimoto },
title = {Stanford Alpaca: An Instruction-following LLaMA model},
year = {2023},
publisher = {GitHub},
journal = {GitHub repository},
howpublished = {\url{https://github.com/tatsu-lab/stanford_alpaca}},
}
```
```bibtex
@misc{bai2022training,
title={Training a Helpful and Harmless Assistant with Reinforcement Learning from Human Feedback},
author={Yuntao Bai and Andy Jones and Kamal Ndousse and Amanda Askell and Anna Chen and Nova DasSarma and Dawn Drain and Stanislav Fort and Deep Ganguli and Tom Henighan and Nicholas Joseph and Saurav Kadavath and Jackson Kernion and Tom Conerly and Sheer El-Showk and Nelson Elhage and Zac Hatfield-Dodds and Danny Hernandez and Tristan Hume and Scott Johnston and Shauna Kravec and Liane Lovitt and Neel Nanda and Catherine Olsson and Dario Amodei and Tom Brown and Jack Clark and Sam McCandlish and Chris Olah and Ben Mann and Jared Kaplan},
year={2022},
eprint={2204.05862},
archivePrefix={arXiv},
primaryClass={cs.CL}
}
```
```bibtex
@misc{vicuna2023,
title = {Vicuna: An Open-Source Chatbot Impressing GPT-4 with 90%* ChatGPT Quality},
url = {https://vicuna.lmsys.org},
author = {Chiang, Wei-Lin and Li, Zhuohan and Lin, Zi and Sheng, Ying and Wu, Zhanghao and Zhang, Hao and Zheng, Lianmin and Zhuang, Siyuan and Zhuang, Yonghao and Gonzalez, Joseph E. and Stoica, Ion and Xing, Eric P.},
month = {March},
year = {2023}
}
```
```bibtex
@misc{gpt4all,
author = {Yuvanesh Anand and Zach Nussbaum and Brandon Duderstadt and Benjamin Schmidt and Andriy Mulyar},
title = {GPT4All: Training an Assistant-style Chatbot with Large Scale Data Distillation from GPT-3.5-Turbo},
year = {2023},
publisher = {GitHub},
journal = {GitHub repository},
howpublished = {\url{https://github.com/nomic-ai/gpt4all}},
}
```
```bibtex
@article{peng2023gpt4llm,
title={Instruction Tuning with GPT-4},
  author={Baolin Peng and Chunyuan Li and Pengcheng He and Michel Galley and Jianfeng Gao},
journal={arXiv preprint arXiv:2304.03277},
year={2023}
}
``` |
irds/mr-tydi_en | ---
pretty_name: '`mr-tydi/en`'
viewer: false
source_datasets: []
task_categories:
- text-retrieval
---
# Dataset Card for `mr-tydi/en`
The `mr-tydi/en` dataset, provided by the [ir-datasets](https://ir-datasets.com/) package.
For more information about the dataset, see the [documentation](https://ir-datasets.com/mr-tydi#mr-tydi/en).
# Data
This dataset provides:
- `docs` (documents, i.e., the corpus); count=32,907,100
- `queries` (i.e., topics); count=5,194
- `qrels` (relevance assessments); count=5,360
This dataset is used by: [`mr-tydi_en_dev`](https://huggingface.co/datasets/irds/mr-tydi_en_dev), [`mr-tydi_en_test`](https://huggingface.co/datasets/irds/mr-tydi_en_test), [`mr-tydi_en_train`](https://huggingface.co/datasets/irds/mr-tydi_en_train)
## Usage
```python
from datasets import load_dataset
docs = load_dataset('irds/mr-tydi_en', 'docs')
for record in docs:
record # {'doc_id': ..., 'text': ...}
queries = load_dataset('irds/mr-tydi_en', 'queries')
for record in queries:
record # {'query_id': ..., 'text': ...}
qrels = load_dataset('irds/mr-tydi_en', 'qrels')
for record in qrels:
record # {'query_id': ..., 'doc_id': ..., 'relevance': ..., 'iteration': ...}
```
Note that calling `load_dataset` will download the dataset (or provide access instructions when it's not public) and make a copy of the
data in 🤗 Dataset format.
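The flat qrels records shown above can be folded into the nested mapping most evaluation code expects; a minimal sketch with toy records (not from this dataset):

```python
from collections import defaultdict

def build_qrels(records):
    """Fold flat qrel records into {query_id: {doc_id: relevance}}."""
    qrels = defaultdict(dict)
    for rec in records:
        qrels[rec["query_id"]][rec["doc_id"]] = rec["relevance"]
    return dict(qrels)

toy = [
    {"query_id": "q1", "doc_id": "d1", "relevance": 1, "iteration": "0"},
    {"query_id": "q1", "doc_id": "d2", "relevance": 0, "iteration": "0"},
]
build_qrels(toy)  # {'q1': {'d1': 1, 'd2': 0}}
```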
## Citation Information
```
@article{Zhang2021MrTyDi,
title={{Mr. TyDi}: A Multi-lingual Benchmark for Dense Retrieval},
author={Xinyu Zhang and Xueguang Ma and Peng Shi and Jimmy Lin},
year={2021},
journal={arXiv:2108.08787},
}
@article{Clark2020TyDiQa,
title={{TyDi QA}: A Benchmark for Information-Seeking Question Answering in Typologically Diverse Languages},
author={Jonathan H. Clark and Eunsol Choi and Michael Collins and Dan Garrette and Tom Kwiatkowski and Vitaly Nikolaev and Jennimaria Palomaki},
year={2020},
journal={Transactions of the Association for Computational Linguistics}
}
```
|
open-llm-leaderboard/details_TheBloke__WizardLM-70B-V1.0-GPTQ | ---
pretty_name: Evaluation run of TheBloke/WizardLM-70B-V1.0-GPTQ
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [TheBloke/WizardLM-70B-V1.0-GPTQ](https://huggingface.co/TheBloke/WizardLM-70B-V1.0-GPTQ)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 3 configuration, each one coresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_TheBloke__WizardLM-70B-V1.0-GPTQ_public\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-11-07T19:43:56.739522](https://huggingface.co/datasets/open-llm-leaderboard/details_TheBloke__WizardLM-70B-V1.0-GPTQ_public/blob/main/results_2023-11-07T19-43-56.739522.json)(note\
\ that their might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.17470637583892618,\n\
\ \"em_stderr\": 0.0038886447854560428,\n \"f1\": 0.23969064597315412,\n\
\ \"f1_stderr\": 0.003917893809852688,\n \"acc\": 0.485548773226949,\n\
\ \"acc_stderr\": 0.011109928713164078\n },\n \"harness|drop|3\": {\n\
\ \"em\": 0.17470637583892618,\n \"em_stderr\": 0.0038886447854560428,\n\
\ \"f1\": 0.23969064597315412,\n \"f1_stderr\": 0.003917893809852688\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.18498862774829417,\n \
\ \"acc_stderr\": 0.010695390472237899\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7861089187056038,\n \"acc_stderr\": 0.011524466954090259\n\
\ }\n}\n```"
repo_url: https://huggingface.co/TheBloke/WizardLM-70B-V1.0-GPTQ
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_drop_3
data_files:
- split: 2023_11_07T19_43_56.739522
path:
- '**/details_harness|drop|3_2023-11-07T19-43-56.739522.parquet'
- split: latest
path:
- '**/details_harness|drop|3_2023-11-07T19-43-56.739522.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_11_07T19_43_56.739522
path:
- '**/details_harness|gsm8k|5_2023-11-07T19-43-56.739522.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-11-07T19-43-56.739522.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_11_07T19_43_56.739522
path:
- '**/details_harness|winogrande|5_2023-11-07T19-43-56.739522.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-11-07T19-43-56.739522.parquet'
- config_name: results
data_files:
- split: 2023_11_07T19_43_56.739522
path:
- results_2023-11-07T19-43-56.739522.parquet
- split: latest
path:
- results_2023-11-07T19-43-56.739522.parquet
---
# Dataset Card for Evaluation run of TheBloke/WizardLM-70B-V1.0-GPTQ
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/TheBloke/WizardLM-70B-V1.0-GPTQ
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [TheBloke/WizardLM-70B-V1.0-GPTQ](https://huggingface.co/TheBloke/WizardLM-70B-V1.0-GPTQ) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 3 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_TheBloke__WizardLM-70B-V1.0-GPTQ_public",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2023-11-07T19:43:56.739522](https://huggingface.co/datasets/open-llm-leaderboard/details_TheBloke__WizardLM-70B-V1.0-GPTQ_public/blob/main/results_2023-11-07T19-43-56.739522.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each task in its own config, with a "latest" split for each eval):
```python
{
"all": {
"em": 0.17470637583892618,
"em_stderr": 0.0038886447854560428,
"f1": 0.23969064597315412,
"f1_stderr": 0.003917893809852688,
"acc": 0.485548773226949,
"acc_stderr": 0.011109928713164078
},
"harness|drop|3": {
"em": 0.17470637583892618,
"em_stderr": 0.0038886447854560428,
"f1": 0.23969064597315412,
"f1_stderr": 0.003917893809852688
},
"harness|gsm8k|5": {
"acc": 0.18498862774829417,
"acc_stderr": 0.010695390472237899
},
"harness|winogrande|5": {
"acc": 0.7861089187056038,
"acc_stderr": 0.011524466954090259
}
}
```
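The `"all"` block aggregates the per-task metrics; its `acc` appears to be the unweighted mean of the GSM8K and Winogrande accuracies (DROP contributes `em`/`f1` instead), as a quick check confirms:

```python
# Values copied from the "latest results" JSON above.
gsm8k_acc = 0.18498862774829417       # harness|gsm8k|5
winogrande_acc = 0.7861089187056038   # harness|winogrande|5

all_acc = (gsm8k_acc + winogrande_acc) / 2
# matches "acc": 0.485548773226949 in the "all" block
```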
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
ismailiismail/paraphrasing_french_5000 | ---
dataset_info:
features:
- name: phrase
dtype: string
- name: paraphrase
dtype: string
splits:
- name: train
num_bytes: 1240685
num_examples: 4972
download_size: 499325
dataset_size: 1240685
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "paraphrasing_french_5000"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
joey234/mmlu-conceptual_physics-neg-answer | ---
dataset_info:
features:
- name: question
dtype: string
- name: choices
sequence: string
- name: answer
dtype:
class_label:
names:
'0': A
'1': B
'2': C
'3': D
- name: neg_answer
dtype: string
splits:
- name: test
num_bytes: 46275
num_examples: 235
download_size: 29112
dataset_size: 46275
---
# Dataset Card for "mmlu-conceptual_physics-neg-answer"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
jlbaker361/dumb_decimal | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
dataset_info:
features:
- name: input
dtype: string
- name: output
dtype: float64
- name: text
dtype: string
splits:
- name: train
num_bytes: 225.0
num_examples: 9
- name: test
num_bytes: 25
num_examples: 1
download_size: 3294
dataset_size: 250.0
---
# Dataset Card for "dumb_decimal"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
DWorkApp/DWorkApp | ---
license: mit
---
|
cointegrated/taiga_stripped_proza | ---
dataset_info:
features:
- name: text
dtype: string
- name: file
dtype: string
splits:
- name: train
num_bytes: 41147451264
num_examples: 1732589
download_size: 21158723805
dataset_size: 41147451264
license: cc-by-sa-3.0
language:
- ru
tags:
- taiga
- tayga
size_categories:
- 1M<n<10M
task_categories:
- text-generation
- fill-mask
---
# Dataset Card for "taiga_stripped_proza"
This is a subset of the Taiga corpus (https://tatianashavrina.github.io/taiga_site), derived from the `proza` source (a.k.a. "Fiction").
The dataset consists of plain texts, without morphological and syntactic annotation or metainformation. Apart from stripping the annotations, the texts were not modified.
For more details and analysis, and for the texts with annotation or metadata, please refer to the website of the corpus.
Other subsets of Taiga: [stihi](https://huggingface.co/datasets/cointegrated/taiga_stripped_stihi) (poetry)
and [other sources](https://huggingface.co/datasets/cointegrated/taiga_stripped_rest) (news, subtitles, and social media).
License: [CC BY-SA 3.0](https://creativecommons.org/licenses/by-sa/3.0/). |
Sayan0629/RWKV_final | ---
license: afl-3.0
---
|
cheafdevo56/InfluentialTriplets1Percent | ---
dataset_info:
features:
- name: query
struct:
- name: abstract
dtype: string
- name: corpus_id
dtype: int64
- name: title
dtype: string
- name: pos
struct:
- name: abstract
dtype: string
- name: corpus_id
dtype: int64
- name: title
dtype: string
- name: neg
struct:
- name: abstract
dtype: string
- name: corpus_id
dtype: int64
- name: score
dtype: int64
- name: title
dtype: string
splits:
- name: train
num_bytes: 173356305.3
num_examples: 45000
- name: validation
num_bytes: 19261811.7
num_examples: 5000
download_size: 115821068
dataset_size: 192618117.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
---
|
Prabhjot410/Ecommerce_FAQ_chatbot_dataset | ---
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 19820
num_examples: 158
download_size: 8754
dataset_size: 19820
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "Ecommerce_FAQ_chatbot_dataset"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
HeTree/MevakerConc | ---
license: apache-2.0
language:
- he
---
## MevakerConc
A conclusion-extraction dataset.
Each example contains the context of the audit, the offsets of the conclusions as marked by the auditors, and the conclusion text contained within those offsets.
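Since the conclusions are given as character offsets into the audit context, extracting them is a matter of slicing; a hypothetical sketch (the field layout here is an assumption, check the actual dataset schema):

```python
def extract_conclusions(context, offsets):
    """Slice each (start, end) character span out of the audit context."""
    return [context[start:end] for start, end in offsets]

# Toy English stand-in for a Hebrew audit context.
ctx = "Background text. The office must fix the process. More text."
span = (ctx.index("The office"), ctx.index("process.") + len("process."))
extract_conclusions(ctx, [span])  # ['The office must fix the process.']
```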
### Citing
If you use MevakerConc in your research, please cite [Mevaker: Conclusion Extraction and Allocation Resources for the Hebrew Language](https://arxiv.org/abs/2403.09719).
```
@article{shalumov2024mevaker,
title={Mevaker: Conclusion Extraction and Allocation Resources for the Hebrew Language},
author={Vitaly Shalumov and Harel Haskey and Yuval Solaz},
year={2024},
eprint={2403.09719},
archivePrefix={arXiv},
primaryClass={cs.CL}
}
``` |
vaibhav1/gpt_rationales_for_mongolian_news | ---
dataset_info:
features:
- name: label
dtype:
class_label:
names:
'0': 'FALSE'
'1': MISLEADING
'2': MISSING-CONTEXT
'3': 'TRUE'
- name: text
dtype: string
- name: id
dtype: int64
splits:
- name: train
num_bytes: 17478
num_examples: 41
- name: test
num_bytes: 12404
num_examples: 28
download_size: 26010
dataset_size: 29882
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
---
|
wics/strategy-qa | ---
license: other
---
|
joey234/mmlu-college_biology-neg-answer | ---
dataset_info:
features:
- name: question
dtype: string
- name: choices
sequence: string
- name: answer
dtype:
class_label:
names:
'0': A
'1': B
'2': C
'3': D
- name: neg_answer
dtype: string
splits:
- name: test
num_bytes: 56357
num_examples: 144
download_size: 37571
dataset_size: 56357
---
# Dataset Card for "mmlu-college_biology-neg-answer"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
open-llm-leaderboard/details_TehVenom__PPO_Shygmalion-6b | ---
pretty_name: Evaluation run of TehVenom/PPO_Shygmalion-6b
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [TehVenom/PPO_Shygmalion-6b](https://huggingface.co/TehVenom/PPO_Shygmalion-6b)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 64 configuration, each one coresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the agregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_TehVenom__PPO_Shygmalion-6b\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-10-18T15:36:21.377959](https://huggingface.co/datasets/open-llm-leaderboard/details_TehVenom__PPO_Shygmalion-6b/blob/main/results_2023-10-18T15-36-21.377959.json)(note\
\ that their might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.0010486577181208054,\n\
\ \"em_stderr\": 0.0003314581465219229,\n \"f1\": 0.051307676174496844,\n\
\ \"f1_stderr\": 0.001242463870785362,\n \"acc\": 0.3358539181760356,\n\
\ \"acc_stderr\": 0.008527692652879759\n },\n \"harness|drop|3\": {\n\
\ \"em\": 0.0010486577181208054,\n \"em_stderr\": 0.0003314581465219229,\n\
\ \"f1\": 0.051307676174496844,\n \"f1_stderr\": 0.001242463870785362\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.01819560272934041,\n \
\ \"acc_stderr\": 0.003681611894073874\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.6535122336227308,\n \"acc_stderr\": 0.013373773411685644\n\
\ }\n}\n```"
repo_url: https://huggingface.co/TehVenom/PPO_Shygmalion-6b
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_07_19T16_01_34.013898
path:
- '**/details_harness|arc:challenge|25_2023-07-19T16:01:34.013898.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-07-19T16:01:34.013898.parquet'
- config_name: harness_drop_3
data_files:
- split: 2023_10_18T15_36_21.377959
path:
- '**/details_harness|drop|3_2023-10-18T15-36-21.377959.parquet'
- split: latest
path:
- '**/details_harness|drop|3_2023-10-18T15-36-21.377959.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_10_18T15_36_21.377959
path:
- '**/details_harness|gsm8k|5_2023-10-18T15-36-21.377959.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-10-18T15-36-21.377959.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_07_19T16_01_34.013898
path:
- '**/details_harness|hellaswag|10_2023-07-19T16:01:34.013898.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-07-19T16:01:34.013898.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_07_19T16_01_34.013898
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T16:01:34.013898.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-19T16:01:34.013898.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-19T16:01:34.013898.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T16:01:34.013898.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T16:01:34.013898.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-19T16:01:34.013898.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T16:01:34.013898.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T16:01:34.013898.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T16:01:34.013898.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T16:01:34.013898.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-19T16:01:34.013898.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-19T16:01:34.013898.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T16:01:34.013898.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-19T16:01:34.013898.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T16:01:34.013898.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T16:01:34.013898.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T16:01:34.013898.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-19T16:01:34.013898.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T16:01:34.013898.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T16:01:34.013898.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T16:01:34.013898.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T16:01:34.013898.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T16:01:34.013898.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T16:01:34.013898.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T16:01:34.013898.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T16:01:34.013898.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T16:01:34.013898.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T16:01:34.013898.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T16:01:34.013898.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T16:01:34.013898.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T16:01:34.013898.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T16:01:34.013898.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-19T16:01:34.013898.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T16:01:34.013898.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-07-19T16:01:34.013898.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T16:01:34.013898.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T16:01:34.013898.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T16:01:34.013898.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-07-19T16:01:34.013898.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-07-19T16:01:34.013898.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T16:01:34.013898.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T16:01:34.013898.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T16:01:34.013898.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T16:01:34.013898.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-19T16:01:34.013898.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-19T16:01:34.013898.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-19T16:01:34.013898.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T16:01:34.013898.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-19T16:01:34.013898.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T16:01:34.013898.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T16:01:34.013898.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-19T16:01:34.013898.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-19T16:01:34.013898.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-07-19T16:01:34.013898.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T16:01:34.013898.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-07-19T16:01:34.013898.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-19T16:01:34.013898.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T16:01:34.013898.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-19T16:01:34.013898.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-19T16:01:34.013898.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T16:01:34.013898.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T16:01:34.013898.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-19T16:01:34.013898.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T16:01:34.013898.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T16:01:34.013898.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T16:01:34.013898.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T16:01:34.013898.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-19T16:01:34.013898.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-19T16:01:34.013898.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T16:01:34.013898.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-19T16:01:34.013898.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T16:01:34.013898.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T16:01:34.013898.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T16:01:34.013898.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-19T16:01:34.013898.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T16:01:34.013898.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T16:01:34.013898.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T16:01:34.013898.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T16:01:34.013898.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T16:01:34.013898.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T16:01:34.013898.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T16:01:34.013898.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T16:01:34.013898.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T16:01:34.013898.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T16:01:34.013898.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T16:01:34.013898.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T16:01:34.013898.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T16:01:34.013898.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T16:01:34.013898.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-19T16:01:34.013898.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T16:01:34.013898.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-07-19T16:01:34.013898.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T16:01:34.013898.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T16:01:34.013898.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T16:01:34.013898.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-07-19T16:01:34.013898.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-07-19T16:01:34.013898.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T16:01:34.013898.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T16:01:34.013898.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T16:01:34.013898.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T16:01:34.013898.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-19T16:01:34.013898.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-19T16:01:34.013898.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-19T16:01:34.013898.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T16:01:34.013898.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-19T16:01:34.013898.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T16:01:34.013898.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T16:01:34.013898.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-19T16:01:34.013898.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-19T16:01:34.013898.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-07-19T16:01:34.013898.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T16:01:34.013898.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-07-19T16:01:34.013898.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-19T16:01:34.013898.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_07_19T16_01_34.013898
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T16:01:34.013898.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T16:01:34.013898.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_07_19T16_01_34.013898
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-19T16:01:34.013898.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-19T16:01:34.013898.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_07_19T16_01_34.013898
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-19T16:01:34.013898.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-19T16:01:34.013898.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_07_19T16_01_34.013898
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T16:01:34.013898.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T16:01:34.013898.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_07_19T16_01_34.013898
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T16:01:34.013898.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T16:01:34.013898.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_07_19T16_01_34.013898
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-19T16:01:34.013898.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-19T16:01:34.013898.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_07_19T16_01_34.013898
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T16:01:34.013898.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T16:01:34.013898.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_07_19T16_01_34.013898
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T16:01:34.013898.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T16:01:34.013898.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_07_19T16_01_34.013898
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T16:01:34.013898.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T16:01:34.013898.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_07_19T16_01_34.013898
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T16:01:34.013898.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T16:01:34.013898.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_07_19T16_01_34.013898
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-19T16:01:34.013898.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-19T16:01:34.013898.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_07_19T16_01_34.013898
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-19T16:01:34.013898.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-19T16:01:34.013898.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_07_19T16_01_34.013898
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T16:01:34.013898.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T16:01:34.013898.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_07_19T16_01_34.013898
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-19T16:01:34.013898.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-19T16:01:34.013898.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_07_19T16_01_34.013898
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T16:01:34.013898.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T16:01:34.013898.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_07_19T16_01_34.013898
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T16:01:34.013898.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T16:01:34.013898.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_07_19T16_01_34.013898
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T16:01:34.013898.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T16:01:34.013898.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_07_19T16_01_34.013898
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-19T16:01:34.013898.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-19T16:01:34.013898.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_07_19T16_01_34.013898
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T16:01:34.013898.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T16:01:34.013898.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_07_19T16_01_34.013898
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T16:01:34.013898.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T16:01:34.013898.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_07_19T16_01_34.013898
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T16:01:34.013898.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T16:01:34.013898.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_07_19T16_01_34.013898
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T16:01:34.013898.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T16:01:34.013898.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_07_19T16_01_34.013898
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T16:01:34.013898.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T16:01:34.013898.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_07_19T16_01_34.013898
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T16:01:34.013898.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T16:01:34.013898.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_07_19T16_01_34.013898
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T16:01:34.013898.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T16:01:34.013898.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_07_19T16_01_34.013898
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T16:01:34.013898.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T16:01:34.013898.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_07_19T16_01_34.013898
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T16:01:34.013898.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T16:01:34.013898.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_07_19T16_01_34.013898
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T16:01:34.013898.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T16:01:34.013898.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_07_19T16_01_34.013898
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T16:01:34.013898.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T16:01:34.013898.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_07_19T16_01_34.013898
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T16:01:34.013898.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T16:01:34.013898.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_07_19T16_01_34.013898
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T16:01:34.013898.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T16:01:34.013898.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_07_19T16_01_34.013898
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T16:01:34.013898.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T16:01:34.013898.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_07_19T16_01_34.013898
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-19T16:01:34.013898.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-19T16:01:34.013898.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_07_19T16_01_34.013898
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T16:01:34.013898.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T16:01:34.013898.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_07_19T16_01_34.013898
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-07-19T16:01:34.013898.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-07-19T16:01:34.013898.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_07_19T16_01_34.013898
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T16:01:34.013898.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T16:01:34.013898.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_07_19T16_01_34.013898
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T16:01:34.013898.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T16:01:34.013898.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_07_19T16_01_34.013898
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T16:01:34.013898.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T16:01:34.013898.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_07_19T16_01_34.013898
path:
- '**/details_harness|hendrycksTest-management|5_2023-07-19T16:01:34.013898.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-07-19T16:01:34.013898.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_07_19T16_01_34.013898
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-07-19T16:01:34.013898.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-07-19T16:01:34.013898.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_07_19T16_01_34.013898
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T16:01:34.013898.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T16:01:34.013898.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_07_19T16_01_34.013898
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T16:01:34.013898.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T16:01:34.013898.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_07_19T16_01_34.013898
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T16:01:34.013898.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T16:01:34.013898.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_07_19T16_01_34.013898
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T16:01:34.013898.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T16:01:34.013898.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_07_19T16_01_34.013898
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-19T16:01:34.013898.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-19T16:01:34.013898.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_07_19T16_01_34.013898
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-19T16:01:34.013898.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-19T16:01:34.013898.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_07_19T16_01_34.013898
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-19T16:01:34.013898.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-19T16:01:34.013898.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_07_19T16_01_34.013898
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T16:01:34.013898.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T16:01:34.013898.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_07_19T16_01_34.013898
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-19T16:01:34.013898.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-19T16:01:34.013898.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_07_19T16_01_34.013898
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T16:01:34.013898.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T16:01:34.013898.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_07_19T16_01_34.013898
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T16:01:34.013898.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T16:01:34.013898.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_07_19T16_01_34.013898
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-19T16:01:34.013898.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-19T16:01:34.013898.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_07_19T16_01_34.013898
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-19T16:01:34.013898.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-19T16:01:34.013898.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_07_19T16_01_34.013898
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-07-19T16:01:34.013898.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-07-19T16:01:34.013898.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_07_19T16_01_34.013898
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T16:01:34.013898.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T16:01:34.013898.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_07_19T16_01_34.013898
path:
- '**/details_harness|hendrycksTest-virology|5_2023-07-19T16:01:34.013898.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-07-19T16:01:34.013898.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_07_19T16_01_34.013898
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-19T16:01:34.013898.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-19T16:01:34.013898.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_07_19T16_01_34.013898
path:
- '**/details_harness|truthfulqa:mc|0_2023-07-19T16:01:34.013898.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-07-19T16:01:34.013898.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_10_18T15_36_21.377959
path:
- '**/details_harness|winogrande|5_2023-10-18T15-36-21.377959.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-10-18T15-36-21.377959.parquet'
- config_name: results
data_files:
- split: 2023_07_19T16_01_34.013898
path:
- results_2023-07-19T16:01:34.013898.parquet
- split: 2023_10_18T15_36_21.377959
path:
- results_2023-10-18T15-36-21.377959.parquet
- split: latest
path:
- results_2023-10-18T15-36-21.377959.parquet
---
# Dataset Card for Evaluation run of TehVenom/PPO_Shygmalion-6b
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/TehVenom/PPO_Shygmalion-6b
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [TehVenom/PPO_Shygmalion-6b](https://huggingface.co/TehVenom/PPO_Shygmalion-6b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_TehVenom__PPO_Shygmalion-6b",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2023-10-18T15:36:21.377959](https://huggingface.co/datasets/open-llm-leaderboard/details_TehVenom__PPO_Shygmalion-6b/blob/main/results_2023-10-18T15-36-21.377959.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"em": 0.0010486577181208054,
"em_stderr": 0.0003314581465219229,
"f1": 0.051307676174496844,
"f1_stderr": 0.001242463870785362,
"acc": 0.3358539181760356,
"acc_stderr": 0.008527692652879759
},
"harness|drop|3": {
"em": 0.0010486577181208054,
"em_stderr": 0.0003314581465219229,
"f1": 0.051307676174496844,
"f1_stderr": 0.001242463870785362
},
"harness|gsm8k|5": {
"acc": 0.01819560272934041,
"acc_stderr": 0.003681611894073874
},
"harness|winogrande|5": {
"acc": 0.6535122336227308,
"acc_stderr": 0.013373773411685644
}
}
```
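Each per-task configuration listed in the front-matter above follows the naming pattern `harness_<task>_<num_fewshot>`, and each maps to a parquet glob of the form `**/details_harness|<task>|<num_fewshot>_<timestamp>.parquet`. As an illustrative, unofficial sketch (not part of the leaderboard tooling), the glob can be derived from a config name like this:

```python
def config_to_glob(config_name: str, timestamp: str) -> str:
    """Map a card config name such as 'harness_hendrycksTest_abstract_algebra_5'
    to the parquet glob pattern used in the data_files listing above.
    Illustrative helper only; tasks like 'truthfulqa:mc', whose file names
    contain a colon, would need extra special-casing."""
    # Strip the 'harness_' prefix, then split off the trailing few-shot count.
    body = config_name.removeprefix("harness_")
    task, num_fewshot = body.rsplit("_", 1)
    # MMLU subtasks are stored as 'hendrycksTest-<subject>' in the file names.
    if task.startswith("hendrycksTest_"):
        task = "hendrycksTest-" + task[len("hendrycksTest_"):]
    return f"**/details_harness|{task}|{num_fewshot}_{timestamp}.parquet"

print(config_to_glob("harness_hendrycksTest_abstract_algebra_5",
                     "2023-07-19T16:01:34.013898"))
# -> **/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T16:01:34.013898.parquet
```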
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
kpriyanshu256/MultiTabQA-multitable_pretraining-train-v2-5000 | ---
dataset_info:
features:
- name: tables
sequence: string
- name: table_names
sequence: string
- name: query
dtype: string
- name: answer
dtype: string
- name: source
dtype: string
- name: target
dtype: string
- name: source_latex
dtype: string
- name: target_latex
dtype: string
- name: source_html
dtype: string
- name: target_html
dtype: string
- name: source_markdown
dtype: string
- name: target_markdown
dtype: string
splits:
- name: train
num_bytes: 6182431715
num_examples: 1000
download_size: 1230803225
dataset_size: 6182431715
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
laion/laion2B-multi-md5 | Invalid username or password. |
Jinendra/food_knowlodge_database | ---
license: mit
---
|
nateraw/espeni-2 | ---
zenodo_id: '6606485'
license:
- unknown
---
# Dataset Card for Electrical half hourly raw and cleaned datasets for Great Britain from 2008-11-05
## Table of Contents
- [Table of Contents](#table-of-contents)
- [Dataset Description](#dataset-description)
- [Dataset Summary](#dataset-summary)
- [Supported Tasks and Leaderboards](#supported-tasks-and-leaderboards)
- [Languages](#languages)
- [Dataset Structure](#dataset-structure)
- [Data Instances](#data-instances)
- [Data Fields](#data-fields)
- [Data Splits](#data-splits)
- [Dataset Creation](#dataset-creation)
- [Curation Rationale](#curation-rationale)
- [Source Data](#source-data)
- [Annotations](#annotations)
- [Personal and Sensitive Information](#personal-and-sensitive-information)
- [Considerations for Using the Data](#considerations-for-using-the-data)
- [Social Impact of Dataset](#social-impact-of-dataset)
- [Discussion of Biases](#discussion-of-biases)
- [Other Known Limitations](#other-known-limitations)
- [Additional Information](#additional-information)
- [Dataset Curators](#dataset-curators)
- [Licensing Information](#licensing-information)
- [Citation Information](#citation-information)
- [Contributions](#contributions)
## Dataset Description
- **Homepage:** https://zenodo.org/record/6606485
- **Repository:**
- **Paper:**
- **Leaderboard:**
- **Point of Contact:**
### Dataset Summary
<p><strong>A journal paper published in Energy Strategy Reviews details the method to create the data.</strong></p>
<p><strong>https://www.sciencedirect.com/science/article/pii/S2211467X21001280</strong></p>
<p> </p>
<p>2021-09-09: Version 6.0.0 was created. Now includes data for the North Sea Link (NSL) interconnector from Great Britain to Norway (https://www.northsealink.com). The previous version (5.0.4) should not be used - as there was an error with interconnector data having a static value over the summer 2021.</p>
<p> </p>
<p>2021-05-05: Version 5.0.0 was created. Datetimes now in ISO 8601 format (with capital letter 'T' between the date and time) rather than previously with a space (to RFC 3339 format) and with an offset to identify both UTC and localtime. MW values now all saved as integers rather than floats. Elexon data as always from www.elexonportal.co.uk/fuelhh, National Grid data from https://data.nationalgrideso.com/demand/historic-demand-data Raw data now added again for comparison of pre and post cleaning - to allow for training of additional cleaning methods. If using Microsoft Excel, the T between the date and time can be removed using the =SUBSTITUTE() command - and substitute "T" for a space " "</p>
<p>_____________________________________________________________________________________________________</p>
<p>2021-03-02: Version 4.0.0 was created. Due to a new interconnecter (IFA2 - https://en.wikipedia.org/wiki/IFA-2) being commissioned in Q1 2021, there is an additional column with data from National Grid - this is called 'POWER_NGEM_IFA2_FLOW_MW' in the espeni dataset. In addition, National Grid has dropped the column name 'FRENCH_FLOW' that used to provide the value for the column 'POWER_NGEM_FRENCH_FLOW_MW' in previous espeni versions. However, this has been changed to 'IFA_FLOW' in National Grid's original data, which is now called 'POWER_NGEM_IFA_FLOW_MW' in the espeni dataset. Lastly, the IO14 columns have all been dropped by National Grid - and potentially unlikely to appear again in future.</p>
<p>2020-12-02: Version 3.0.0 was created. There was a problem with earlier versions local time format - where the +01:00 value was not carried through into the data properly. Now addressed - therefore - local time now has the format e.g. 2020-03-31 20:00:00+01:00 when in British Summer Time.</p>
<p>2020-10-03: Version 2.0.0 was created as it looks like National Grid has had a significant change to the methodology underpinning the embedded wind calculations. The wind profile seems similar to previous values, but the difference from the values published earlier grows the greater the embedded value is. The 'new' values are from https://data.nationalgrideso.com/demand/daily-demand-update from 2013.</p>
<p>Previously: raw and cleaned datasets for Great Britain's publicly available electrical data from Elexon (www.elexonportal.co.uk) and National Grid (https://demandforecast.nationalgrid.com/efs_demand_forecast/faces/DataExplorer). Updated versions with more recent data will be uploaded with a differing version number and doi</p>
<p>All data is released in accordance with Elexon's disclaimer and reservation of rights.</p>
<p>https://www.elexon.co.uk/using-this-website/disclaimer-and-reservation-of-rights/</p>
<p>This disclaimer is also felt to cover the data from National Grid, and the parsed data from the Energy Informatics Group at the University of Birmingham.</p>
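As the version 5.0.0 note above explains, datetimes are now in ISO 8601 format with a capital 'T' separator and a UTC/localtime offset. In Python the Excel SUBSTITUTE workaround is unnecessary, since the standard library parses these strings directly; a minimal sketch, using the British Summer Time example from the version 3.0.0 note:

```python
from datetime import datetime, timezone

# ISO 8601 local-time value as used from version 5.0.0 onwards
# (BST example based on the version 3.0.0 note above).
ts = datetime.fromisoformat("2020-03-31T20:00:00+01:00")

print(ts.utcoffset())               # 1:00:00 (the +01:00 BST offset)
print(ts.astimezone(timezone.utc))  # 2020-03-31 19:00:00+00:00
```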
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
The class labels in the dataset are in English
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
This dataset was shared by Grant Wilson, Noah Godfrey
### Licensing Information
The license for this dataset is https://creativecommons.org/licenses/by-nc/4.0/legalcode
### Citation Information
```bibtex
@dataset{grant_wilson_2022_6606485,
author = {Grant Wilson and
Noah Godfrey},
title = {{Electrical half hourly raw and cleaned datasets
for Great Britain from 2008-11-05}},
month = jun,
year = 2022,
note = {{Grant funding as part of Research Councils (UK)
EP/L024756/1 - UK Energy Research Centre research
programme Phase 3 Grant funding as part of
Research Councils (UK) EP/V012053/1 - The Active
Building Centre Research Programme (ABC RP)}},
publisher = {Zenodo},
version = {6.0.9},
doi = {10.5281/zenodo.6606485},
url = {https://doi.org/10.5281/zenodo.6606485}
}
```
### Contributions
Thanks to [@github-username](https://github.com/<github-username>) for adding this dataset. |
imdatta0/ultrachat_5k | ---
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 29446452.84198879
num_examples: 5000
- name: test
num_bytes: 1472322.6420994396
num_examples: 250
download_size: 15487026
dataset_size: 30918775.48408823
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
---
|
vitaliy-sharandin/climate-global-temp-anomaly | ---
dataset_info:
features:
- name: Entity
dtype: string
- name: Code
dtype: float64
- name: Global average temperature anomaly relative to 1961-1990
dtype: float64
- name: Upper bound (95% confidence interval) of the annual temperature anomaly
dtype: float64
- name: Lower bound (95% confidence interval) of the annual temperature anomaly
dtype: float64
- name: dt
dtype: timestamp[ns]
splits:
- name: train
num_bytes: 30513
num_examples: 519
download_size: 20408
dataset_size: 30513
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "climate-global-temp-anomaly"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Jerimee/autotrain-data-dontknowwhatImdoing | ---
language:
- en
task_categories:
- text-classification
---
# AutoTrain Dataset for project: dontknowwhatImdoing
## Dataset Description
This dataset has been automatically processed by AutoTrain for project dontknowwhatImdoing.
### Languages
The BCP-47 code for the dataset's language is en.
## Dataset Structure
### Data Instances
A sample from this dataset looks as follows:
```json
[
{
"text": "Gaston",
"target": 1
},
{
"text": "Churchundyr",
"target": 0
}
]
```
Note that, sadly, it flipped the boolean, using 1 for mundane and 0 for goblin.
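Because the stored encoding is inverted relative to the intent (1 for mundane, 0 for goblin), downstream code may want to flip the target before use. A minimal sketch in plain Python; the `flip_target` helper is hypothetical (not part of the dataset), and the `text`/`target` field names come from the sample above:

```python
def flip_target(example):
    """Invert the 0/1 label so that 1 marks a goblin name and 0 a mundane one."""
    return {"text": example["text"], "target": 1 - example["target"]}

samples = [
    {"text": "Gaston", "target": 1},       # mundane in the original encoding
    {"text": "Churchundyr", "target": 0},  # goblin in the original encoding
]

flipped = [flip_target(s) for s in samples]
# After flipping, "Gaston" carries target 0 (mundane) and "Churchundyr" target 1 (goblin).
```

With the `datasets` library loaded, the same helper could be applied lazily via `dataset.map(flip_target)`.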
### Dataset Fields
The dataset has the following fields (also called "features"):
```json
{
"text": "Value(dtype='string', id=None)",
"target": "ClassLabel(num_classes=2, names=['Goblin', 'Mundane'], id=None)"
}
```
### Dataset Splits
This dataset is split into a train and validation split. The split sizes are as follows:
| Split name | Num samples |
| ------------ | ------------------- |
| train | 965 |
| valid | 242 |
|
PoojaBhati/recipe_dataset | ---
license: mit
---
|
bneel-work/Ubuntu-Kpis-Prompt | ---
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 989541
num_examples: 5030
download_size: 142307
dataset_size: 989541
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
dmayhem93/summarization-sft-heirarchical-split1 | ---
dataset_info:
features:
- name: prompt
dtype: string
- name: 125M
dtype: string
- name: 1B
dtype: string
- name: 6B
dtype: string
- name: 20B
dtype: string
splits:
- name: train
num_bytes: 121267953
num_examples: 47241
- name: test
num_bytes: 217854405
num_examples: 83632
- name: valid
num_bytes: 86573178
num_examples: 33088
download_size: 124221198
dataset_size: 425695536
---
# Dataset Card for "summarization-sft-heirarchical-split1"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
open-llm-leaderboard/details_Sao10K__Zephyrus-L1-33B | ---
pretty_name: Evaluation run of Sao10K/Zephyrus-L1-33B
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [Sao10K/Zephyrus-L1-33B](https://huggingface.co/Sao10K/Zephyrus-L1-33B) on the\
\ [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 64 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 2 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Sao10K__Zephyrus-L1-33B\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-10-26T22:58:44.962753](https://huggingface.co/datasets/open-llm-leaderboard/details_Sao10K__Zephyrus-L1-33B/blob/main/results_2023-10-26T22-58-44.962753.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.1826761744966443,\n\
\ \"em_stderr\": 0.003957106009101816,\n \"f1\": 0.2689282718120811,\n\
\ \"f1_stderr\": 0.004022541310882054,\n \"acc\": 0.5188394618630148,\n\
\ \"acc_stderr\": 0.011447189197576924\n },\n \"harness|drop|3\": {\n\
\ \"em\": 0.1826761744966443,\n \"em_stderr\": 0.003957106009101816,\n\
\ \"f1\": 0.2689282718120811,\n \"f1_stderr\": 0.004022541310882054\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.2357846853677028,\n \
\ \"acc_stderr\": 0.0116925156506668\n },\n \"harness|winogrande|5\": {\n\
\ \"acc\": 0.8018942383583267,\n \"acc_stderr\": 0.011201862744487047\n\
\ }\n}\n```"
repo_url: https://huggingface.co/Sao10K/Zephyrus-L1-33B
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_10_04T01_26_11.799613
path:
- '**/details_harness|arc:challenge|25_2023-10-04T01-26-11.799613.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-10-04T01-26-11.799613.parquet'
- config_name: harness_drop_3
data_files:
- split: 2023_10_26T22_58_44.962753
path:
- '**/details_harness|drop|3_2023-10-26T22-58-44.962753.parquet'
- split: latest
path:
- '**/details_harness|drop|3_2023-10-26T22-58-44.962753.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_10_26T22_58_44.962753
path:
- '**/details_harness|gsm8k|5_2023-10-26T22-58-44.962753.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-10-26T22-58-44.962753.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_10_04T01_26_11.799613
path:
- '**/details_harness|hellaswag|10_2023-10-04T01-26-11.799613.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-10-04T01-26-11.799613.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_10_04T01_26_11.799613
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-04T01-26-11.799613.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-04T01-26-11.799613.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-04T01-26-11.799613.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-04T01-26-11.799613.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-04T01-26-11.799613.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-04T01-26-11.799613.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-04T01-26-11.799613.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-04T01-26-11.799613.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-04T01-26-11.799613.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-04T01-26-11.799613.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-04T01-26-11.799613.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-04T01-26-11.799613.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-04T01-26-11.799613.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-04T01-26-11.799613.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-04T01-26-11.799613.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-04T01-26-11.799613.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-04T01-26-11.799613.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-04T01-26-11.799613.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-04T01-26-11.799613.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-04T01-26-11.799613.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-04T01-26-11.799613.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-04T01-26-11.799613.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-04T01-26-11.799613.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-04T01-26-11.799613.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-04T01-26-11.799613.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-04T01-26-11.799613.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-04T01-26-11.799613.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-04T01-26-11.799613.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-04T01-26-11.799613.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-04T01-26-11.799613.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-04T01-26-11.799613.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-04T01-26-11.799613.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-04T01-26-11.799613.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-04T01-26-11.799613.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-10-04T01-26-11.799613.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-04T01-26-11.799613.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-04T01-26-11.799613.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-04T01-26-11.799613.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-10-04T01-26-11.799613.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-10-04T01-26-11.799613.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-04T01-26-11.799613.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-04T01-26-11.799613.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-04T01-26-11.799613.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-04T01-26-11.799613.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-04T01-26-11.799613.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-04T01-26-11.799613.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-04T01-26-11.799613.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-04T01-26-11.799613.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-04T01-26-11.799613.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-04T01-26-11.799613.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-04T01-26-11.799613.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-04T01-26-11.799613.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-04T01-26-11.799613.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-10-04T01-26-11.799613.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-04T01-26-11.799613.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-10-04T01-26-11.799613.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-04T01-26-11.799613.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-04T01-26-11.799613.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-04T01-26-11.799613.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-04T01-26-11.799613.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-04T01-26-11.799613.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-04T01-26-11.799613.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-04T01-26-11.799613.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-04T01-26-11.799613.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-04T01-26-11.799613.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-04T01-26-11.799613.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-04T01-26-11.799613.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-04T01-26-11.799613.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-04T01-26-11.799613.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-04T01-26-11.799613.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-04T01-26-11.799613.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-04T01-26-11.799613.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-04T01-26-11.799613.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-04T01-26-11.799613.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-04T01-26-11.799613.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-04T01-26-11.799613.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-04T01-26-11.799613.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-04T01-26-11.799613.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-04T01-26-11.799613.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-04T01-26-11.799613.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-04T01-26-11.799613.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-04T01-26-11.799613.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-04T01-26-11.799613.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-04T01-26-11.799613.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-04T01-26-11.799613.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-04T01-26-11.799613.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-04T01-26-11.799613.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-04T01-26-11.799613.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-04T01-26-11.799613.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-04T01-26-11.799613.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-04T01-26-11.799613.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-10-04T01-26-11.799613.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-04T01-26-11.799613.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-04T01-26-11.799613.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-04T01-26-11.799613.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-10-04T01-26-11.799613.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-10-04T01-26-11.799613.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-04T01-26-11.799613.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-04T01-26-11.799613.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-04T01-26-11.799613.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-04T01-26-11.799613.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-04T01-26-11.799613.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-04T01-26-11.799613.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-04T01-26-11.799613.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-04T01-26-11.799613.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-04T01-26-11.799613.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-04T01-26-11.799613.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-04T01-26-11.799613.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-04T01-26-11.799613.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-04T01-26-11.799613.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-10-04T01-26-11.799613.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-04T01-26-11.799613.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-10-04T01-26-11.799613.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-04T01-26-11.799613.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_10_04T01_26_11.799613
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-04T01-26-11.799613.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-04T01-26-11.799613.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_10_04T01_26_11.799613
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-04T01-26-11.799613.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-04T01-26-11.799613.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_10_04T01_26_11.799613
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-04T01-26-11.799613.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-04T01-26-11.799613.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_10_04T01_26_11.799613
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-04T01-26-11.799613.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-04T01-26-11.799613.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_10_04T01_26_11.799613
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-04T01-26-11.799613.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-04T01-26-11.799613.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_10_04T01_26_11.799613
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-04T01-26-11.799613.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-04T01-26-11.799613.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_10_04T01_26_11.799613
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-04T01-26-11.799613.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-04T01-26-11.799613.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_10_04T01_26_11.799613
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-04T01-26-11.799613.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-04T01-26-11.799613.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_10_04T01_26_11.799613
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-04T01-26-11.799613.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-04T01-26-11.799613.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_10_04T01_26_11.799613
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-04T01-26-11.799613.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-04T01-26-11.799613.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_10_04T01_26_11.799613
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-04T01-26-11.799613.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-04T01-26-11.799613.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_10_04T01_26_11.799613
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-04T01-26-11.799613.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-04T01-26-11.799613.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_10_04T01_26_11.799613
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-04T01-26-11.799613.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-04T01-26-11.799613.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_10_04T01_26_11.799613
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-04T01-26-11.799613.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-04T01-26-11.799613.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_10_04T01_26_11.799613
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-04T01-26-11.799613.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-04T01-26-11.799613.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_10_04T01_26_11.799613
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-04T01-26-11.799613.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-04T01-26-11.799613.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_10_04T01_26_11.799613
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-04T01-26-11.799613.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-04T01-26-11.799613.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_10_04T01_26_11.799613
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-04T01-26-11.799613.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-04T01-26-11.799613.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_10_04T01_26_11.799613
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-04T01-26-11.799613.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-04T01-26-11.799613.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_10_04T01_26_11.799613
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-04T01-26-11.799613.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-04T01-26-11.799613.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_10_04T01_26_11.799613
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-04T01-26-11.799613.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-04T01-26-11.799613.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_10_04T01_26_11.799613
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-04T01-26-11.799613.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-04T01-26-11.799613.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_10_04T01_26_11.799613
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-04T01-26-11.799613.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-04T01-26-11.799613.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_10_04T01_26_11.799613
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-04T01-26-11.799613.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-04T01-26-11.799613.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_10_04T01_26_11.799613
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-04T01-26-11.799613.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-04T01-26-11.799613.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_10_04T01_26_11.799613
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-04T01-26-11.799613.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-04T01-26-11.799613.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_10_04T01_26_11.799613
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-04T01-26-11.799613.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-04T01-26-11.799613.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_10_04T01_26_11.799613
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-04T01-26-11.799613.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-04T01-26-11.799613.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_10_04T01_26_11.799613
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-04T01-26-11.799613.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-04T01-26-11.799613.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_10_04T01_26_11.799613
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-04T01-26-11.799613.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-04T01-26-11.799613.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_10_04T01_26_11.799613
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-04T01-26-11.799613.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-04T01-26-11.799613.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_10_04T01_26_11.799613
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-04T01-26-11.799613.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-04T01-26-11.799613.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_10_04T01_26_11.799613
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-04T01-26-11.799613.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-04T01-26-11.799613.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_10_04T01_26_11.799613
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-04T01-26-11.799613.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-04T01-26-11.799613.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_10_04T01_26_11.799613
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-10-04T01-26-11.799613.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-10-04T01-26-11.799613.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_10_04T01_26_11.799613
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-04T01-26-11.799613.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-04T01-26-11.799613.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_10_04T01_26_11.799613
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-04T01-26-11.799613.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-04T01-26-11.799613.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_10_04T01_26_11.799613
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-04T01-26-11.799613.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-04T01-26-11.799613.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_10_04T01_26_11.799613
path:
- '**/details_harness|hendrycksTest-management|5_2023-10-04T01-26-11.799613.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-10-04T01-26-11.799613.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_10_04T01_26_11.799613
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-10-04T01-26-11.799613.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-10-04T01-26-11.799613.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_10_04T01_26_11.799613
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-04T01-26-11.799613.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-04T01-26-11.799613.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_10_04T01_26_11.799613
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-04T01-26-11.799613.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-04T01-26-11.799613.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_10_04T01_26_11.799613
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-04T01-26-11.799613.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-04T01-26-11.799613.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_10_04T01_26_11.799613
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-04T01-26-11.799613.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-04T01-26-11.799613.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_10_04T01_26_11.799613
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-04T01-26-11.799613.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-04T01-26-11.799613.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_10_04T01_26_11.799613
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-04T01-26-11.799613.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-04T01-26-11.799613.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_10_04T01_26_11.799613
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-04T01-26-11.799613.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-04T01-26-11.799613.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_10_04T01_26_11.799613
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-04T01-26-11.799613.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-04T01-26-11.799613.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_10_04T01_26_11.799613
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-04T01-26-11.799613.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-04T01-26-11.799613.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_10_04T01_26_11.799613
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-04T01-26-11.799613.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-04T01-26-11.799613.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_10_04T01_26_11.799613
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-04T01-26-11.799613.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-04T01-26-11.799613.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_10_04T01_26_11.799613
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-04T01-26-11.799613.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-04T01-26-11.799613.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_10_04T01_26_11.799613
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-04T01-26-11.799613.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-04T01-26-11.799613.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_10_04T01_26_11.799613
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-10-04T01-26-11.799613.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-10-04T01-26-11.799613.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_10_04T01_26_11.799613
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-04T01-26-11.799613.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-04T01-26-11.799613.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_10_04T01_26_11.799613
path:
- '**/details_harness|hendrycksTest-virology|5_2023-10-04T01-26-11.799613.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-10-04T01-26-11.799613.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_10_04T01_26_11.799613
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-04T01-26-11.799613.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-04T01-26-11.799613.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_10_04T01_26_11.799613
path:
- '**/details_harness|truthfulqa:mc|0_2023-10-04T01-26-11.799613.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-10-04T01-26-11.799613.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_10_26T22_58_44.962753
path:
- '**/details_harness|winogrande|5_2023-10-26T22-58-44.962753.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-10-26T22-58-44.962753.parquet'
- config_name: results
data_files:
- split: 2023_10_04T01_26_11.799613
path:
- results_2023-10-04T01-26-11.799613.parquet
- split: 2023_10_26T22_58_44.962753
path:
- results_2023-10-26T22-58-44.962753.parquet
- split: latest
path:
- results_2023-10-26T22-58-44.962753.parquet
---
# Dataset Card for Evaluation run of Sao10K/Zephyrus-L1-33B
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/Sao10K/Zephyrus-L1-33B
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [Sao10K/Zephyrus-L1-33B](https://huggingface.co/Sao10K/Zephyrus-L1-33B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 runs. Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_Sao10K__Zephyrus-L1-33B",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2023-10-26T22:58:44.962753](https://huggingface.co/datasets/open-llm-leaderboard/details_Sao10K__Zephyrus-L1-33B/blob/main/results_2023-10-26T22-58-44.962753.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each in the "results" and the "latest" split for each eval):
```python
{
"all": {
"em": 0.1826761744966443,
"em_stderr": 0.003957106009101816,
"f1": 0.2689282718120811,
"f1_stderr": 0.004022541310882054,
"acc": 0.5188394618630148,
"acc_stderr": 0.011447189197576924
},
"harness|drop|3": {
"em": 0.1826761744966443,
"em_stderr": 0.003957106009101816,
"f1": 0.2689282718120811,
"f1_stderr": 0.004022541310882054
},
"harness|gsm8k|5": {
"acc": 0.2357846853677028,
"acc_stderr": 0.0116925156506668
},
"harness|winogrande|5": {
"acc": 0.8018942383583267,
"acc_stderr": 0.011201862744487047
}
}
```
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
AdapterOcean/med_alpaca_standardized_cluster_22_std | ---
dataset_info:
features:
- name: message
dtype: string
- name: message_type
dtype: string
- name: message_id
dtype: int64
- name: conversation_id
dtype: int64
- name: cluster
dtype: float64
- name: __index_level_0__
dtype: int64
splits:
- name: train
num_bytes: 5812068
num_examples: 9177
download_size: 2878625
dataset_size: 5812068
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "med_alpaca_standardized_cluster_22_std"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
SIVANNIM/rvcmodels | ---
license: other
---
|
tarotstart/AI4S_Cup_gh_LLM | ---
language:
- en
--- |
l-lt/LaSOT-ext | ---
viewer: false
---
# Dataset Card for LaSOT-ext
## Dataset Description
- **Homepage:** [LaSOT homepage](http://vision.cs.stonybrook.edu/~lasot/)
- **Paper:** [LaSOT: A High-quality Large-scale Single Object Tracking Benchmark](https://arxiv.org/abs/2009.03465)
- **Point of Contact:** [Heng Fan](heng.fan@unt.edu)
### Dataset Summary
**La**rge-scale **S**ingle **O**bject **T**racking (**LaSOT**) aims to provide a dedicated platform for training data-hungry deep trackers as well as assessing long-term tracking performance.
This repository contains the new subset introduced in the journal version of LaSOT (commonly called **LaSOT<sub>ext</sub>**), published in IJCV ([LaSOT: A High-quality Large-scale Single Object Tracking Benchmark](https://arxiv.org/abs/2009.03465)).
For the training/testing splits of LaSOT (conference version), please refer to this [repo](https://huggingface.co/datasets/l-lt/LaSOT).
## Download
You can download the whole dataset via the `huggingface_hub` library ([guide](https://huggingface.co/docs/huggingface_hub/guides/download)):
```python
from huggingface_hub import snapshot_download
snapshot_download(repo_id='l-lt/LaSOT-ext', repo_type='dataset', local_dir='/path/to/download')
```
Alternatively, download the videos of a specific category manually from this [page](https://huggingface.co/datasets/l-lt/LaSOT-ext/tree/main).
LaSOT<sub>ext</sub> can also be downloaded from:
* As a single zip file: [OneDrive](https://1drv.ms/u/s!Akt_zO4y_u6DgoQrvo5h48AC15l67A?e=Zo6PWx) or [Homepage server](http://vision.cs.stonybrook.edu/~lasot/data/LaSOT_extension_subset.zip)
* As one zip file per category: [OneDrive](https://1drv.ms/f/s!Akt_zO4y_u6DgoQZH_aGsNh2f6x6Dg?e=sldyAx)
### Setup
Unzip all zip files and organize the paths as follows:
```
├── atv
│ ├── atv-1
│ ...
├── badminton
...
```
## Evaluation Metrics and Toolkit
See the [homepage](http://vision.cs.stonybrook.edu/~lasot/results.html) for more information.
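For orientation, LaSOT-style benchmarks score trackers with overlap-based metrics. The sketch below is an illustration, not the official toolkit: it computes intersection-over-union for axis-aligned boxes in the `(x, y, w, h)` format used by LaSOT annotations, and the fraction of frames whose overlap exceeds a threshold (the basis of the success plot):

```python
def iou(box_a, box_b):
    """IoU of two axis-aligned boxes given as (x, y, w, h)."""
    ax, ay, aw, ah = box_a
    bx, by, bw, bh = box_b
    # Intersection rectangle (clamped to zero when boxes are disjoint)
    iw = max(0.0, min(ax + aw, bx + bw) - max(ax, bx))
    ih = max(0.0, min(ay + ah, by + bh) - max(ay, by))
    inter = iw * ih
    union = aw * ah + bw * bh - inter
    return inter / union if union > 0 else 0.0

def success_rate(pred_boxes, gt_boxes, threshold=0.5):
    """Fraction of frames whose IoU exceeds the threshold."""
    overlaps = [iou(p, g) for p, g in zip(pred_boxes, gt_boxes)]
    return sum(o > threshold for o in overlaps) / len(overlaps)
```

The official evaluation sweeps the threshold and reports the area under the resulting curve; use the toolkit linked above for reported numbers.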
|
philipphager/baidu-ultr_baidu-mlm-ctr | ---
license: cc-by-nc-4.0
viewer: false
---
# Baidu ULTR Dataset - Baidu BERT-12l-12h
Query-document vectors and clicks for a subset of the [Baidu Unbiased Learning to Rank
dataset](https://arxiv.org/abs/2207.03051).
This dataset uses the BERT cross-encoder with 12 layers from Baidu released in the [official starter-kit](https://github.com/ChuXiaokai/baidu_ultr_dataset/) to compute query-document vectors (768 dims).
## Setup
1. Install huggingface [datasets](https://huggingface.co/docs/datasets/installation)
2. Install [pandas](https://github.com/pandas-dev/pandas) and [pyarrow](https://arrow.apache.org/docs/python/index.html): `pip install pandas pyarrow`
3. Optionally, you might need to install a [pyarrow-hotfix](https://github.com/pitrou/pyarrow-hotfix) if you cannot install `pyarrow >= 14.0.1`
4. You can now use the dataset as described below.
## Load train / test click dataset:
```Python
from datasets import load_dataset
dataset = load_dataset(
"philipphager/baidu-ultr_baidu-mlm-ctr",
name="clicks",
split="train", # ["train", "test"]
cache_dir="~/.cache/huggingface",
)
dataset.set_format("torch") # [None, "numpy", "torch", "tensorflow", "pandas", "arrow"]
```
## Load expert annotations:
```Python
from datasets import load_dataset
dataset = load_dataset(
"philipphager/baidu-ultr_baidu-mlm-ctr",
name="annotations",
split="test",
cache_dir="~/.cache/huggingface",
)
dataset.set_format("torch") # [None, "numpy", "torch", "tensorflow", "pandas", "arrow"]
```
## Available features
Each row of the click / annotation dataset contains the following attributes. Use a custom `collate_fn` to select specific features (see below):
### Click dataset
| name | dtype | description |
|------------------------------|----------------|-------------|
| query_id | string | Baidu query_id |
| query_md5 | string | MD5 hash of query text |
| query | List[int32] | List of query tokens |
| query_length | int32 | Number of query tokens |
| n | int32 | Number of documents for current query, useful for padding |
| url_md5 | List[string] | MD5 hash of document URL, most reliable document identifier |
| text_md5 | List[string] | MD5 hash of document title and abstract |
| title | List[List[int32]] | List of tokens for document titles |
| abstract | List[List[int32]] | List of tokens for document abstracts |
| query_document_embedding | Tensor[Tensor[float16]]| BERT CLS token |
| click | Tensor[int32] | Click / no click on a document |
| position | Tensor[int32] | Position in ranking (does not always match original item position) |
| media_type | Tensor[int32] | Document type (label encoding recommended as IDs do not occupy a continuous integer range) |
| displayed_time | Tensor[float32]| Seconds a document was displayed on the screen |
| serp_height | Tensor[int32] | Pixel height of a document on the screen |
| slipoff_count_after_click | Tensor[int32] | Number of times a document was scrolled off the screen after previously clicking on it |
| bm25 | Tensor[float32] | BM25 score for documents |
| bm25_title | Tensor[float32] | BM25 score for document titles |
| bm25_abstract | Tensor[float32] | BM25 score for document abstracts |
| tf_idf | Tensor[float32] | TF-IDF score for documents |
| tf | Tensor[float32] | Term frequency for documents |
| idf | Tensor[float32] | Inverse document frequency for documents |
| ql_jelinek_mercer_short | Tensor[float32] | Query likelihood score for documents using Jelinek-Mercer smoothing (alpha = 0.1) |
| ql_jelinek_mercer_long | Tensor[float32] | Query likelihood score for documents using Jelinek-Mercer smoothing (alpha = 0.7) |
| ql_dirichlet | Tensor[float32] | Query likelihood score for documents using Dirichlet smoothing (lambda = 128) |
| document_length | Tensor[int32] | Length of documents |
| title_length | Tensor[int32] | Length of document titles |
| abstract_length | Tensor[int32] | Length of document abstracts |
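To illustrate how the per-query `position` and `click` lists above line up (with synthetic toy numbers, not real Baidu statistics), a simple click-through rate per rank position can be estimated by flattening them:

```python
from collections import defaultdict

def ctr_by_position(positions, clicks):
    """Mean click rate per rank position; both inputs are per-query lists,
    matching the `position` and `click` features of the click dataset."""
    totals = defaultdict(lambda: [0, 0])  # position -> [clicks, impressions]
    for pos_list, click_list in zip(positions, clicks):
        for pos, click in zip(pos_list, click_list):
            totals[pos][0] += click
            totals[pos][1] += 1
    return {pos: c / n for pos, (c, n) in sorted(totals.items())}
```

Note that `position` does not always match the original item position (see the table above), so treat such aggregates as illustrative.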
### Expert annotation dataset
| name | dtype | description |
|------------------------------|----------------|-------------|
| query_id | string | Baidu query_id |
| query_md5 | string | MD5 hash of query text |
| query | List[int32] | List of query tokens |
| query_length | int32 | Number of query tokens |
| frequency_bucket | int32 | Monthly frequency of query (bucket) from 0 (high frequency) to 9 (low frequency) |
| n | int32 | Number of documents for current query, useful for padding |
| url_md5 | List[string] | MD5 hash of document URL, most reliable document identifier |
| text_md5 | List[string] | MD5 hash of document title and abstract |
| title | List[List[int32]] | List of tokens for document titles |
| abstract | List[List[int32]] | List of tokens for document abstracts |
| query_document_embedding | Tensor[Tensor[float16]] | BERT CLS token |
| label | Tensor[int32] | Relevance judgments on a scale from 0 (bad) to 4 (excellent) |
| bm25 | Tensor[float32] | BM25 score for documents |
| bm25_title | Tensor[float32] | BM25 score for document titles |
| bm25_abstract | Tensor[float32] | BM25 score for document abstracts |
| tf_idf | Tensor[float32] | TF-IDF score for documents |
| tf | Tensor[float32] | Term frequency for documents |
| idf | Tensor[float32] | Inverse document frequency for documents |
| ql_jelinek_mercer_short | Tensor[float32] | Query likelihood score for documents using Jelinek-Mercer smoothing (alpha = 0.1) |
| ql_jelinek_mercer_long | Tensor[float32] | Query likelihood score for documents using Jelinek-Mercer smoothing (alpha = 0.7) |
| ql_dirichlet | Tensor[float32] | Query likelihood score for documents using Dirichlet smoothing (lambda = 128) |
| document_length | Tensor[int32] | Length of documents |
| title_length | Tensor[int32] | Length of document titles |
| abstract_length | Tensor[int32] | Length of document abstracts |
## Example PyTorch collate function
Each sample in the dataset is a single query with multiple documents.
The following example demonstrates how to create a batch containing multiple queries with varying numbers of documents by applying padding:
```Python
import torch
from typing import List
from collections import defaultdict
from torch.nn.utils.rnn import pad_sequence
from torch.utils.data import DataLoader
def collate_clicks(samples: List):
batch = defaultdict(lambda: [])
for sample in samples:
batch["query_document_embedding"].append(sample["query_document_embedding"])
batch["position"].append(sample["position"])
batch["click"].append(sample["click"])
batch["n"].append(sample["n"])
return {
"query_document_embedding": pad_sequence(
batch["query_document_embedding"], batch_first=True
),
"position": pad_sequence(batch["position"], batch_first=True),
"click": pad_sequence(batch["click"], batch_first=True),
"n": torch.tensor(batch["n"]),
}
loader = DataLoader(dataset, collate_fn=collate_clicks, batch_size=16)
```
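The padding performed by `pad_sequence` above is simply right-padding every per-query tensor to the longest query in the batch. In plain Python (shown only to make the behavior explicit), the equivalent is:

```python
def pad_lists(lists, pad_value=0):
    """Right-pad variable-length lists to a rectangular batch, mirroring
    torch.nn.utils.rnn.pad_sequence(..., batch_first=True)."""
    width = max(len(item) for item in lists)
    return [item + [pad_value] * (width - len(item)) for item in lists]
```

Entries past each query's `n` are padding and should be masked out when computing losses or metrics.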
|
open-llm-leaderboard/details_fblgit__UNA-dolphin-2.6-mistral-7b-dpo-laser | ---
pretty_name: Evaluation run of fblgit/UNA-dolphin-2.6-mistral-7b-dpo-laser
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [fblgit/UNA-dolphin-2.6-mistral-7b-dpo-laser](https://huggingface.co/fblgit/UNA-dolphin-2.6-mistral-7b-dpo-laser)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of\
\ the evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_fblgit__UNA-dolphin-2.6-mistral-7b-dpo-laser\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-01-13T16:50:26.517326](https://huggingface.co/datasets/open-llm-leaderboard/details_fblgit__UNA-dolphin-2.6-mistral-7b-dpo-laser/blob/main/results_2024-01-13T16-50-26.517326.json) (note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6337373508284295,\n\
\ \"acc_stderr\": 0.032383816315627395,\n \"acc_norm\": 0.6380892308683148,\n\
\ \"acc_norm_stderr\": 0.03302967908991101,\n \"mc1\": 0.4773561811505508,\n\
\ \"mc1_stderr\": 0.01748554225848965,\n \"mc2\": 0.6414925770737219,\n\
\ \"mc2_stderr\": 0.015103448074375492\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.6373720136518771,\n \"acc_stderr\": 0.014049106564955009,\n\
\ \"acc_norm\": 0.6715017064846417,\n \"acc_norm_stderr\": 0.013724978465537304\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6719776936865166,\n\
\ \"acc_stderr\": 0.00468533484403866,\n \"acc_norm\": 0.8630750846444931,\n\
\ \"acc_norm_stderr\": 0.0034306550069275825\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.38,\n \"acc_stderr\": 0.04878317312145632,\n \
\ \"acc_norm\": 0.38,\n \"acc_norm_stderr\": 0.04878317312145632\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6296296296296297,\n\
\ \"acc_stderr\": 0.041716541613545426,\n \"acc_norm\": 0.6296296296296297,\n\
\ \"acc_norm_stderr\": 0.041716541613545426\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.6842105263157895,\n \"acc_stderr\": 0.037827289808654685,\n\
\ \"acc_norm\": 0.6842105263157895,\n \"acc_norm_stderr\": 0.037827289808654685\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.56,\n\
\ \"acc_stderr\": 0.04988876515698589,\n \"acc_norm\": 0.56,\n \
\ \"acc_norm_stderr\": 0.04988876515698589\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.6867924528301886,\n \"acc_stderr\": 0.028544793319055326,\n\
\ \"acc_norm\": 0.6867924528301886,\n \"acc_norm_stderr\": 0.028544793319055326\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7083333333333334,\n\
\ \"acc_stderr\": 0.038009680605548594,\n \"acc_norm\": 0.7083333333333334,\n\
\ \"acc_norm_stderr\": 0.038009680605548594\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.42,\n \"acc_stderr\": 0.049604496374885836,\n \
\ \"acc_norm\": 0.42,\n \"acc_norm_stderr\": 0.049604496374885836\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"\
acc\": 0.57,\n \"acc_stderr\": 0.04975698519562428,\n \"acc_norm\"\
: 0.57,\n \"acc_norm_stderr\": 0.04975698519562428\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.32,\n \"acc_stderr\": 0.046882617226215034,\n \
\ \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.046882617226215034\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6011560693641619,\n\
\ \"acc_stderr\": 0.0373362665538351,\n \"acc_norm\": 0.6011560693641619,\n\
\ \"acc_norm_stderr\": 0.0373362665538351\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.4117647058823529,\n \"acc_stderr\": 0.04897104952726366,\n\
\ \"acc_norm\": 0.4117647058823529,\n \"acc_norm_stderr\": 0.04897104952726366\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.76,\n \"acc_stderr\": 0.042923469599092816,\n \"acc_norm\": 0.76,\n\
\ \"acc_norm_stderr\": 0.042923469599092816\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.5191489361702127,\n \"acc_stderr\": 0.03266204299064678,\n\
\ \"acc_norm\": 0.5191489361702127,\n \"acc_norm_stderr\": 0.03266204299064678\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.45614035087719296,\n\
\ \"acc_stderr\": 0.046854730419077895,\n \"acc_norm\": 0.45614035087719296,\n\
\ \"acc_norm_stderr\": 0.046854730419077895\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5793103448275863,\n \"acc_stderr\": 0.0411391498118926,\n\
\ \"acc_norm\": 0.5793103448275863,\n \"acc_norm_stderr\": 0.0411391498118926\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.42857142857142855,\n \"acc_stderr\": 0.025487187147859375,\n \"\
acc_norm\": 0.42857142857142855,\n \"acc_norm_stderr\": 0.025487187147859375\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.4126984126984127,\n\
\ \"acc_stderr\": 0.04403438954768177,\n \"acc_norm\": 0.4126984126984127,\n\
\ \"acc_norm_stderr\": 0.04403438954768177\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.38,\n \"acc_stderr\": 0.04878317312145633,\n \
\ \"acc_norm\": 0.38,\n \"acc_norm_stderr\": 0.04878317312145633\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7612903225806451,\n\
\ \"acc_stderr\": 0.024251071262208837,\n \"acc_norm\": 0.7612903225806451,\n\
\ \"acc_norm_stderr\": 0.024251071262208837\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.4975369458128079,\n \"acc_stderr\": 0.03517945038691063,\n\
\ \"acc_norm\": 0.4975369458128079,\n \"acc_norm_stderr\": 0.03517945038691063\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.73,\n \"acc_stderr\": 0.044619604333847394,\n \"acc_norm\"\
: 0.73,\n \"acc_norm_stderr\": 0.044619604333847394\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7636363636363637,\n \"acc_stderr\": 0.033175059300091805,\n\
\ \"acc_norm\": 0.7636363636363637,\n \"acc_norm_stderr\": 0.033175059300091805\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.803030303030303,\n \"acc_stderr\": 0.028335609732463362,\n \"\
acc_norm\": 0.803030303030303,\n \"acc_norm_stderr\": 0.028335609732463362\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.8808290155440415,\n \"acc_stderr\": 0.02338193534812143,\n\
\ \"acc_norm\": 0.8808290155440415,\n \"acc_norm_stderr\": 0.02338193534812143\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.6076923076923076,\n \"acc_stderr\": 0.024756000382130956,\n\
\ \"acc_norm\": 0.6076923076923076,\n \"acc_norm_stderr\": 0.024756000382130956\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.2962962962962963,\n \"acc_stderr\": 0.02784081149587193,\n \
\ \"acc_norm\": 0.2962962962962963,\n \"acc_norm_stderr\": 0.02784081149587193\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.6932773109243697,\n \"acc_stderr\": 0.029953823891887048,\n\
\ \"acc_norm\": 0.6932773109243697,\n \"acc_norm_stderr\": 0.029953823891887048\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.33112582781456956,\n \"acc_stderr\": 0.038425817186598696,\n \"\
acc_norm\": 0.33112582781456956,\n \"acc_norm_stderr\": 0.038425817186598696\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8256880733944955,\n \"acc_stderr\": 0.016265675632010344,\n \"\
acc_norm\": 0.8256880733944955,\n \"acc_norm_stderr\": 0.016265675632010344\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.5092592592592593,\n \"acc_stderr\": 0.034093869469927006,\n \"\
acc_norm\": 0.5092592592592593,\n \"acc_norm_stderr\": 0.034093869469927006\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.7892156862745098,\n \"acc_stderr\": 0.028626547912437413,\n \"\
acc_norm\": 0.7892156862745098,\n \"acc_norm_stderr\": 0.028626547912437413\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.7932489451476793,\n \"acc_stderr\": 0.026361651668389104,\n \
\ \"acc_norm\": 0.7932489451476793,\n \"acc_norm_stderr\": 0.026361651668389104\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6681614349775785,\n\
\ \"acc_stderr\": 0.03160295143776679,\n \"acc_norm\": 0.6681614349775785,\n\
\ \"acc_norm_stderr\": 0.03160295143776679\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.7557251908396947,\n \"acc_stderr\": 0.037683359597287414,\n\
\ \"acc_norm\": 0.7557251908396947,\n \"acc_norm_stderr\": 0.037683359597287414\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.7520661157024794,\n \"acc_stderr\": 0.03941897526516303,\n \"\
acc_norm\": 0.7520661157024794,\n \"acc_norm_stderr\": 0.03941897526516303\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.8240740740740741,\n\
\ \"acc_stderr\": 0.036809181416738807,\n \"acc_norm\": 0.8240740740740741,\n\
\ \"acc_norm_stderr\": 0.036809181416738807\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7607361963190185,\n \"acc_stderr\": 0.033519538795212696,\n\
\ \"acc_norm\": 0.7607361963190185,\n \"acc_norm_stderr\": 0.033519538795212696\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.45535714285714285,\n\
\ \"acc_stderr\": 0.04726835553719099,\n \"acc_norm\": 0.45535714285714285,\n\
\ \"acc_norm_stderr\": 0.04726835553719099\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.8058252427184466,\n \"acc_stderr\": 0.03916667762822584,\n\
\ \"acc_norm\": 0.8058252427184466,\n \"acc_norm_stderr\": 0.03916667762822584\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8931623931623932,\n\
\ \"acc_stderr\": 0.020237149008990925,\n \"acc_norm\": 0.8931623931623932,\n\
\ \"acc_norm_stderr\": 0.020237149008990925\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.7,\n \"acc_stderr\": 0.046056618647183814,\n \
\ \"acc_norm\": 0.7,\n \"acc_norm_stderr\": 0.046056618647183814\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.822477650063857,\n\
\ \"acc_stderr\": 0.013664230995834829,\n \"acc_norm\": 0.822477650063857,\n\
\ \"acc_norm_stderr\": 0.013664230995834829\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.7138728323699421,\n \"acc_stderr\": 0.024332146779134135,\n\
\ \"acc_norm\": 0.7138728323699421,\n \"acc_norm_stderr\": 0.024332146779134135\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.3675977653631285,\n\
\ \"acc_stderr\": 0.01612554382355295,\n \"acc_norm\": 0.3675977653631285,\n\
\ \"acc_norm_stderr\": 0.01612554382355295\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.6993464052287581,\n \"acc_stderr\": 0.026256053835718964,\n\
\ \"acc_norm\": 0.6993464052287581,\n \"acc_norm_stderr\": 0.026256053835718964\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6784565916398714,\n\
\ \"acc_stderr\": 0.026527724079528872,\n \"acc_norm\": 0.6784565916398714,\n\
\ \"acc_norm_stderr\": 0.026527724079528872\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.7438271604938271,\n \"acc_stderr\": 0.024288533637726095,\n\
\ \"acc_norm\": 0.7438271604938271,\n \"acc_norm_stderr\": 0.024288533637726095\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.4574468085106383,\n \"acc_stderr\": 0.029719281272236844,\n \
\ \"acc_norm\": 0.4574468085106383,\n \"acc_norm_stderr\": 0.029719281272236844\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.44132985658409385,\n\
\ \"acc_stderr\": 0.01268201633564667,\n \"acc_norm\": 0.44132985658409385,\n\
\ \"acc_norm_stderr\": 0.01268201633564667\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.6470588235294118,\n \"acc_stderr\": 0.029029422815681397,\n\
\ \"acc_norm\": 0.6470588235294118,\n \"acc_norm_stderr\": 0.029029422815681397\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6535947712418301,\n \"acc_stderr\": 0.01924978569171721,\n \
\ \"acc_norm\": 0.6535947712418301,\n \"acc_norm_stderr\": 0.01924978569171721\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6818181818181818,\n\
\ \"acc_stderr\": 0.044612721759105085,\n \"acc_norm\": 0.6818181818181818,\n\
\ \"acc_norm_stderr\": 0.044612721759105085\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.7510204081632653,\n \"acc_stderr\": 0.027682979522960234,\n\
\ \"acc_norm\": 0.7510204081632653,\n \"acc_norm_stderr\": 0.027682979522960234\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8208955223880597,\n\
\ \"acc_stderr\": 0.027113286753111837,\n \"acc_norm\": 0.8208955223880597,\n\
\ \"acc_norm_stderr\": 0.027113286753111837\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.85,\n \"acc_stderr\": 0.03588702812826371,\n \
\ \"acc_norm\": 0.85,\n \"acc_norm_stderr\": 0.03588702812826371\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5602409638554217,\n\
\ \"acc_stderr\": 0.03864139923699122,\n \"acc_norm\": 0.5602409638554217,\n\
\ \"acc_norm_stderr\": 0.03864139923699122\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8187134502923976,\n \"acc_stderr\": 0.029547741687640038,\n\
\ \"acc_norm\": 0.8187134502923976,\n \"acc_norm_stderr\": 0.029547741687640038\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.4773561811505508,\n\
\ \"mc1_stderr\": 0.01748554225848965,\n \"mc2\": 0.6414925770737219,\n\
\ \"mc2_stderr\": 0.015103448074375492\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7924230465666929,\n \"acc_stderr\": 0.011398593419386779\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.44351781652767247,\n \
\ \"acc_stderr\": 0.013684327592606163\n }\n}\n```"
repo_url: https://huggingface.co/fblgit/UNA-dolphin-2.6-mistral-7b-dpo-laser
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_01_13T16_50_26.517326
path:
- '**/details_harness|arc:challenge|25_2024-01-13T16-50-26.517326.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-01-13T16-50-26.517326.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_01_13T16_50_26.517326
path:
- '**/details_harness|gsm8k|5_2024-01-13T16-50-26.517326.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-01-13T16-50-26.517326.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_01_13T16_50_26.517326
path:
- '**/details_harness|hellaswag|10_2024-01-13T16-50-26.517326.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-01-13T16-50-26.517326.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_01_13T16_50_26.517326
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-13T16-50-26.517326.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-13T16-50-26.517326.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-13T16-50-26.517326.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-13T16-50-26.517326.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-13T16-50-26.517326.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-13T16-50-26.517326.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-13T16-50-26.517326.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-13T16-50-26.517326.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-13T16-50-26.517326.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-13T16-50-26.517326.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-13T16-50-26.517326.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-13T16-50-26.517326.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-13T16-50-26.517326.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-13T16-50-26.517326.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-13T16-50-26.517326.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-13T16-50-26.517326.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-13T16-50-26.517326.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-13T16-50-26.517326.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-13T16-50-26.517326.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-13T16-50-26.517326.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-13T16-50-26.517326.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-13T16-50-26.517326.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-13T16-50-26.517326.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-13T16-50-26.517326.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-13T16-50-26.517326.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-13T16-50-26.517326.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-13T16-50-26.517326.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-13T16-50-26.517326.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-13T16-50-26.517326.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-13T16-50-26.517326.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-13T16-50-26.517326.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-13T16-50-26.517326.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-13T16-50-26.517326.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-13T16-50-26.517326.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-01-13T16-50-26.517326.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-13T16-50-26.517326.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-13T16-50-26.517326.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-13T16-50-26.517326.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-01-13T16-50-26.517326.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-01-13T16-50-26.517326.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-13T16-50-26.517326.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-13T16-50-26.517326.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-13T16-50-26.517326.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-13T16-50-26.517326.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-13T16-50-26.517326.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-13T16-50-26.517326.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-13T16-50-26.517326.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-13T16-50-26.517326.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-13T16-50-26.517326.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-13T16-50-26.517326.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-13T16-50-26.517326.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-13T16-50-26.517326.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-13T16-50-26.517326.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-01-13T16-50-26.517326.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-13T16-50-26.517326.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-01-13T16-50-26.517326.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-13T16-50-26.517326.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-13T16-50-26.517326.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-13T16-50-26.517326.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-13T16-50-26.517326.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-13T16-50-26.517326.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-13T16-50-26.517326.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-13T16-50-26.517326.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-13T16-50-26.517326.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-13T16-50-26.517326.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-13T16-50-26.517326.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-13T16-50-26.517326.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-13T16-50-26.517326.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-13T16-50-26.517326.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-13T16-50-26.517326.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-13T16-50-26.517326.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-13T16-50-26.517326.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-13T16-50-26.517326.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-13T16-50-26.517326.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-13T16-50-26.517326.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-13T16-50-26.517326.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-13T16-50-26.517326.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-13T16-50-26.517326.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-13T16-50-26.517326.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-13T16-50-26.517326.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-13T16-50-26.517326.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-13T16-50-26.517326.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-13T16-50-26.517326.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-13T16-50-26.517326.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-13T16-50-26.517326.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-13T16-50-26.517326.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-13T16-50-26.517326.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-13T16-50-26.517326.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-13T16-50-26.517326.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-13T16-50-26.517326.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-13T16-50-26.517326.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-01-13T16-50-26.517326.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-13T16-50-26.517326.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-13T16-50-26.517326.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-13T16-50-26.517326.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-01-13T16-50-26.517326.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-01-13T16-50-26.517326.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-13T16-50-26.517326.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-13T16-50-26.517326.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-13T16-50-26.517326.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-13T16-50-26.517326.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-13T16-50-26.517326.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-13T16-50-26.517326.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-13T16-50-26.517326.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-13T16-50-26.517326.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-13T16-50-26.517326.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-13T16-50-26.517326.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-13T16-50-26.517326.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-13T16-50-26.517326.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-13T16-50-26.517326.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-01-13T16-50-26.517326.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-13T16-50-26.517326.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-01-13T16-50-26.517326.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-13T16-50-26.517326.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_01_13T16_50_26.517326
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-13T16-50-26.517326.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-13T16-50-26.517326.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_01_13T16_50_26.517326
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-13T16-50-26.517326.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-13T16-50-26.517326.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_01_13T16_50_26.517326
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-13T16-50-26.517326.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-13T16-50-26.517326.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_01_13T16_50_26.517326
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-13T16-50-26.517326.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-13T16-50-26.517326.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_01_13T16_50_26.517326
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-13T16-50-26.517326.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-13T16-50-26.517326.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_01_13T16_50_26.517326
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-13T16-50-26.517326.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-13T16-50-26.517326.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_01_13T16_50_26.517326
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-13T16-50-26.517326.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-13T16-50-26.517326.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_01_13T16_50_26.517326
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-13T16-50-26.517326.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-13T16-50-26.517326.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_01_13T16_50_26.517326
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-13T16-50-26.517326.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-13T16-50-26.517326.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_01_13T16_50_26.517326
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-13T16-50-26.517326.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-13T16-50-26.517326.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_01_13T16_50_26.517326
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-13T16-50-26.517326.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-13T16-50-26.517326.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_01_13T16_50_26.517326
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-13T16-50-26.517326.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-13T16-50-26.517326.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_01_13T16_50_26.517326
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-13T16-50-26.517326.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-13T16-50-26.517326.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_01_13T16_50_26.517326
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-13T16-50-26.517326.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-13T16-50-26.517326.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_01_13T16_50_26.517326
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-13T16-50-26.517326.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-13T16-50-26.517326.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_01_13T16_50_26.517326
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-13T16-50-26.517326.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-13T16-50-26.517326.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_01_13T16_50_26.517326
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-13T16-50-26.517326.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-13T16-50-26.517326.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_01_13T16_50_26.517326
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-13T16-50-26.517326.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-13T16-50-26.517326.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_01_13T16_50_26.517326
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-13T16-50-26.517326.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-13T16-50-26.517326.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_01_13T16_50_26.517326
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-13T16-50-26.517326.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-13T16-50-26.517326.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_01_13T16_50_26.517326
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-13T16-50-26.517326.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-13T16-50-26.517326.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_01_13T16_50_26.517326
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-13T16-50-26.517326.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-13T16-50-26.517326.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_01_13T16_50_26.517326
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-13T16-50-26.517326.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-13T16-50-26.517326.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_01_13T16_50_26.517326
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-13T16-50-26.517326.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-13T16-50-26.517326.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_01_13T16_50_26.517326
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-13T16-50-26.517326.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-13T16-50-26.517326.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_01_13T16_50_26.517326
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-13T16-50-26.517326.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-13T16-50-26.517326.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_01_13T16_50_26.517326
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-13T16-50-26.517326.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-13T16-50-26.517326.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_01_13T16_50_26.517326
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-13T16-50-26.517326.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-13T16-50-26.517326.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_01_13T16_50_26.517326
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-13T16-50-26.517326.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-13T16-50-26.517326.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_01_13T16_50_26.517326
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-13T16-50-26.517326.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-13T16-50-26.517326.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_01_13T16_50_26.517326
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-13T16-50-26.517326.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-13T16-50-26.517326.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_01_13T16_50_26.517326
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-13T16-50-26.517326.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-13T16-50-26.517326.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_01_13T16_50_26.517326
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-13T16-50-26.517326.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-13T16-50-26.517326.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_01_13T16_50_26.517326
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-13T16-50-26.517326.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-13T16-50-26.517326.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_01_13T16_50_26.517326
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-01-13T16-50-26.517326.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-01-13T16-50-26.517326.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_01_13T16_50_26.517326
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-13T16-50-26.517326.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-13T16-50-26.517326.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_01_13T16_50_26.517326
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-13T16-50-26.517326.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-13T16-50-26.517326.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_01_13T16_50_26.517326
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-13T16-50-26.517326.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-13T16-50-26.517326.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_01_13T16_50_26.517326
path:
- '**/details_harness|hendrycksTest-management|5_2024-01-13T16-50-26.517326.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-01-13T16-50-26.517326.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_01_13T16_50_26.517326
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-01-13T16-50-26.517326.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-01-13T16-50-26.517326.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_01_13T16_50_26.517326
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-13T16-50-26.517326.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-13T16-50-26.517326.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_01_13T16_50_26.517326
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-13T16-50-26.517326.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-13T16-50-26.517326.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_01_13T16_50_26.517326
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-13T16-50-26.517326.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-13T16-50-26.517326.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_01_13T16_50_26.517326
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-13T16-50-26.517326.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-13T16-50-26.517326.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_01_13T16_50_26.517326
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-13T16-50-26.517326.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-13T16-50-26.517326.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_01_13T16_50_26.517326
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-13T16-50-26.517326.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-13T16-50-26.517326.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_01_13T16_50_26.517326
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-13T16-50-26.517326.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-13T16-50-26.517326.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_01_13T16_50_26.517326
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-13T16-50-26.517326.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-13T16-50-26.517326.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_01_13T16_50_26.517326
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-13T16-50-26.517326.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-13T16-50-26.517326.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_01_13T16_50_26.517326
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-13T16-50-26.517326.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-13T16-50-26.517326.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_01_13T16_50_26.517326
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-13T16-50-26.517326.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-13T16-50-26.517326.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_01_13T16_50_26.517326
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-13T16-50-26.517326.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-13T16-50-26.517326.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_01_13T16_50_26.517326
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-13T16-50-26.517326.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-13T16-50-26.517326.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_01_13T16_50_26.517326
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-01-13T16-50-26.517326.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-01-13T16-50-26.517326.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_01_13T16_50_26.517326
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-13T16-50-26.517326.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-13T16-50-26.517326.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_01_13T16_50_26.517326
path:
- '**/details_harness|hendrycksTest-virology|5_2024-01-13T16-50-26.517326.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-01-13T16-50-26.517326.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_01_13T16_50_26.517326
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-13T16-50-26.517326.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-13T16-50-26.517326.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_01_13T16_50_26.517326
path:
- '**/details_harness|truthfulqa:mc|0_2024-01-13T16-50-26.517326.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-01-13T16-50-26.517326.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_01_13T16_50_26.517326
path:
- '**/details_harness|winogrande|5_2024-01-13T16-50-26.517326.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-01-13T16-50-26.517326.parquet'
- config_name: results
data_files:
- split: 2024_01_13T16_50_26.517326
path:
- results_2024-01-13T16-50-26.517326.parquet
- split: latest
path:
- results_2024-01-13T16-50-26.517326.parquet
---
# Dataset Card for Evaluation run of fblgit/UNA-dolphin-2.6-mistral-7b-dpo-laser
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [fblgit/UNA-dolphin-2.6-mistral-7b-dpo-laser](https://huggingface.co/fblgit/UNA-dolphin-2.6-mistral-7b-dpo-laser) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each corresponding to one of the evaluated tasks.

The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the most recent results.

An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_fblgit__UNA-dolphin-2.6-mistral-7b-dpo-laser",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2024-01-13T16:50:26.517326](https://huggingface.co/datasets/open-llm-leaderboard/details_fblgit__UNA-dolphin-2.6-mistral-7b-dpo-laser/blob/main/results_2024-01-13T16-50-26.517326.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6337373508284295,
"acc_stderr": 0.032383816315627395,
"acc_norm": 0.6380892308683148,
"acc_norm_stderr": 0.03302967908991101,
"mc1": 0.4773561811505508,
"mc1_stderr": 0.01748554225848965,
"mc2": 0.6414925770737219,
"mc2_stderr": 0.015103448074375492
},
"harness|arc:challenge|25": {
"acc": 0.6373720136518771,
"acc_stderr": 0.014049106564955009,
"acc_norm": 0.6715017064846417,
"acc_norm_stderr": 0.013724978465537304
},
"harness|hellaswag|10": {
"acc": 0.6719776936865166,
"acc_stderr": 0.00468533484403866,
"acc_norm": 0.8630750846444931,
"acc_norm_stderr": 0.0034306550069275825
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.38,
"acc_stderr": 0.04878317312145632,
"acc_norm": 0.38,
"acc_norm_stderr": 0.04878317312145632
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6296296296296297,
"acc_stderr": 0.041716541613545426,
"acc_norm": 0.6296296296296297,
"acc_norm_stderr": 0.041716541613545426
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6842105263157895,
"acc_stderr": 0.037827289808654685,
"acc_norm": 0.6842105263157895,
"acc_norm_stderr": 0.037827289808654685
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.56,
"acc_stderr": 0.04988876515698589,
"acc_norm": 0.56,
"acc_norm_stderr": 0.04988876515698589
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6867924528301886,
"acc_stderr": 0.028544793319055326,
"acc_norm": 0.6867924528301886,
"acc_norm_stderr": 0.028544793319055326
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7083333333333334,
"acc_stderr": 0.038009680605548594,
"acc_norm": 0.7083333333333334,
"acc_norm_stderr": 0.038009680605548594
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.42,
"acc_stderr": 0.049604496374885836,
"acc_norm": 0.42,
"acc_norm_stderr": 0.049604496374885836
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.57,
"acc_stderr": 0.04975698519562428,
"acc_norm": 0.57,
"acc_norm_stderr": 0.04975698519562428
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.32,
"acc_stderr": 0.046882617226215034,
"acc_norm": 0.32,
"acc_norm_stderr": 0.046882617226215034
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6011560693641619,
"acc_stderr": 0.0373362665538351,
"acc_norm": 0.6011560693641619,
"acc_norm_stderr": 0.0373362665538351
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.4117647058823529,
"acc_stderr": 0.04897104952726366,
"acc_norm": 0.4117647058823529,
"acc_norm_stderr": 0.04897104952726366
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.76,
"acc_stderr": 0.042923469599092816,
"acc_norm": 0.76,
"acc_norm_stderr": 0.042923469599092816
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5191489361702127,
"acc_stderr": 0.03266204299064678,
"acc_norm": 0.5191489361702127,
"acc_norm_stderr": 0.03266204299064678
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.45614035087719296,
"acc_stderr": 0.046854730419077895,
"acc_norm": 0.45614035087719296,
"acc_norm_stderr": 0.046854730419077895
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5793103448275863,
"acc_stderr": 0.0411391498118926,
"acc_norm": 0.5793103448275863,
"acc_norm_stderr": 0.0411391498118926
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.42857142857142855,
"acc_stderr": 0.025487187147859375,
"acc_norm": 0.42857142857142855,
"acc_norm_stderr": 0.025487187147859375
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.4126984126984127,
"acc_stderr": 0.04403438954768177,
"acc_norm": 0.4126984126984127,
"acc_norm_stderr": 0.04403438954768177
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.38,
"acc_stderr": 0.04878317312145633,
"acc_norm": 0.38,
"acc_norm_stderr": 0.04878317312145633
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7612903225806451,
"acc_stderr": 0.024251071262208837,
"acc_norm": 0.7612903225806451,
"acc_norm_stderr": 0.024251071262208837
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.4975369458128079,
"acc_stderr": 0.03517945038691063,
"acc_norm": 0.4975369458128079,
"acc_norm_stderr": 0.03517945038691063
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.73,
"acc_stderr": 0.044619604333847394,
"acc_norm": 0.73,
"acc_norm_stderr": 0.044619604333847394
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7636363636363637,
"acc_stderr": 0.033175059300091805,
"acc_norm": 0.7636363636363637,
"acc_norm_stderr": 0.033175059300091805
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.803030303030303,
"acc_stderr": 0.028335609732463362,
"acc_norm": 0.803030303030303,
"acc_norm_stderr": 0.028335609732463362
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8808290155440415,
"acc_stderr": 0.02338193534812143,
"acc_norm": 0.8808290155440415,
"acc_norm_stderr": 0.02338193534812143
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6076923076923076,
"acc_stderr": 0.024756000382130956,
"acc_norm": 0.6076923076923076,
"acc_norm_stderr": 0.024756000382130956
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.2962962962962963,
"acc_stderr": 0.02784081149587193,
"acc_norm": 0.2962962962962963,
"acc_norm_stderr": 0.02784081149587193
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6932773109243697,
"acc_stderr": 0.029953823891887048,
"acc_norm": 0.6932773109243697,
"acc_norm_stderr": 0.029953823891887048
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.33112582781456956,
"acc_stderr": 0.038425817186598696,
"acc_norm": 0.33112582781456956,
"acc_norm_stderr": 0.038425817186598696
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8256880733944955,
"acc_stderr": 0.016265675632010344,
"acc_norm": 0.8256880733944955,
"acc_norm_stderr": 0.016265675632010344
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5092592592592593,
"acc_stderr": 0.034093869469927006,
"acc_norm": 0.5092592592592593,
"acc_norm_stderr": 0.034093869469927006
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7892156862745098,
"acc_stderr": 0.028626547912437413,
"acc_norm": 0.7892156862745098,
"acc_norm_stderr": 0.028626547912437413
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7932489451476793,
"acc_stderr": 0.026361651668389104,
"acc_norm": 0.7932489451476793,
"acc_norm_stderr": 0.026361651668389104
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6681614349775785,
"acc_stderr": 0.03160295143776679,
"acc_norm": 0.6681614349775785,
"acc_norm_stderr": 0.03160295143776679
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7557251908396947,
"acc_stderr": 0.037683359597287414,
"acc_norm": 0.7557251908396947,
"acc_norm_stderr": 0.037683359597287414
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7520661157024794,
"acc_stderr": 0.03941897526516303,
"acc_norm": 0.7520661157024794,
"acc_norm_stderr": 0.03941897526516303
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.8240740740740741,
"acc_stderr": 0.036809181416738807,
"acc_norm": 0.8240740740740741,
"acc_norm_stderr": 0.036809181416738807
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7607361963190185,
"acc_stderr": 0.033519538795212696,
"acc_norm": 0.7607361963190185,
"acc_norm_stderr": 0.033519538795212696
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.45535714285714285,
"acc_stderr": 0.04726835553719099,
"acc_norm": 0.45535714285714285,
"acc_norm_stderr": 0.04726835553719099
},
"harness|hendrycksTest-management|5": {
"acc": 0.8058252427184466,
"acc_stderr": 0.03916667762822584,
"acc_norm": 0.8058252427184466,
"acc_norm_stderr": 0.03916667762822584
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8931623931623932,
"acc_stderr": 0.020237149008990925,
"acc_norm": 0.8931623931623932,
"acc_norm_stderr": 0.020237149008990925
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.7,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.7,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.822477650063857,
"acc_stderr": 0.013664230995834829,
"acc_norm": 0.822477650063857,
"acc_norm_stderr": 0.013664230995834829
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7138728323699421,
"acc_stderr": 0.024332146779134135,
"acc_norm": 0.7138728323699421,
"acc_norm_stderr": 0.024332146779134135
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.3675977653631285,
"acc_stderr": 0.01612554382355295,
"acc_norm": 0.3675977653631285,
"acc_norm_stderr": 0.01612554382355295
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.6993464052287581,
"acc_stderr": 0.026256053835718964,
"acc_norm": 0.6993464052287581,
"acc_norm_stderr": 0.026256053835718964
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6784565916398714,
"acc_stderr": 0.026527724079528872,
"acc_norm": 0.6784565916398714,
"acc_norm_stderr": 0.026527724079528872
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7438271604938271,
"acc_stderr": 0.024288533637726095,
"acc_norm": 0.7438271604938271,
"acc_norm_stderr": 0.024288533637726095
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.4574468085106383,
"acc_stderr": 0.029719281272236844,
"acc_norm": 0.4574468085106383,
"acc_norm_stderr": 0.029719281272236844
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.44132985658409385,
"acc_stderr": 0.01268201633564667,
"acc_norm": 0.44132985658409385,
"acc_norm_stderr": 0.01268201633564667
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6470588235294118,
"acc_stderr": 0.029029422815681397,
"acc_norm": 0.6470588235294118,
"acc_norm_stderr": 0.029029422815681397
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6535947712418301,
"acc_stderr": 0.01924978569171721,
"acc_norm": 0.6535947712418301,
"acc_norm_stderr": 0.01924978569171721
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6818181818181818,
"acc_stderr": 0.044612721759105085,
"acc_norm": 0.6818181818181818,
"acc_norm_stderr": 0.044612721759105085
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7510204081632653,
"acc_stderr": 0.027682979522960234,
"acc_norm": 0.7510204081632653,
"acc_norm_stderr": 0.027682979522960234
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8208955223880597,
"acc_stderr": 0.027113286753111837,
"acc_norm": 0.8208955223880597,
"acc_norm_stderr": 0.027113286753111837
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.85,
"acc_stderr": 0.03588702812826371,
"acc_norm": 0.85,
"acc_norm_stderr": 0.03588702812826371
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5602409638554217,
"acc_stderr": 0.03864139923699122,
"acc_norm": 0.5602409638554217,
"acc_norm_stderr": 0.03864139923699122
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8187134502923976,
"acc_stderr": 0.029547741687640038,
"acc_norm": 0.8187134502923976,
"acc_norm_stderr": 0.029547741687640038
},
"harness|truthfulqa:mc|0": {
"mc1": 0.4773561811505508,
"mc1_stderr": 0.01748554225848965,
"mc2": 0.6414925770737219,
"mc2_stderr": 0.015103448074375492
},
"harness|winogrande|5": {
"acc": 0.7924230465666929,
"acc_stderr": 0.011398593419386779
},
"harness|gsm8k|5": {
"acc": 0.44351781652767247,
"acc_stderr": 0.013684327592606163
}
}
```
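If you only need a few of the aggregated numbers, the JSON above can be post-processed with the standard library; a minimal sketch (the `"all"` block below is copied verbatim from the results above rather than re-downloaded, so treat it as illustrative, not live data):

```python
import json

# Aggregated "all" block, copied from the latest-results JSON above.
raw = """
{
  "acc": 0.6337373508284295,
  "acc_norm": 0.6380892308683148,
  "mc1": 0.4773561811505508,
  "mc2": 0.6414925770737219
}
"""

metrics = json.loads(raw)

# Print each aggregate metric rounded to four decimal places.
for name, value in sorted(metrics.items()):
    print(f"{name}: {value:.4f}")
```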
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
ycbq999/test1 | ---
dataset_info:
features:
- name: data
dtype: float64
splits:
- name: train
num_bytes: 80000
num_examples: 10000
download_size: 96279
dataset_size: 80000
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
odunola/yoruba_audio_preprocessed | ---
dataset_info:
features:
- name: audio
dtype:
audio:
sampling_rate: 16000
- name: sentence
dtype: string
- name: input_features
sequence:
sequence: float32
- name: labels
sequence: int64
splits:
- name: train
num_bytes: 14139461126.75
num_examples: 11506
download_size: 5975715401
dataset_size: 14139461126.75
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
zolak/twitter_dataset_78_1713222620 | ---
dataset_info:
features:
- name: id
dtype: string
- name: tweet_content
dtype: string
- name: user_name
dtype: string
- name: user_id
dtype: string
- name: created_at
dtype: string
- name: url
dtype: string
- name: favourite_count
dtype: int64
- name: scraped_at
dtype: string
- name: image_urls
dtype: string
splits:
- name: train
num_bytes: 212715
num_examples: 518
download_size: 111111
dataset_size: 212715
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
MuGeminorum/HEp2 | ---
license: mit
task_categories:
- image-classification
language:
- en
tags:
- biology
- medical
pretty_name: HEp-2 Cell
size_categories:
- 10K<n<100K
# dataset_info:
# splits:
# - name: train
# num_examples: 10876
# - name: validation
# num_examples: 1360
# - name: test
# num_examples: 1360
---
# Dataset card for "MuGeminorum/HEp2"
The HEp-2 (Human Epithelial type 2) dataset is a widely utilized benchmark in the field of medical image analysis, particularly for the task of antinuclear antibody (ANA) pattern classification. This dataset comprises microscopic images of HEp-2 cells stained with fluorescent dyes, showcasing diverse patterns of autoantibody binding associated with various autoimmune diseases. Researchers and practitioners leverage the HEp-2 dataset to develop and assess algorithms for automating ANA pattern recognition, thereby aiding in the diagnosis of autoimmune disorders. The intricate patterns within the dataset challenge the robustness of computational models, making it a valuable resource for advancing the understanding of autoimmune diseases and contributing to the development of cutting-edge medical image analysis techniques.
## Usage
```python
from datasets import load_dataset
data = load_dataset("MuGeminorum/HEp2")
trainset = data["train"]
validset = data["validation"]
testset = data["test"]
labels = testset.features["label"].names
for subset in (trainset, validset, testset):
    for item in subset:
        print("image: ", item["image"])
        print("label name: " + labels[item["label"]])
```
## Maintenance
```bash
GIT_LFS_SKIP_SMUDGE=1 git clone git@hf.co:datasets/MuGeminorum/HEp2
```
## Mirror
<https://www.modelscope.cn/datasets/MuGeminorum/HEp2>
## Reference
[1] [Chapter III ‐ Classifying Cell Images Using Deep Learning Models](https://github.com/MuGeminorum/Medical_Image_Computing/wiki/Chapter-III-%E2%80%90-Classifying-Cell-Images-Using-Deep-Learning-Models)<br>
[2] <a href="https://arxiv.org/pdf/1504.02531v1.pdf">HEp-2 Cell Image Classification with Deep Convolutional Neural Networks</a> |
mtkinit/testAR | ---
pretty_name: testAR
---
# testAR
Created from AIOD platform |
tyzhu/find_last_sent_train_50_eval_10_hint3 | ---
dataset_info:
features:
- name: inputs
dtype: string
- name: targets
dtype: string
- name: title
dtype: string
- name: context
dtype: string
splits:
- name: train
num_bytes: 135382
num_examples: 110
- name: validation
num_bytes: 9233
num_examples: 10
download_size: 81619
dataset_size: 144615
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
---
# Dataset Card for "find_last_sent_train_50_eval_10_hint3"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Lonaz/elna | ---
license: openrail
---
|
hallisky/synthetic-imdb-movie-reviews-parallel | ---
license: apache-2.0
---
|
abhishek/autotrain-data-2lld-7hpl-t0wr | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
dataset_info:
features:
- name: image
dtype: image
- name: label
dtype:
class_label:
names:
'0': daisy
'1': dandelion
'2': rose
'3': sunflower
'4': tulip
splits:
- name: train
num_bytes: 114899538.888
num_examples: 2196
- name: validation
num_bytes: 33595965.0
num_examples: 550
download_size: 167022637
dataset_size: 148495503.888
---
# Dataset Card for "autotrain-data-2lld-7hpl-t0wr"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
santi-restrepo-poli/personeria-de-medellin | ---
license: mit
---
|
JiggaBooJombs/Novel | ---
license: apache-2.0
---
|
RafaG/Dataset | ---
license: openrail
---
|
anan-2024/twitter_dataset_1713172193 | ---
dataset_info:
features:
- name: id
dtype: string
- name: tweet_content
dtype: string
- name: user_name
dtype: string
- name: user_id
dtype: string
- name: created_at
dtype: string
- name: url
dtype: string
- name: favourite_count
dtype: int64
- name: scraped_at
dtype: string
- name: image_urls
dtype: string
splits:
- name: train
num_bytes: 220053
num_examples: 565
download_size: 116946
dataset_size: 220053
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
mshenoda/diffugen_samples | ---
license: creativeml-openrail-m
---
|