| datasetId | card |
|---|---|
Dialogue-Model-Research-Group/baike | ---
license: cc
---
|
CyberHarem/kawashiro_mitori_touhou | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of kawashiro_mitori/河城みとり (Touhou)
This is the dataset of kawashiro_mitori/河城みとり (Touhou), containing 25 images and their tags.
The core tags of this character are `hair_ornament, hat, short_hair, red_eyes, pink_hair, side_ponytail`, which are pruned in this dataset.
Images are crawled from many sites (e.g., Danbooru, Pixiv, Zerochan), and the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([Hugging Face organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:----------|:-------------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 25 | 21.45 MiB | [Download](https://huggingface.co/datasets/CyberHarem/kawashiro_mitori_touhou/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 25 | 15.45 MiB | [Download](https://huggingface.co/datasets/CyberHarem/kawashiro_mitori_touhou/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 44 | 26.74 MiB | [Download](https://huggingface.co/datasets/CyberHarem/kawashiro_mitori_touhou/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 25 | 20.40 MiB | [Download](https://huggingface.co/datasets/CyberHarem/kawashiro_mitori_touhou/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 44 | 31.98 MiB | [Download](https://huggingface.co/datasets/CyberHarem/kawashiro_mitori_touhou/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for loading with [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html). If you need it, just run the following code:
```python
import os
import zipfile

from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource

# download the raw archive file
zip_file = hf_hub_download(
    repo_id='CyberHarem/kawashiro_mitori_touhou',
    repo_type='dataset',
    filename='dataset-raw.zip',
)

# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
    zf.extractall(dataset_dir)

# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
    print(item.image, item.meta['filename'], item.meta['tags'])
```
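The IMG+TXT packages contain plain image files with a same-named `.txt` tag file for each, so they can also be consumed without waifuc. A minimal sketch for the 800px package, assuming the standard flat one-`.txt`-per-image layout inside the archive:
```python
import os
import zipfile

from huggingface_hub import hf_hub_download

# download and extract the 800px IMG+TXT package
zip_file = hf_hub_download(
    repo_id='CyberHarem/kawashiro_mitori_touhou',
    repo_type='dataset',
    filename='dataset-800.zip',
)
dataset_dir = 'dataset_800'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
    zf.extractall(dataset_dir)

# pair each image with its same-named .txt tag file
for name in sorted(os.listdir(dataset_dir)):
    stem, ext = os.path.splitext(name)
    if ext.lower() not in ('.png', '.jpg', '.jpeg', '.webp'):
        continue
    txt_path = os.path.join(dataset_dir, stem + '.txt')
    if os.path.exists(txt_path):
        with open(txt_path, encoding='utf-8') as f:
            print(name, '->', f.read().strip())
```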
## List of Clusters
List of tag clustering results; some outfits may be mined from them (see the sketch after the tables below).
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------------------------------------------------|
| 0 | 25 |  |  |  |  |  | hair_bobbles, 1girl, solo, lock, layered_sleeves, blush, skirt, road_sign |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | hair_bobbles | 1girl | solo | lock | layered_sleeves | blush | skirt | road_sign |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:---------------|:--------|:-------|:-------|:------------------|:--------|:--------|:------------|
| 0 | 25 |  |  |  |  |  | X | X | X | X | X | X | X | X |
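The cluster tags above suggest outfit combinations that can be filtered for locally. A minimal sketch, assuming `item.meta['tags']` maps tag names to confidence scores as in the loader above, with an illustrative selection of cluster-0 tags:
```python
from waifuc.source import LocalSource

# outfit tags taken from cluster 0 above (illustrative selection)
cluster_tags = {'hair_bobbles', 'layered_sleeves', 'skirt'}

# keep only the images whose tag set covers the whole cluster
source = LocalSource('dataset_dir')
for item in source:
    if cluster_tags.issubset(item.meta['tags']):
        print(item.meta['filename'], sorted(item.meta['tags'])[:8])
```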
|
iElexperio/processedMorDataLLMv3 | ---
dataset_info:
features:
- name: id
dtype: string
- name: tokens
sequence: string
- name: bboxes
sequence:
sequence: int64
- name: ner_tags
sequence: int64
- name: image
dtype: image
splits:
- name: train
num_bytes: 8868049.0
num_examples: 70
- name: test
num_bytes: 3462408.0
num_examples: 28
download_size: 11436065
dataset_size: 12330457.0
---
# Dataset Card for "processedMorDataLLMv3"
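The YAML header above declares word-level `tokens`, per-token `bboxes`, integer `ner_tags`, and a page `image`, the layout commonly used to fine-tune LayoutLM-family models. A minimal loading sketch, assuming the repository is publicly readable:
```python
from datasets import load_dataset

# load the train/test splits declared in the YAML header
ds = load_dataset("iElexperio/processedMorDataLLMv3")

example = ds["train"][0]
print(example["id"])
print(example["tokens"][:5])    # word-level tokens
print(example["bboxes"][:5])    # one bounding box per token (coordinate order assumed)
print(example["ner_tags"][:5])  # integer NER labels
page = example["image"]         # decoded to a PIL image by the `image` feature
```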
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
open-llm-leaderboard/details_beberik__TinyExperts-v0-4x1B | ---
pretty_name: Evaluation run of beberik/TinyExperts-v0-4x1B
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [beberik/TinyExperts-v0-4x1B](https://huggingface.co/beberik/TinyExperts-v0-4x1B)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_beberik__TinyExperts-v0-4x1B\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-12-20T21:54:19.124713](https://huggingface.co/datasets/open-llm-leaderboard/details_beberik__TinyExperts-v0-4x1B/blob/main/results_2023-12-20T21-54-19.124713.json) (note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.26252295554837873,\n\
\ \"acc_stderr\": 0.031072019491044735,\n \"acc_norm\": 0.2641174998312261,\n\
\ \"acc_norm_stderr\": 0.03186997959528178,\n \"mc1\": 0.24969400244798043,\n\
\ \"mc1_stderr\": 0.015152286907148128,\n \"mc2\": 0.41126558330324914,\n\
\ \"mc2_stderr\": 0.014912649441030584\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.27047781569965873,\n \"acc_stderr\": 0.012980954547659554,\n\
\ \"acc_norm\": 0.31399317406143346,\n \"acc_norm_stderr\": 0.013562691224726295\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.3906592312288389,\n\
\ \"acc_stderr\": 0.0048690101522807505,\n \"acc_norm\": 0.522903804023103,\n\
\ \"acc_norm_stderr\": 0.0049845435409323355\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \
\ \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.3333333333333333,\n\
\ \"acc_stderr\": 0.04072314811876837,\n \"acc_norm\": 0.3333333333333333,\n\
\ \"acc_norm_stderr\": 0.04072314811876837\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.3223684210526316,\n \"acc_stderr\": 0.03803510248351585,\n\
\ \"acc_norm\": 0.3223684210526316,\n \"acc_norm_stderr\": 0.03803510248351585\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.22,\n\
\ \"acc_stderr\": 0.04163331998932269,\n \"acc_norm\": 0.22,\n \
\ \"acc_norm_stderr\": 0.04163331998932269\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.20754716981132076,\n \"acc_stderr\": 0.024959918028911274,\n\
\ \"acc_norm\": 0.20754716981132076,\n \"acc_norm_stderr\": 0.024959918028911274\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.2569444444444444,\n\
\ \"acc_stderr\": 0.03653946969442099,\n \"acc_norm\": 0.2569444444444444,\n\
\ \"acc_norm_stderr\": 0.03653946969442099\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.18,\n \"acc_stderr\": 0.03861229196653694,\n \
\ \"acc_norm\": 0.18,\n \"acc_norm_stderr\": 0.03861229196653694\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.37,\n \"acc_stderr\": 0.04852365870939099,\n \"acc_norm\": 0.37,\n\
\ \"acc_norm_stderr\": 0.04852365870939099\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.25,\n \"acc_stderr\": 0.04351941398892446,\n \
\ \"acc_norm\": 0.25,\n \"acc_norm_stderr\": 0.04351941398892446\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.2543352601156069,\n\
\ \"acc_stderr\": 0.0332055644308557,\n \"acc_norm\": 0.2543352601156069,\n\
\ \"acc_norm_stderr\": 0.0332055644308557\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.30392156862745096,\n \"acc_stderr\": 0.045766654032077636,\n\
\ \"acc_norm\": 0.30392156862745096,\n \"acc_norm_stderr\": 0.045766654032077636\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.29,\n \"acc_stderr\": 0.045604802157206845,\n \"acc_norm\": 0.29,\n\
\ \"acc_norm_stderr\": 0.045604802157206845\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.20425531914893616,\n \"acc_stderr\": 0.026355158413349424,\n\
\ \"acc_norm\": 0.20425531914893616,\n \"acc_norm_stderr\": 0.026355158413349424\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.24561403508771928,\n\
\ \"acc_stderr\": 0.04049339297748141,\n \"acc_norm\": 0.24561403508771928,\n\
\ \"acc_norm_stderr\": 0.04049339297748141\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.296551724137931,\n \"acc_stderr\": 0.03806142687309993,\n\
\ \"acc_norm\": 0.296551724137931,\n \"acc_norm_stderr\": 0.03806142687309993\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.2671957671957672,\n \"acc_stderr\": 0.022789673145776568,\n \"\
acc_norm\": 0.2671957671957672,\n \"acc_norm_stderr\": 0.022789673145776568\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.15873015873015872,\n\
\ \"acc_stderr\": 0.03268454013011744,\n \"acc_norm\": 0.15873015873015872,\n\
\ \"acc_norm_stderr\": 0.03268454013011744\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.36,\n \"acc_stderr\": 0.048241815132442176,\n \
\ \"acc_norm\": 0.36,\n \"acc_norm_stderr\": 0.048241815132442176\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\"\
: 0.25483870967741934,\n \"acc_stderr\": 0.02479011845933221,\n \"\
acc_norm\": 0.25483870967741934,\n \"acc_norm_stderr\": 0.02479011845933221\n\
\ },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\"\
: 0.2955665024630542,\n \"acc_stderr\": 0.032104944337514575,\n \"\
acc_norm\": 0.2955665024630542,\n \"acc_norm_stderr\": 0.032104944337514575\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\"\
: 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.2545454545454545,\n \"acc_stderr\": 0.03401506715249039,\n\
\ \"acc_norm\": 0.2545454545454545,\n \"acc_norm_stderr\": 0.03401506715249039\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.23737373737373738,\n \"acc_stderr\": 0.030313710538198892,\n \"\
acc_norm\": 0.23737373737373738,\n \"acc_norm_stderr\": 0.030313710538198892\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.22279792746113988,\n \"acc_stderr\": 0.03003114797764154,\n\
\ \"acc_norm\": 0.22279792746113988,\n \"acc_norm_stderr\": 0.03003114797764154\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.2282051282051282,\n \"acc_stderr\": 0.021278393863586282,\n\
\ \"acc_norm\": 0.2282051282051282,\n \"acc_norm_stderr\": 0.021278393863586282\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.26666666666666666,\n \"acc_stderr\": 0.02696242432507383,\n \
\ \"acc_norm\": 0.26666666666666666,\n \"acc_norm_stderr\": 0.02696242432507383\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.21008403361344538,\n \"acc_stderr\": 0.026461398717471874,\n\
\ \"acc_norm\": 0.21008403361344538,\n \"acc_norm_stderr\": 0.026461398717471874\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.2913907284768212,\n \"acc_stderr\": 0.037101857261199946,\n \"\
acc_norm\": 0.2913907284768212,\n \"acc_norm_stderr\": 0.037101857261199946\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.22385321100917432,\n \"acc_stderr\": 0.017871217767790222,\n \"\
acc_norm\": 0.22385321100917432,\n \"acc_norm_stderr\": 0.017871217767790222\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.25,\n \"acc_stderr\": 0.029531221160930918,\n \"acc_norm\": 0.25,\n\
\ \"acc_norm_stderr\": 0.029531221160930918\n },\n \"harness|hendrycksTest-high_school_us_history|5\"\
: {\n \"acc\": 0.25,\n \"acc_stderr\": 0.03039153369274154,\n \
\ \"acc_norm\": 0.25,\n \"acc_norm_stderr\": 0.03039153369274154\n \
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\"\
: 0.22784810126582278,\n \"acc_stderr\": 0.02730348459906941,\n \"\
acc_norm\": 0.22784810126582278,\n \"acc_norm_stderr\": 0.02730348459906941\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.29596412556053814,\n\
\ \"acc_stderr\": 0.030636591348699792,\n \"acc_norm\": 0.29596412556053814,\n\
\ \"acc_norm_stderr\": 0.030636591348699792\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.2595419847328244,\n \"acc_stderr\": 0.03844876139785271,\n\
\ \"acc_norm\": 0.2595419847328244,\n \"acc_norm_stderr\": 0.03844876139785271\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.371900826446281,\n \"acc_stderr\": 0.044120158066245044,\n \"\
acc_norm\": 0.371900826446281,\n \"acc_norm_stderr\": 0.044120158066245044\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.21296296296296297,\n\
\ \"acc_stderr\": 0.0395783547198098,\n \"acc_norm\": 0.21296296296296297,\n\
\ \"acc_norm_stderr\": 0.0395783547198098\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.3067484662576687,\n \"acc_stderr\": 0.036230899157241474,\n\
\ \"acc_norm\": 0.3067484662576687,\n \"acc_norm_stderr\": 0.036230899157241474\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.24107142857142858,\n\
\ \"acc_stderr\": 0.04059867246952687,\n \"acc_norm\": 0.24107142857142858,\n\
\ \"acc_norm_stderr\": 0.04059867246952687\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.20388349514563106,\n \"acc_stderr\": 0.03989139859531771,\n\
\ \"acc_norm\": 0.20388349514563106,\n \"acc_norm_stderr\": 0.03989139859531771\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.24358974358974358,\n\
\ \"acc_stderr\": 0.028120966503914397,\n \"acc_norm\": 0.24358974358974358,\n\
\ \"acc_norm_stderr\": 0.028120966503914397\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.21,\n \"acc_stderr\": 0.040936018074033256,\n \
\ \"acc_norm\": 0.21,\n \"acc_norm_stderr\": 0.040936018074033256\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.26053639846743293,\n\
\ \"acc_stderr\": 0.0156960085638071,\n \"acc_norm\": 0.26053639846743293,\n\
\ \"acc_norm_stderr\": 0.0156960085638071\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.28901734104046245,\n \"acc_stderr\": 0.02440517393578323,\n\
\ \"acc_norm\": 0.28901734104046245,\n \"acc_norm_stderr\": 0.02440517393578323\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.24692737430167597,\n\
\ \"acc_stderr\": 0.014422292204808835,\n \"acc_norm\": 0.24692737430167597,\n\
\ \"acc_norm_stderr\": 0.014422292204808835\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.24836601307189543,\n \"acc_stderr\": 0.02473998135511359,\n\
\ \"acc_norm\": 0.24836601307189543,\n \"acc_norm_stderr\": 0.02473998135511359\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.2990353697749196,\n\
\ \"acc_stderr\": 0.026003301117885135,\n \"acc_norm\": 0.2990353697749196,\n\
\ \"acc_norm_stderr\": 0.026003301117885135\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.26851851851851855,\n \"acc_stderr\": 0.02465968518596728,\n\
\ \"acc_norm\": 0.26851851851851855,\n \"acc_norm_stderr\": 0.02465968518596728\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.2695035460992908,\n \"acc_stderr\": 0.02646903681859063,\n \
\ \"acc_norm\": 0.2695035460992908,\n \"acc_norm_stderr\": 0.02646903681859063\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.258148631029987,\n\
\ \"acc_stderr\": 0.01117692371931339,\n \"acc_norm\": 0.258148631029987,\n\
\ \"acc_norm_stderr\": 0.01117692371931339\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.23161764705882354,\n \"acc_stderr\": 0.025626533803777562,\n\
\ \"acc_norm\": 0.23161764705882354,\n \"acc_norm_stderr\": 0.025626533803777562\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.2581699346405229,\n \"acc_stderr\": 0.017704531653250078,\n \
\ \"acc_norm\": 0.2581699346405229,\n \"acc_norm_stderr\": 0.017704531653250078\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.20909090909090908,\n\
\ \"acc_stderr\": 0.038950910157241364,\n \"acc_norm\": 0.20909090909090908,\n\
\ \"acc_norm_stderr\": 0.038950910157241364\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.24489795918367346,\n \"acc_stderr\": 0.027529637440174923,\n\
\ \"acc_norm\": 0.24489795918367346,\n \"acc_norm_stderr\": 0.027529637440174923\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.23880597014925373,\n\
\ \"acc_stderr\": 0.030147775935409217,\n \"acc_norm\": 0.23880597014925373,\n\
\ \"acc_norm_stderr\": 0.030147775935409217\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.23,\n \"acc_stderr\": 0.04229525846816505,\n \
\ \"acc_norm\": 0.23,\n \"acc_norm_stderr\": 0.04229525846816505\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.2289156626506024,\n\
\ \"acc_stderr\": 0.03270745277352477,\n \"acc_norm\": 0.2289156626506024,\n\
\ \"acc_norm_stderr\": 0.03270745277352477\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.27485380116959063,\n \"acc_stderr\": 0.034240429246915824,\n\
\ \"acc_norm\": 0.27485380116959063,\n \"acc_norm_stderr\": 0.034240429246915824\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.24969400244798043,\n\
\ \"mc1_stderr\": 0.015152286907148128,\n \"mc2\": 0.41126558330324914,\n\
\ \"mc2_stderr\": 0.014912649441030584\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.601420678768745,\n \"acc_stderr\": 0.01376035717687383\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.00530705079605762,\n \
\ \"acc_stderr\": 0.002001305720948071\n }\n}\n```"
repo_url: https://huggingface.co/beberik/TinyExperts-v0-4x1B
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_12_20T21_54_19.124713
path:
- '**/details_harness|arc:challenge|25_2023-12-20T21-54-19.124713.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-12-20T21-54-19.124713.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_12_20T21_54_19.124713
path:
- '**/details_harness|gsm8k|5_2023-12-20T21-54-19.124713.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-12-20T21-54-19.124713.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_12_20T21_54_19.124713
path:
- '**/details_harness|hellaswag|10_2023-12-20T21-54-19.124713.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-12-20T21-54-19.124713.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_12_20T21_54_19.124713
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-20T21-54-19.124713.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-20T21-54-19.124713.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-20T21-54-19.124713.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-20T21-54-19.124713.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-20T21-54-19.124713.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-20T21-54-19.124713.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-20T21-54-19.124713.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-20T21-54-19.124713.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-20T21-54-19.124713.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-20T21-54-19.124713.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-20T21-54-19.124713.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-20T21-54-19.124713.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-20T21-54-19.124713.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-20T21-54-19.124713.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-20T21-54-19.124713.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-20T21-54-19.124713.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-20T21-54-19.124713.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-20T21-54-19.124713.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-20T21-54-19.124713.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-20T21-54-19.124713.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-20T21-54-19.124713.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-20T21-54-19.124713.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-20T21-54-19.124713.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-20T21-54-19.124713.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-20T21-54-19.124713.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-20T21-54-19.124713.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-20T21-54-19.124713.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-20T21-54-19.124713.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-20T21-54-19.124713.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-20T21-54-19.124713.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-20T21-54-19.124713.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-20T21-54-19.124713.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-20T21-54-19.124713.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-20T21-54-19.124713.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-12-20T21-54-19.124713.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-20T21-54-19.124713.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-20T21-54-19.124713.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-20T21-54-19.124713.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-12-20T21-54-19.124713.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-12-20T21-54-19.124713.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-20T21-54-19.124713.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-20T21-54-19.124713.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-20T21-54-19.124713.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-20T21-54-19.124713.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-20T21-54-19.124713.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-20T21-54-19.124713.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-20T21-54-19.124713.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-20T21-54-19.124713.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-20T21-54-19.124713.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-20T21-54-19.124713.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-20T21-54-19.124713.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-20T21-54-19.124713.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-20T21-54-19.124713.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-12-20T21-54-19.124713.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-20T21-54-19.124713.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-12-20T21-54-19.124713.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-20T21-54-19.124713.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-20T21-54-19.124713.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-20T21-54-19.124713.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-20T21-54-19.124713.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-20T21-54-19.124713.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-20T21-54-19.124713.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-20T21-54-19.124713.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-20T21-54-19.124713.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-20T21-54-19.124713.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-20T21-54-19.124713.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-20T21-54-19.124713.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-20T21-54-19.124713.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-20T21-54-19.124713.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-20T21-54-19.124713.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-20T21-54-19.124713.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-20T21-54-19.124713.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-20T21-54-19.124713.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-20T21-54-19.124713.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-20T21-54-19.124713.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-20T21-54-19.124713.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-20T21-54-19.124713.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-20T21-54-19.124713.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-20T21-54-19.124713.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-20T21-54-19.124713.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-20T21-54-19.124713.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-20T21-54-19.124713.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-20T21-54-19.124713.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-20T21-54-19.124713.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-20T21-54-19.124713.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-20T21-54-19.124713.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-20T21-54-19.124713.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-20T21-54-19.124713.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-20T21-54-19.124713.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-20T21-54-19.124713.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-20T21-54-19.124713.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-12-20T21-54-19.124713.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-20T21-54-19.124713.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-20T21-54-19.124713.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-20T21-54-19.124713.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-12-20T21-54-19.124713.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-12-20T21-54-19.124713.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-20T21-54-19.124713.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-20T21-54-19.124713.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-20T21-54-19.124713.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-20T21-54-19.124713.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-20T21-54-19.124713.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-20T21-54-19.124713.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-20T21-54-19.124713.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-20T21-54-19.124713.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-20T21-54-19.124713.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-20T21-54-19.124713.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-20T21-54-19.124713.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-20T21-54-19.124713.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-20T21-54-19.124713.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-12-20T21-54-19.124713.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-20T21-54-19.124713.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-12-20T21-54-19.124713.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-20T21-54-19.124713.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_12_20T21_54_19.124713
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-20T21-54-19.124713.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-20T21-54-19.124713.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_12_20T21_54_19.124713
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-20T21-54-19.124713.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-20T21-54-19.124713.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_12_20T21_54_19.124713
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-20T21-54-19.124713.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-20T21-54-19.124713.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_12_20T21_54_19.124713
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-20T21-54-19.124713.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-20T21-54-19.124713.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_12_20T21_54_19.124713
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-20T21-54-19.124713.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-20T21-54-19.124713.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_12_20T21_54_19.124713
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-20T21-54-19.124713.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-20T21-54-19.124713.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_12_20T21_54_19.124713
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-20T21-54-19.124713.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-20T21-54-19.124713.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_12_20T21_54_19.124713
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-20T21-54-19.124713.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-20T21-54-19.124713.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_12_20T21_54_19.124713
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-20T21-54-19.124713.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-20T21-54-19.124713.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_12_20T21_54_19.124713
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-20T21-54-19.124713.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-20T21-54-19.124713.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_12_20T21_54_19.124713
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-20T21-54-19.124713.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-20T21-54-19.124713.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_12_20T21_54_19.124713
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-20T21-54-19.124713.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-20T21-54-19.124713.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_12_20T21_54_19.124713
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-20T21-54-19.124713.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-20T21-54-19.124713.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_12_20T21_54_19.124713
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-20T21-54-19.124713.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-20T21-54-19.124713.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_12_20T21_54_19.124713
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-20T21-54-19.124713.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-20T21-54-19.124713.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_12_20T21_54_19.124713
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-20T21-54-19.124713.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-20T21-54-19.124713.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_12_20T21_54_19.124713
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-20T21-54-19.124713.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-20T21-54-19.124713.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_12_20T21_54_19.124713
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-20T21-54-19.124713.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-20T21-54-19.124713.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_12_20T21_54_19.124713
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-20T21-54-19.124713.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-20T21-54-19.124713.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_12_20T21_54_19.124713
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-20T21-54-19.124713.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-20T21-54-19.124713.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_12_20T21_54_19.124713
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-20T21-54-19.124713.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-20T21-54-19.124713.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_12_20T21_54_19.124713
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-20T21-54-19.124713.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-20T21-54-19.124713.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_12_20T21_54_19.124713
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-20T21-54-19.124713.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-20T21-54-19.124713.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_12_20T21_54_19.124713
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-20T21-54-19.124713.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-20T21-54-19.124713.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_12_20T21_54_19.124713
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-20T21-54-19.124713.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-20T21-54-19.124713.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_12_20T21_54_19.124713
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-20T21-54-19.124713.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-20T21-54-19.124713.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_12_20T21_54_19.124713
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-20T21-54-19.124713.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-20T21-54-19.124713.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_12_20T21_54_19.124713
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-20T21-54-19.124713.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-20T21-54-19.124713.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_12_20T21_54_19.124713
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-20T21-54-19.124713.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-20T21-54-19.124713.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_12_20T21_54_19.124713
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-20T21-54-19.124713.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-20T21-54-19.124713.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_12_20T21_54_19.124713
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-20T21-54-19.124713.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-20T21-54-19.124713.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_12_20T21_54_19.124713
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-20T21-54-19.124713.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-20T21-54-19.124713.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_12_20T21_54_19.124713
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-20T21-54-19.124713.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-20T21-54-19.124713.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_12_20T21_54_19.124713
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-20T21-54-19.124713.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-20T21-54-19.124713.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_12_20T21_54_19.124713
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-12-20T21-54-19.124713.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-12-20T21-54-19.124713.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_12_20T21_54_19.124713
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-20T21-54-19.124713.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-20T21-54-19.124713.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_12_20T21_54_19.124713
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-20T21-54-19.124713.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-20T21-54-19.124713.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_12_20T21_54_19.124713
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-20T21-54-19.124713.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-20T21-54-19.124713.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_12_20T21_54_19.124713
path:
- '**/details_harness|hendrycksTest-management|5_2023-12-20T21-54-19.124713.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-12-20T21-54-19.124713.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_12_20T21_54_19.124713
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-12-20T21-54-19.124713.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-12-20T21-54-19.124713.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_12_20T21_54_19.124713
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-20T21-54-19.124713.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-20T21-54-19.124713.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_12_20T21_54_19.124713
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-20T21-54-19.124713.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-20T21-54-19.124713.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_12_20T21_54_19.124713
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-20T21-54-19.124713.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-20T21-54-19.124713.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_12_20T21_54_19.124713
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-20T21-54-19.124713.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-20T21-54-19.124713.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_12_20T21_54_19.124713
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-20T21-54-19.124713.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-20T21-54-19.124713.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_12_20T21_54_19.124713
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-20T21-54-19.124713.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-20T21-54-19.124713.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_12_20T21_54_19.124713
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-20T21-54-19.124713.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-20T21-54-19.124713.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_12_20T21_54_19.124713
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-20T21-54-19.124713.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-20T21-54-19.124713.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_12_20T21_54_19.124713
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-20T21-54-19.124713.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-20T21-54-19.124713.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_12_20T21_54_19.124713
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-20T21-54-19.124713.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-20T21-54-19.124713.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_12_20T21_54_19.124713
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-20T21-54-19.124713.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-20T21-54-19.124713.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_12_20T21_54_19.124713
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-20T21-54-19.124713.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-20T21-54-19.124713.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_12_20T21_54_19.124713
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-20T21-54-19.124713.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-20T21-54-19.124713.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_12_20T21_54_19.124713
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-12-20T21-54-19.124713.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-12-20T21-54-19.124713.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_12_20T21_54_19.124713
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-20T21-54-19.124713.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-20T21-54-19.124713.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_12_20T21_54_19.124713
path:
- '**/details_harness|hendrycksTest-virology|5_2023-12-20T21-54-19.124713.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-12-20T21-54-19.124713.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_12_20T21_54_19.124713
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-20T21-54-19.124713.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-20T21-54-19.124713.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_12_20T21_54_19.124713
path:
- '**/details_harness|truthfulqa:mc|0_2023-12-20T21-54-19.124713.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-12-20T21-54-19.124713.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_12_20T21_54_19.124713
path:
- '**/details_harness|winogrande|5_2023-12-20T21-54-19.124713.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-12-20T21-54-19.124713.parquet'
- config_name: results
data_files:
- split: 2023_12_20T21_54_19.124713
path:
- results_2023-12-20T21-54-19.124713.parquet
- split: latest
path:
- results_2023-12-20T21-54-19.124713.parquet
---
# Dataset Card for Evaluation run of beberik/TinyExperts-v0-4x1B
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [beberik/TinyExperts-v0-4x1B](https://huggingface.co/beberik/TinyExperts-v0-4x1B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can, for instance, do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_beberik__TinyExperts-v0-4x1B",
"harness_winogrande_5",
split="train")
```
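Per the `configs` list in the YAML header, the aggregated metrics can also be loaded from the `results` config in the same way; a minimal sketch using the `latest` split alias:
```python
from datasets import load_dataset

# the "results" config aggregates the run's metrics;
# "latest" aliases the most recent results parquet file
results = load_dataset(
    "open-llm-leaderboard/details_beberik__TinyExperts-v0-4x1B",
    "results",
    split="latest",
)
print(results[0])
```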
## Latest results
These are the [latest results from run 2023-12-20T21:54:19.124713](https://huggingface.co/datasets/open-llm-leaderboard/details_beberik__TinyExperts-v0-4x1B/blob/main/results_2023-12-20T21-54-19.124713.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.26252295554837873,
"acc_stderr": 0.031072019491044735,
"acc_norm": 0.2641174998312261,
"acc_norm_stderr": 0.03186997959528178,
"mc1": 0.24969400244798043,
"mc1_stderr": 0.015152286907148128,
"mc2": 0.41126558330324914,
"mc2_stderr": 0.014912649441030584
},
"harness|arc:challenge|25": {
"acc": 0.27047781569965873,
"acc_stderr": 0.012980954547659554,
"acc_norm": 0.31399317406143346,
"acc_norm_stderr": 0.013562691224726295
},
"harness|hellaswag|10": {
"acc": 0.3906592312288389,
"acc_stderr": 0.0048690101522807505,
"acc_norm": 0.522903804023103,
"acc_norm_stderr": 0.0049845435409323355
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.3333333333333333,
"acc_stderr": 0.04072314811876837,
"acc_norm": 0.3333333333333333,
"acc_norm_stderr": 0.04072314811876837
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.3223684210526316,
"acc_stderr": 0.03803510248351585,
"acc_norm": 0.3223684210526316,
"acc_norm_stderr": 0.03803510248351585
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.22,
"acc_stderr": 0.04163331998932269,
"acc_norm": 0.22,
"acc_norm_stderr": 0.04163331998932269
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.20754716981132076,
"acc_stderr": 0.024959918028911274,
"acc_norm": 0.20754716981132076,
"acc_norm_stderr": 0.024959918028911274
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.2569444444444444,
"acc_stderr": 0.03653946969442099,
"acc_norm": 0.2569444444444444,
"acc_norm_stderr": 0.03653946969442099
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.18,
"acc_stderr": 0.03861229196653694,
"acc_norm": 0.18,
"acc_norm_stderr": 0.03861229196653694
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.37,
"acc_stderr": 0.04852365870939099,
"acc_norm": 0.37,
"acc_norm_stderr": 0.04852365870939099
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.25,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.25,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.2543352601156069,
"acc_stderr": 0.0332055644308557,
"acc_norm": 0.2543352601156069,
"acc_norm_stderr": 0.0332055644308557
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.30392156862745096,
"acc_stderr": 0.045766654032077636,
"acc_norm": 0.30392156862745096,
"acc_norm_stderr": 0.045766654032077636
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.29,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.29,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.20425531914893616,
"acc_stderr": 0.026355158413349424,
"acc_norm": 0.20425531914893616,
"acc_norm_stderr": 0.026355158413349424
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.24561403508771928,
"acc_stderr": 0.04049339297748141,
"acc_norm": 0.24561403508771928,
"acc_norm_stderr": 0.04049339297748141
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.296551724137931,
"acc_stderr": 0.03806142687309993,
"acc_norm": 0.296551724137931,
"acc_norm_stderr": 0.03806142687309993
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.2671957671957672,
"acc_stderr": 0.022789673145776568,
"acc_norm": 0.2671957671957672,
"acc_norm_stderr": 0.022789673145776568
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.15873015873015872,
"acc_stderr": 0.03268454013011744,
"acc_norm": 0.15873015873015872,
"acc_norm_stderr": 0.03268454013011744
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.36,
"acc_stderr": 0.048241815132442176,
"acc_norm": 0.36,
"acc_norm_stderr": 0.048241815132442176
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.25483870967741934,
"acc_stderr": 0.02479011845933221,
"acc_norm": 0.25483870967741934,
"acc_norm_stderr": 0.02479011845933221
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.2955665024630542,
"acc_stderr": 0.032104944337514575,
"acc_norm": 0.2955665024630542,
"acc_norm_stderr": 0.032104944337514575
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.2545454545454545,
"acc_stderr": 0.03401506715249039,
"acc_norm": 0.2545454545454545,
"acc_norm_stderr": 0.03401506715249039
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.23737373737373738,
"acc_stderr": 0.030313710538198892,
"acc_norm": 0.23737373737373738,
"acc_norm_stderr": 0.030313710538198892
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.22279792746113988,
"acc_stderr": 0.03003114797764154,
"acc_norm": 0.22279792746113988,
"acc_norm_stderr": 0.03003114797764154
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.2282051282051282,
"acc_stderr": 0.021278393863586282,
"acc_norm": 0.2282051282051282,
"acc_norm_stderr": 0.021278393863586282
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.26666666666666666,
"acc_stderr": 0.02696242432507383,
"acc_norm": 0.26666666666666666,
"acc_norm_stderr": 0.02696242432507383
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.21008403361344538,
"acc_stderr": 0.026461398717471874,
"acc_norm": 0.21008403361344538,
"acc_norm_stderr": 0.026461398717471874
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.2913907284768212,
"acc_stderr": 0.037101857261199946,
"acc_norm": 0.2913907284768212,
"acc_norm_stderr": 0.037101857261199946
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.22385321100917432,
"acc_stderr": 0.017871217767790222,
"acc_norm": 0.22385321100917432,
"acc_norm_stderr": 0.017871217767790222
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.25,
"acc_stderr": 0.029531221160930918,
"acc_norm": 0.25,
"acc_norm_stderr": 0.029531221160930918
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.25,
"acc_stderr": 0.03039153369274154,
"acc_norm": 0.25,
"acc_norm_stderr": 0.03039153369274154
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.22784810126582278,
"acc_stderr": 0.02730348459906941,
"acc_norm": 0.22784810126582278,
"acc_norm_stderr": 0.02730348459906941
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.29596412556053814,
"acc_stderr": 0.030636591348699792,
"acc_norm": 0.29596412556053814,
"acc_norm_stderr": 0.030636591348699792
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.2595419847328244,
"acc_stderr": 0.03844876139785271,
"acc_norm": 0.2595419847328244,
"acc_norm_stderr": 0.03844876139785271
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.371900826446281,
"acc_stderr": 0.044120158066245044,
"acc_norm": 0.371900826446281,
"acc_norm_stderr": 0.044120158066245044
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.21296296296296297,
"acc_stderr": 0.0395783547198098,
"acc_norm": 0.21296296296296297,
"acc_norm_stderr": 0.0395783547198098
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.3067484662576687,
"acc_stderr": 0.036230899157241474,
"acc_norm": 0.3067484662576687,
"acc_norm_stderr": 0.036230899157241474
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.24107142857142858,
"acc_stderr": 0.04059867246952687,
"acc_norm": 0.24107142857142858,
"acc_norm_stderr": 0.04059867246952687
},
"harness|hendrycksTest-management|5": {
"acc": 0.20388349514563106,
"acc_stderr": 0.03989139859531771,
"acc_norm": 0.20388349514563106,
"acc_norm_stderr": 0.03989139859531771
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.24358974358974358,
"acc_stderr": 0.028120966503914397,
"acc_norm": 0.24358974358974358,
"acc_norm_stderr": 0.028120966503914397
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.21,
"acc_stderr": 0.040936018074033256,
"acc_norm": 0.21,
"acc_norm_stderr": 0.040936018074033256
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.26053639846743293,
"acc_stderr": 0.0156960085638071,
"acc_norm": 0.26053639846743293,
"acc_norm_stderr": 0.0156960085638071
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.28901734104046245,
"acc_stderr": 0.02440517393578323,
"acc_norm": 0.28901734104046245,
"acc_norm_stderr": 0.02440517393578323
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.24692737430167597,
"acc_stderr": 0.014422292204808835,
"acc_norm": 0.24692737430167597,
"acc_norm_stderr": 0.014422292204808835
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.24836601307189543,
"acc_stderr": 0.02473998135511359,
"acc_norm": 0.24836601307189543,
"acc_norm_stderr": 0.02473998135511359
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.2990353697749196,
"acc_stderr": 0.026003301117885135,
"acc_norm": 0.2990353697749196,
"acc_norm_stderr": 0.026003301117885135
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.26851851851851855,
"acc_stderr": 0.02465968518596728,
"acc_norm": 0.26851851851851855,
"acc_norm_stderr": 0.02465968518596728
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.2695035460992908,
"acc_stderr": 0.02646903681859063,
"acc_norm": 0.2695035460992908,
"acc_norm_stderr": 0.02646903681859063
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.258148631029987,
"acc_stderr": 0.01117692371931339,
"acc_norm": 0.258148631029987,
"acc_norm_stderr": 0.01117692371931339
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.23161764705882354,
"acc_stderr": 0.025626533803777562,
"acc_norm": 0.23161764705882354,
"acc_norm_stderr": 0.025626533803777562
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.2581699346405229,
"acc_stderr": 0.017704531653250078,
"acc_norm": 0.2581699346405229,
"acc_norm_stderr": 0.017704531653250078
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.20909090909090908,
"acc_stderr": 0.038950910157241364,
"acc_norm": 0.20909090909090908,
"acc_norm_stderr": 0.038950910157241364
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.24489795918367346,
"acc_stderr": 0.027529637440174923,
"acc_norm": 0.24489795918367346,
"acc_norm_stderr": 0.027529637440174923
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.23880597014925373,
"acc_stderr": 0.030147775935409217,
"acc_norm": 0.23880597014925373,
"acc_norm_stderr": 0.030147775935409217
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.23,
"acc_stderr": 0.04229525846816505,
"acc_norm": 0.23,
"acc_norm_stderr": 0.04229525846816505
},
"harness|hendrycksTest-virology|5": {
"acc": 0.2289156626506024,
"acc_stderr": 0.03270745277352477,
"acc_norm": 0.2289156626506024,
"acc_norm_stderr": 0.03270745277352477
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.27485380116959063,
"acc_stderr": 0.034240429246915824,
"acc_norm": 0.27485380116959063,
"acc_norm_stderr": 0.034240429246915824
},
"harness|truthfulqa:mc|0": {
"mc1": 0.24969400244798043,
"mc1_stderr": 0.015152286907148128,
"mc2": 0.41126558330324914,
"mc2_stderr": 0.014912649441030584
},
"harness|winogrande|5": {
"acc": 0.601420678768745,
"acc_stderr": 0.01376035717687383
},
"harness|gsm8k|5": {
"acc": 0.00530705079605762,
"acc_stderr": 0.002001305720948071
}
}
```
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases, and limitations of the dataset. More information is needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
CVdatasets/ImageNet15_animals_unbalanced_aug1 | ---
dataset_info:
features:
- name: labels
dtype:
class_label:
names:
'0': Italian_greyhound
'1': Coyote
'2': Beagle
'3': Rottweiler
'4': Hyena
'5': Greater_Swiss_Mountain_dog
'6': Triceratops
'7': French_bulldog
'8': Red_wolf
'9': Egyptian_cat
'10': Chihuahua
'11': Irish_terrier
'12': Tiger_cat
'13': White_wolf
'14': Timber_wolf
- name: img
dtype: image
- name: is_generated
dtype: bool
splits:
- name: validation
num_bytes: 60570648.125
num_examples: 1439
- name: train
num_bytes: 174270537.875
num_examples: 3705
download_size: 234762621
dataset_size: 234841186.0
---
# Dataset Card for "ImageNet15_animals_unbalanced_aug1"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
umd-zhou-lab/Reflect_Wiz70_All | ---
dataset_info:
features:
- name: data
struct:
- name: instruction
dtype: string
- name: output
dtype: string
splits:
- name: origin
num_bytes: 130900545
num_examples: 70000
- name: reflect_instruction
num_bytes: 132137005
num_examples: 70000
- name: reflect_response
num_bytes: 170505414
num_examples: 70000
- name: reflect_both
num_bytes: 176166017
num_examples: 70000
download_size: 318571646
dataset_size: 609708981
---
# Dataset Card for "Reflect_Wiz70_All"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
KoddaDuck/41_4 | ---
license: mit
---
|
bigbio/bioinfer |
---
language:
- en
bigbio_language:
- English
license: cc-by-2.0
multilinguality: monolingual
bigbio_license_shortname: CC_BY_2p0
pretty_name: BioInfer
homepage: https://github.com/metalrt/ppi-dataset
bigbio_pubmed: True
bigbio_public: True
bigbio_tasks:
- RELATION_EXTRACTION
- NAMED_ENTITY_RECOGNITION
---
# Dataset Card for BioInfer
## Dataset Description
- **Homepage:** https://github.com/metalrt/ppi-dataset
- **Pubmed:** True
- **Public:** True
- **Tasks:** RE,NER
A corpus targeted at protein, gene, and RNA relationships which serves as a
resource for the development of information extraction systems and their
components such as parsers and domain analyzers. Currently, the corpus contains
1100 sentences from abstracts of biomedical research articles annotated for
relationships, named entities, as well as syntactic dependencies.
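For quick experimentation, the corpus can be loaded with the `datasets` library. A minimal sketch, assuming the config names follow the usual BigBIO convention (`bioinfer_source` for the original schema, `bioinfer_bigbio_kb` for the harmonized KB schema) and a `train` split; check the loader for the exact names:
```python
from datasets import load_dataset

# Assumed config name following the BigBIO "<name>_bigbio_kb" convention;
# script-based loaders may also require trust_remote_code=True on recent
# versions of the datasets library.
bioinfer = load_dataset("bigbio/bioinfer", name="bioinfer_bigbio_kb")

# In the harmonized KB schema each document carries passages, entities,
# and relations, which is what the RE/NER tasks consume.
doc = bioinfer["train"][0]
print(doc["document_id"], len(doc["entities"]), len(doc["relations"]))
```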
## Citation Information
```
@article{pyysalo2007bioinfer,
title = {BioInfer: a corpus for information extraction in the biomedical domain},
author = {
    Pyysalo, Sampo and Ginter, Filip and Heimonen, Juho and Bj{\"o}rne, Jari
    and Boberg, Jorma and J{\"a}rvinen, Jouni and Salakoski, Tapio
},
year = 2007,
journal = {BMC bioinformatics},
publisher = {BioMed Central},
volume = 8,
number = 1,
pages = {1--24}
}
```
|
cahya/instructions-hi | ---
dataset_info:
features:
- name: id
dtype: int64
- name: text
dtype: string
splits:
- name: train
num_bytes: 47507115.72180105
num_examples: 49497
- name: test
num_bytes: 1250616.639099476
num_examples: 1303
- name: validation
num_bytes: 1250616.639099476
num_examples: 1303
download_size: 18697342
dataset_size: 50008349.00000001
---
# Dataset Card for "instructions-hi"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
zolak/twitter_dataset_80_1713209820 | ---
dataset_info:
features:
- name: id
dtype: string
- name: tweet_content
dtype: string
- name: user_name
dtype: string
- name: user_id
dtype: string
- name: created_at
dtype: string
- name: url
dtype: string
- name: favourite_count
dtype: int64
- name: scraped_at
dtype: string
- name: image_urls
dtype: string
splits:
- name: train
num_bytes: 1472271
num_examples: 3593
download_size: 720363
dataset_size: 1472271
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
open-llm-leaderboard/details_Josephgflowers__3BigReasonCinder | ---
pretty_name: Evaluation run of Josephgflowers/3BigReasonCinder
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [Josephgflowers/3BigReasonCinder](https://huggingface.co/Josephgflowers/3BigReasonCinder)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of\
\ the evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can\
\ be found as a specific split in each configuration, the split being named using\
\ the timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Josephgflowers__3BigReasonCinder\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-02-09T12:31:38.090504](https://huggingface.co/datasets/open-llm-leaderboard/details_Josephgflowers__3BigReasonCinder/blob/main/results_2024-02-09T12-31-38.090504.json) (note\
\ that there might be results for other tasks in the repo if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.44794266994933396,\n\
\ \"acc_stderr\": 0.03464503770381712,\n \"acc_norm\": 0.4507932379135084,\n\
\ \"acc_norm_stderr\": 0.03538213666564797,\n \"mc1\": 0.2876376988984088,\n\
\ \"mc1_stderr\": 0.015846315101394816,\n \"mc2\": 0.44764087589469737,\n\
\ \"mc2_stderr\": 0.014703779857331185\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.39078498293515357,\n \"acc_stderr\": 0.014258563880513778,\n\
\ \"acc_norm\": 0.41723549488054607,\n \"acc_norm_stderr\": 0.014409825518403082\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.4801832304321848,\n\
\ \"acc_stderr\": 0.004985860853427632,\n \"acc_norm\": 0.6515634335789683,\n\
\ \"acc_norm_stderr\": 0.004755013243022131\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695236,\n \
\ \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695236\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.4222222222222222,\n\
\ \"acc_stderr\": 0.04266763404099582,\n \"acc_norm\": 0.4222222222222222,\n\
\ \"acc_norm_stderr\": 0.04266763404099582\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.4342105263157895,\n \"acc_stderr\": 0.040335656678483184,\n\
\ \"acc_norm\": 0.4342105263157895,\n \"acc_norm_stderr\": 0.040335656678483184\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.45,\n\
\ \"acc_stderr\": 0.05,\n \"acc_norm\": 0.45,\n \"acc_norm_stderr\"\
: 0.05\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"\
acc\": 0.5094339622641509,\n \"acc_stderr\": 0.030767394707808093,\n \
\ \"acc_norm\": 0.5094339622641509,\n \"acc_norm_stderr\": 0.030767394707808093\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.4722222222222222,\n\
\ \"acc_stderr\": 0.04174752578923185,\n \"acc_norm\": 0.4722222222222222,\n\
\ \"acc_norm_stderr\": 0.04174752578923185\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.31,\n \"acc_stderr\": 0.046482319871173156,\n \
\ \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.046482319871173156\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"\
acc\": 0.36,\n \"acc_stderr\": 0.04824181513244218,\n \"acc_norm\"\
: 0.36,\n \"acc_norm_stderr\": 0.04824181513244218\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.33,\n \"acc_stderr\": 0.047258156262526045,\n \
\ \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.047258156262526045\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.3930635838150289,\n\
\ \"acc_stderr\": 0.03724249595817729,\n \"acc_norm\": 0.3930635838150289,\n\
\ \"acc_norm_stderr\": 0.03724249595817729\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.21568627450980393,\n \"acc_stderr\": 0.04092563958237654,\n\
\ \"acc_norm\": 0.21568627450980393,\n \"acc_norm_stderr\": 0.04092563958237654\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.5,\n \"acc_stderr\": 0.050251890762960605,\n \"acc_norm\": 0.5,\n\
\ \"acc_norm_stderr\": 0.050251890762960605\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.37446808510638296,\n \"acc_stderr\": 0.03163910665367291,\n\
\ \"acc_norm\": 0.37446808510638296,\n \"acc_norm_stderr\": 0.03163910665367291\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.23684210526315788,\n\
\ \"acc_stderr\": 0.039994238792813344,\n \"acc_norm\": 0.23684210526315788,\n\
\ \"acc_norm_stderr\": 0.039994238792813344\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.4689655172413793,\n \"acc_stderr\": 0.04158632762097828,\n\
\ \"acc_norm\": 0.4689655172413793,\n \"acc_norm_stderr\": 0.04158632762097828\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.30158730158730157,\n \"acc_stderr\": 0.0236369759961018,\n \"\
acc_norm\": 0.30158730158730157,\n \"acc_norm_stderr\": 0.0236369759961018\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.2619047619047619,\n\
\ \"acc_stderr\": 0.0393253768039287,\n \"acc_norm\": 0.2619047619047619,\n\
\ \"acc_norm_stderr\": 0.0393253768039287\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695235,\n \
\ \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695235\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.5096774193548387,\n\
\ \"acc_stderr\": 0.02843867799890955,\n \"acc_norm\": 0.5096774193548387,\n\
\ \"acc_norm_stderr\": 0.02843867799890955\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.3793103448275862,\n \"acc_stderr\": 0.03413963805906235,\n\
\ \"acc_norm\": 0.3793103448275862,\n \"acc_norm_stderr\": 0.03413963805906235\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.46,\n \"acc_stderr\": 0.05009082659620332,\n \"acc_norm\"\
: 0.46,\n \"acc_norm_stderr\": 0.05009082659620332\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.5454545454545454,\n \"acc_stderr\": 0.03888176921674101,\n\
\ \"acc_norm\": 0.5454545454545454,\n \"acc_norm_stderr\": 0.03888176921674101\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.5858585858585859,\n \"acc_stderr\": 0.03509438348879629,\n \"\
acc_norm\": 0.5858585858585859,\n \"acc_norm_stderr\": 0.03509438348879629\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.5699481865284974,\n \"acc_stderr\": 0.03572954333144808,\n\
\ \"acc_norm\": 0.5699481865284974,\n \"acc_norm_stderr\": 0.03572954333144808\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.3974358974358974,\n \"acc_stderr\": 0.024811920017903836,\n\
\ \"acc_norm\": 0.3974358974358974,\n \"acc_norm_stderr\": 0.024811920017903836\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.23333333333333334,\n \"acc_stderr\": 0.02578787422095931,\n \
\ \"acc_norm\": 0.23333333333333334,\n \"acc_norm_stderr\": 0.02578787422095931\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.4411764705882353,\n \"acc_stderr\": 0.0322529423239964,\n \
\ \"acc_norm\": 0.4411764705882353,\n \"acc_norm_stderr\": 0.0322529423239964\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.33112582781456956,\n \"acc_stderr\": 0.038425817186598696,\n \"\
acc_norm\": 0.33112582781456956,\n \"acc_norm_stderr\": 0.038425817186598696\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.6036697247706422,\n \"acc_stderr\": 0.02097146994790053,\n \"\
acc_norm\": 0.6036697247706422,\n \"acc_norm_stderr\": 0.02097146994790053\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.28703703703703703,\n \"acc_stderr\": 0.030851992993257013,\n \"\
acc_norm\": 0.28703703703703703,\n \"acc_norm_stderr\": 0.030851992993257013\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.5490196078431373,\n \"acc_stderr\": 0.03492406104163613,\n \"\
acc_norm\": 0.5490196078431373,\n \"acc_norm_stderr\": 0.03492406104163613\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.6075949367088608,\n \"acc_stderr\": 0.03178471874564729,\n \
\ \"acc_norm\": 0.6075949367088608,\n \"acc_norm_stderr\": 0.03178471874564729\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.47085201793721976,\n\
\ \"acc_stderr\": 0.03350073248773404,\n \"acc_norm\": 0.47085201793721976,\n\
\ \"acc_norm_stderr\": 0.03350073248773404\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.5648854961832062,\n \"acc_stderr\": 0.043482080516448585,\n\
\ \"acc_norm\": 0.5648854961832062,\n \"acc_norm_stderr\": 0.043482080516448585\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.628099173553719,\n \"acc_stderr\": 0.04412015806624505,\n \"acc_norm\"\
: 0.628099173553719,\n \"acc_norm_stderr\": 0.04412015806624505\n },\n\
\ \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.5555555555555556,\n\
\ \"acc_stderr\": 0.04803752235190192,\n \"acc_norm\": 0.5555555555555556,\n\
\ \"acc_norm_stderr\": 0.04803752235190192\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.5276073619631901,\n \"acc_stderr\": 0.0392237829061099,\n\
\ \"acc_norm\": 0.5276073619631901,\n \"acc_norm_stderr\": 0.0392237829061099\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.4107142857142857,\n\
\ \"acc_stderr\": 0.046695106638751906,\n \"acc_norm\": 0.4107142857142857,\n\
\ \"acc_norm_stderr\": 0.046695106638751906\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.5631067961165048,\n \"acc_stderr\": 0.049111471073657764,\n\
\ \"acc_norm\": 0.5631067961165048,\n \"acc_norm_stderr\": 0.049111471073657764\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.6623931623931624,\n\
\ \"acc_stderr\": 0.030980296992618558,\n \"acc_norm\": 0.6623931623931624,\n\
\ \"acc_norm_stderr\": 0.030980296992618558\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.46,\n \"acc_stderr\": 0.05009082659620333,\n \
\ \"acc_norm\": 0.46,\n \"acc_norm_stderr\": 0.05009082659620333\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.5389527458492975,\n\
\ \"acc_stderr\": 0.017825621793239012,\n \"acc_norm\": 0.5389527458492975,\n\
\ \"acc_norm_stderr\": 0.017825621793239012\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.4884393063583815,\n \"acc_stderr\": 0.026911898686377913,\n\
\ \"acc_norm\": 0.4884393063583815,\n \"acc_norm_stderr\": 0.026911898686377913\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.28044692737430166,\n\
\ \"acc_stderr\": 0.015024083883322891,\n \"acc_norm\": 0.28044692737430166,\n\
\ \"acc_norm_stderr\": 0.015024083883322891\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.4803921568627451,\n \"acc_stderr\": 0.028607893699576066,\n\
\ \"acc_norm\": 0.4803921568627451,\n \"acc_norm_stderr\": 0.028607893699576066\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.49517684887459806,\n\
\ \"acc_stderr\": 0.028396770444111298,\n \"acc_norm\": 0.49517684887459806,\n\
\ \"acc_norm_stderr\": 0.028396770444111298\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.4444444444444444,\n \"acc_stderr\": 0.027648477877413327,\n\
\ \"acc_norm\": 0.4444444444444444,\n \"acc_norm_stderr\": 0.027648477877413327\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.3829787234042553,\n \"acc_stderr\": 0.028999080904806178,\n \
\ \"acc_norm\": 0.3829787234042553,\n \"acc_norm_stderr\": 0.028999080904806178\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.3272490221642764,\n\
\ \"acc_stderr\": 0.01198381980646473,\n \"acc_norm\": 0.3272490221642764,\n\
\ \"acc_norm_stderr\": 0.01198381980646473\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.33455882352941174,\n \"acc_stderr\": 0.028661996202335307,\n\
\ \"acc_norm\": 0.33455882352941174,\n \"acc_norm_stderr\": 0.028661996202335307\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.41830065359477125,\n \"acc_stderr\": 0.019955975145835542,\n \
\ \"acc_norm\": 0.41830065359477125,\n \"acc_norm_stderr\": 0.019955975145835542\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.509090909090909,\n\
\ \"acc_stderr\": 0.04788339768702861,\n \"acc_norm\": 0.509090909090909,\n\
\ \"acc_norm_stderr\": 0.04788339768702861\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.5346938775510204,\n \"acc_stderr\": 0.03193207024425314,\n\
\ \"acc_norm\": 0.5346938775510204,\n \"acc_norm_stderr\": 0.03193207024425314\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.6567164179104478,\n\
\ \"acc_stderr\": 0.03357379665433431,\n \"acc_norm\": 0.6567164179104478,\n\
\ \"acc_norm_stderr\": 0.03357379665433431\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.6,\n \"acc_stderr\": 0.049236596391733084,\n \
\ \"acc_norm\": 0.6,\n \"acc_norm_stderr\": 0.049236596391733084\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.4457831325301205,\n\
\ \"acc_stderr\": 0.03869543323472101,\n \"acc_norm\": 0.4457831325301205,\n\
\ \"acc_norm_stderr\": 0.03869543323472101\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.5263157894736842,\n \"acc_stderr\": 0.038295098689947266,\n\
\ \"acc_norm\": 0.5263157894736842,\n \"acc_norm_stderr\": 0.038295098689947266\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.2876376988984088,\n\
\ \"mc1_stderr\": 0.015846315101394816,\n \"mc2\": 0.44764087589469737,\n\
\ \"mc2_stderr\": 0.014703779857331185\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.6495659037095501,\n \"acc_stderr\": 0.01340904767667018\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.2759666413949962,\n \
\ \"acc_stderr\": 0.012312603010427352\n }\n}\n```"
repo_url: https://huggingface.co/Josephgflowers/3BigReasonCinder
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_02_09T12_31_38.090504
path:
- '**/details_harness|arc:challenge|25_2024-02-09T12-31-38.090504.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-02-09T12-31-38.090504.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_02_09T12_31_38.090504
path:
- '**/details_harness|gsm8k|5_2024-02-09T12-31-38.090504.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-02-09T12-31-38.090504.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_02_09T12_31_38.090504
path:
- '**/details_harness|hellaswag|10_2024-02-09T12-31-38.090504.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-02-09T12-31-38.090504.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_02_09T12_31_38.090504
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-09T12-31-38.090504.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-09T12-31-38.090504.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-09T12-31-38.090504.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-09T12-31-38.090504.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-09T12-31-38.090504.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-09T12-31-38.090504.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-09T12-31-38.090504.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-09T12-31-38.090504.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-09T12-31-38.090504.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-09T12-31-38.090504.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-09T12-31-38.090504.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-09T12-31-38.090504.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-09T12-31-38.090504.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-09T12-31-38.090504.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-09T12-31-38.090504.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-09T12-31-38.090504.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-09T12-31-38.090504.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-09T12-31-38.090504.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-09T12-31-38.090504.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-09T12-31-38.090504.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-09T12-31-38.090504.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-09T12-31-38.090504.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-09T12-31-38.090504.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-09T12-31-38.090504.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-09T12-31-38.090504.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-09T12-31-38.090504.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-09T12-31-38.090504.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-09T12-31-38.090504.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-09T12-31-38.090504.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-09T12-31-38.090504.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-09T12-31-38.090504.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-09T12-31-38.090504.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-09T12-31-38.090504.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-09T12-31-38.090504.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-02-09T12-31-38.090504.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-09T12-31-38.090504.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-09T12-31-38.090504.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-09T12-31-38.090504.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-02-09T12-31-38.090504.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-02-09T12-31-38.090504.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-09T12-31-38.090504.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-09T12-31-38.090504.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-09T12-31-38.090504.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-09T12-31-38.090504.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-09T12-31-38.090504.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-09T12-31-38.090504.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-09T12-31-38.090504.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-09T12-31-38.090504.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-09T12-31-38.090504.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-09T12-31-38.090504.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-09T12-31-38.090504.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-09T12-31-38.090504.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-09T12-31-38.090504.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-02-09T12-31-38.090504.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-09T12-31-38.090504.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-02-09T12-31-38.090504.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-09T12-31-38.090504.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-09T12-31-38.090504.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-09T12-31-38.090504.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-09T12-31-38.090504.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-09T12-31-38.090504.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-09T12-31-38.090504.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-09T12-31-38.090504.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-09T12-31-38.090504.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-09T12-31-38.090504.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-09T12-31-38.090504.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-09T12-31-38.090504.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-09T12-31-38.090504.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-09T12-31-38.090504.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-09T12-31-38.090504.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-09T12-31-38.090504.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-09T12-31-38.090504.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-09T12-31-38.090504.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-09T12-31-38.090504.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-09T12-31-38.090504.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-09T12-31-38.090504.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-09T12-31-38.090504.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-09T12-31-38.090504.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-09T12-31-38.090504.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-09T12-31-38.090504.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-09T12-31-38.090504.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-09T12-31-38.090504.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-09T12-31-38.090504.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-09T12-31-38.090504.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-09T12-31-38.090504.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-09T12-31-38.090504.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-09T12-31-38.090504.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-09T12-31-38.090504.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-09T12-31-38.090504.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-09T12-31-38.090504.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-09T12-31-38.090504.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-02-09T12-31-38.090504.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-09T12-31-38.090504.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-09T12-31-38.090504.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-09T12-31-38.090504.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-02-09T12-31-38.090504.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-02-09T12-31-38.090504.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-09T12-31-38.090504.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-09T12-31-38.090504.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-09T12-31-38.090504.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-09T12-31-38.090504.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-09T12-31-38.090504.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-09T12-31-38.090504.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-09T12-31-38.090504.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-09T12-31-38.090504.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-09T12-31-38.090504.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-09T12-31-38.090504.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-09T12-31-38.090504.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-09T12-31-38.090504.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-09T12-31-38.090504.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-02-09T12-31-38.090504.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-09T12-31-38.090504.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-02-09T12-31-38.090504.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-09T12-31-38.090504.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_02_09T12_31_38.090504
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-09T12-31-38.090504.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-09T12-31-38.090504.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_02_09T12_31_38.090504
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-09T12-31-38.090504.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-09T12-31-38.090504.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_02_09T12_31_38.090504
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-09T12-31-38.090504.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-09T12-31-38.090504.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_02_09T12_31_38.090504
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-09T12-31-38.090504.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-09T12-31-38.090504.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_02_09T12_31_38.090504
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-09T12-31-38.090504.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-09T12-31-38.090504.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_02_09T12_31_38.090504
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-09T12-31-38.090504.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-09T12-31-38.090504.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_02_09T12_31_38.090504
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-09T12-31-38.090504.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-09T12-31-38.090504.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_02_09T12_31_38.090504
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-09T12-31-38.090504.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-09T12-31-38.090504.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_02_09T12_31_38.090504
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-09T12-31-38.090504.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-09T12-31-38.090504.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_02_09T12_31_38.090504
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-09T12-31-38.090504.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-09T12-31-38.090504.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_02_09T12_31_38.090504
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-09T12-31-38.090504.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-09T12-31-38.090504.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_02_09T12_31_38.090504
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-09T12-31-38.090504.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-09T12-31-38.090504.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_02_09T12_31_38.090504
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-09T12-31-38.090504.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-09T12-31-38.090504.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_02_09T12_31_38.090504
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-09T12-31-38.090504.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-09T12-31-38.090504.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_02_09T12_31_38.090504
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-09T12-31-38.090504.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-09T12-31-38.090504.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_02_09T12_31_38.090504
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-09T12-31-38.090504.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-09T12-31-38.090504.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_02_09T12_31_38.090504
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-09T12-31-38.090504.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-09T12-31-38.090504.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_02_09T12_31_38.090504
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-09T12-31-38.090504.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-09T12-31-38.090504.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_02_09T12_31_38.090504
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-09T12-31-38.090504.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-09T12-31-38.090504.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_02_09T12_31_38.090504
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-09T12-31-38.090504.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-09T12-31-38.090504.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_02_09T12_31_38.090504
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-09T12-31-38.090504.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-09T12-31-38.090504.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_02_09T12_31_38.090504
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-09T12-31-38.090504.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-09T12-31-38.090504.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_02_09T12_31_38.090504
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-09T12-31-38.090504.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-09T12-31-38.090504.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_02_09T12_31_38.090504
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-09T12-31-38.090504.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-09T12-31-38.090504.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_02_09T12_31_38.090504
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-09T12-31-38.090504.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-09T12-31-38.090504.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_02_09T12_31_38.090504
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-09T12-31-38.090504.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-09T12-31-38.090504.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_02_09T12_31_38.090504
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-09T12-31-38.090504.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-09T12-31-38.090504.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_02_09T12_31_38.090504
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-09T12-31-38.090504.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-09T12-31-38.090504.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_02_09T12_31_38.090504
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-09T12-31-38.090504.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-09T12-31-38.090504.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_02_09T12_31_38.090504
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-09T12-31-38.090504.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-09T12-31-38.090504.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_02_09T12_31_38.090504
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-09T12-31-38.090504.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-09T12-31-38.090504.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_02_09T12_31_38.090504
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-09T12-31-38.090504.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-09T12-31-38.090504.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_02_09T12_31_38.090504
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-09T12-31-38.090504.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-09T12-31-38.090504.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_02_09T12_31_38.090504
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-09T12-31-38.090504.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-09T12-31-38.090504.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_02_09T12_31_38.090504
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-02-09T12-31-38.090504.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-02-09T12-31-38.090504.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_02_09T12_31_38.090504
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-09T12-31-38.090504.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-09T12-31-38.090504.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_02_09T12_31_38.090504
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-09T12-31-38.090504.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-09T12-31-38.090504.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_02_09T12_31_38.090504
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-09T12-31-38.090504.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-09T12-31-38.090504.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_02_09T12_31_38.090504
path:
- '**/details_harness|hendrycksTest-management|5_2024-02-09T12-31-38.090504.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-02-09T12-31-38.090504.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_02_09T12_31_38.090504
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-02-09T12-31-38.090504.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-02-09T12-31-38.090504.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_02_09T12_31_38.090504
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-09T12-31-38.090504.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-09T12-31-38.090504.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_02_09T12_31_38.090504
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-09T12-31-38.090504.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-09T12-31-38.090504.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_02_09T12_31_38.090504
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-09T12-31-38.090504.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-09T12-31-38.090504.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_02_09T12_31_38.090504
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-09T12-31-38.090504.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-09T12-31-38.090504.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_02_09T12_31_38.090504
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-09T12-31-38.090504.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-09T12-31-38.090504.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_02_09T12_31_38.090504
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-09T12-31-38.090504.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-09T12-31-38.090504.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_02_09T12_31_38.090504
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-09T12-31-38.090504.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-09T12-31-38.090504.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_02_09T12_31_38.090504
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-09T12-31-38.090504.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-09T12-31-38.090504.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_02_09T12_31_38.090504
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-09T12-31-38.090504.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-09T12-31-38.090504.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_02_09T12_31_38.090504
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-09T12-31-38.090504.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-09T12-31-38.090504.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_02_09T12_31_38.090504
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-09T12-31-38.090504.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-09T12-31-38.090504.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_02_09T12_31_38.090504
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-09T12-31-38.090504.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-09T12-31-38.090504.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_02_09T12_31_38.090504
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-09T12-31-38.090504.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-09T12-31-38.090504.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_02_09T12_31_38.090504
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-02-09T12-31-38.090504.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-02-09T12-31-38.090504.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_02_09T12_31_38.090504
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-09T12-31-38.090504.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-09T12-31-38.090504.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_02_09T12_31_38.090504
path:
- '**/details_harness|hendrycksTest-virology|5_2024-02-09T12-31-38.090504.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-02-09T12-31-38.090504.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_02_09T12_31_38.090504
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-09T12-31-38.090504.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-09T12-31-38.090504.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_02_09T12_31_38.090504
path:
- '**/details_harness|truthfulqa:mc|0_2024-02-09T12-31-38.090504.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-02-09T12-31-38.090504.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_02_09T12_31_38.090504
path:
- '**/details_harness|winogrande|5_2024-02-09T12-31-38.090504.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-02-09T12-31-38.090504.parquet'
- config_name: results
data_files:
- split: 2024_02_09T12_31_38.090504
path:
- results_2024-02-09T12-31-38.090504.parquet
- split: latest
path:
- results_2024-02-09T12-31-38.090504.parquet
---
# Dataset Card for Evaluation run of Josephgflowers/3BigReasonCinder
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [Josephgflowers/3BigReasonCinder](https://huggingface.co/Josephgflowers/3BigReasonCinder) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run. Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the most recent results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can, for instance, do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_Josephgflowers__3BigReasonCinder",
"harness_winogrande_5",
	split="latest")
```
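Similarly, a minimal sketch for loading the aggregated metrics; both the "results" configuration and the "latest" split are declared in this card's YAML header:
```python
from datasets import load_dataset

# Aggregated metrics of the run, as used for the leaderboard display.
results = load_dataset("open-llm-leaderboard/details_Josephgflowers__3BigReasonCinder",
	"results",
	split="latest")
print(results[0])
```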
## Latest results
These are the [latest results from run 2024-02-09T12:31:38.090504](https://huggingface.co/datasets/open-llm-leaderboard/details_Josephgflowers__3BigReasonCinder/blob/main/results_2024-02-09T12-31-38.090504.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each one in the "results" and the "latest" split of each eval):
```python
{
"all": {
"acc": 0.44794266994933396,
"acc_stderr": 0.03464503770381712,
"acc_norm": 0.4507932379135084,
"acc_norm_stderr": 0.03538213666564797,
"mc1": 0.2876376988984088,
"mc1_stderr": 0.015846315101394816,
"mc2": 0.44764087589469737,
"mc2_stderr": 0.014703779857331185
},
"harness|arc:challenge|25": {
"acc": 0.39078498293515357,
"acc_stderr": 0.014258563880513778,
"acc_norm": 0.41723549488054607,
"acc_norm_stderr": 0.014409825518403082
},
"harness|hellaswag|10": {
"acc": 0.4801832304321848,
"acc_stderr": 0.004985860853427632,
"acc_norm": 0.6515634335789683,
"acc_norm_stderr": 0.004755013243022131
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695236,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695236
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.4222222222222222,
"acc_stderr": 0.04266763404099582,
"acc_norm": 0.4222222222222222,
"acc_norm_stderr": 0.04266763404099582
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.4342105263157895,
"acc_stderr": 0.040335656678483184,
"acc_norm": 0.4342105263157895,
"acc_norm_stderr": 0.040335656678483184
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.45,
"acc_stderr": 0.05,
"acc_norm": 0.45,
"acc_norm_stderr": 0.05
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.5094339622641509,
"acc_stderr": 0.030767394707808093,
"acc_norm": 0.5094339622641509,
"acc_norm_stderr": 0.030767394707808093
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.4722222222222222,
"acc_stderr": 0.04174752578923185,
"acc_norm": 0.4722222222222222,
"acc_norm_stderr": 0.04174752578923185
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.31,
"acc_stderr": 0.046482319871173156,
"acc_norm": 0.31,
"acc_norm_stderr": 0.046482319871173156
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.36,
"acc_stderr": 0.04824181513244218,
"acc_norm": 0.36,
"acc_norm_stderr": 0.04824181513244218
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.33,
"acc_stderr": 0.047258156262526045,
"acc_norm": 0.33,
"acc_norm_stderr": 0.047258156262526045
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.3930635838150289,
"acc_stderr": 0.03724249595817729,
"acc_norm": 0.3930635838150289,
"acc_norm_stderr": 0.03724249595817729
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.21568627450980393,
"acc_stderr": 0.04092563958237654,
"acc_norm": 0.21568627450980393,
"acc_norm_stderr": 0.04092563958237654
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.5,
"acc_stderr": 0.050251890762960605,
"acc_norm": 0.5,
"acc_norm_stderr": 0.050251890762960605
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.37446808510638296,
"acc_stderr": 0.03163910665367291,
"acc_norm": 0.37446808510638296,
"acc_norm_stderr": 0.03163910665367291
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.23684210526315788,
"acc_stderr": 0.039994238792813344,
"acc_norm": 0.23684210526315788,
"acc_norm_stderr": 0.039994238792813344
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.4689655172413793,
"acc_stderr": 0.04158632762097828,
"acc_norm": 0.4689655172413793,
"acc_norm_stderr": 0.04158632762097828
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.30158730158730157,
"acc_stderr": 0.0236369759961018,
"acc_norm": 0.30158730158730157,
"acc_norm_stderr": 0.0236369759961018
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.2619047619047619,
"acc_stderr": 0.0393253768039287,
"acc_norm": 0.2619047619047619,
"acc_norm_stderr": 0.0393253768039287
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695235,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695235
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.5096774193548387,
"acc_stderr": 0.02843867799890955,
"acc_norm": 0.5096774193548387,
"acc_norm_stderr": 0.02843867799890955
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.3793103448275862,
"acc_stderr": 0.03413963805906235,
"acc_norm": 0.3793103448275862,
"acc_norm_stderr": 0.03413963805906235
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.46,
"acc_stderr": 0.05009082659620332,
"acc_norm": 0.46,
"acc_norm_stderr": 0.05009082659620332
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.5454545454545454,
"acc_stderr": 0.03888176921674101,
"acc_norm": 0.5454545454545454,
"acc_norm_stderr": 0.03888176921674101
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.5858585858585859,
"acc_stderr": 0.03509438348879629,
"acc_norm": 0.5858585858585859,
"acc_norm_stderr": 0.03509438348879629
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.5699481865284974,
"acc_stderr": 0.03572954333144808,
"acc_norm": 0.5699481865284974,
"acc_norm_stderr": 0.03572954333144808
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.3974358974358974,
"acc_stderr": 0.024811920017903836,
"acc_norm": 0.3974358974358974,
"acc_norm_stderr": 0.024811920017903836
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.23333333333333334,
"acc_stderr": 0.02578787422095931,
"acc_norm": 0.23333333333333334,
"acc_norm_stderr": 0.02578787422095931
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.4411764705882353,
"acc_stderr": 0.0322529423239964,
"acc_norm": 0.4411764705882353,
"acc_norm_stderr": 0.0322529423239964
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.33112582781456956,
"acc_stderr": 0.038425817186598696,
"acc_norm": 0.33112582781456956,
"acc_norm_stderr": 0.038425817186598696
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.6036697247706422,
"acc_stderr": 0.02097146994790053,
"acc_norm": 0.6036697247706422,
"acc_norm_stderr": 0.02097146994790053
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.28703703703703703,
"acc_stderr": 0.030851992993257013,
"acc_norm": 0.28703703703703703,
"acc_norm_stderr": 0.030851992993257013
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.5490196078431373,
"acc_stderr": 0.03492406104163613,
"acc_norm": 0.5490196078431373,
"acc_norm_stderr": 0.03492406104163613
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.6075949367088608,
"acc_stderr": 0.03178471874564729,
"acc_norm": 0.6075949367088608,
"acc_norm_stderr": 0.03178471874564729
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.47085201793721976,
"acc_stderr": 0.03350073248773404,
"acc_norm": 0.47085201793721976,
"acc_norm_stderr": 0.03350073248773404
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.5648854961832062,
"acc_stderr": 0.043482080516448585,
"acc_norm": 0.5648854961832062,
"acc_norm_stderr": 0.043482080516448585
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.628099173553719,
"acc_stderr": 0.04412015806624505,
"acc_norm": 0.628099173553719,
"acc_norm_stderr": 0.04412015806624505
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.5555555555555556,
"acc_stderr": 0.04803752235190192,
"acc_norm": 0.5555555555555556,
"acc_norm_stderr": 0.04803752235190192
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.5276073619631901,
"acc_stderr": 0.0392237829061099,
"acc_norm": 0.5276073619631901,
"acc_norm_stderr": 0.0392237829061099
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.4107142857142857,
"acc_stderr": 0.046695106638751906,
"acc_norm": 0.4107142857142857,
"acc_norm_stderr": 0.046695106638751906
},
"harness|hendrycksTest-management|5": {
"acc": 0.5631067961165048,
"acc_stderr": 0.049111471073657764,
"acc_norm": 0.5631067961165048,
"acc_norm_stderr": 0.049111471073657764
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.6623931623931624,
"acc_stderr": 0.030980296992618558,
"acc_norm": 0.6623931623931624,
"acc_norm_stderr": 0.030980296992618558
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.46,
"acc_stderr": 0.05009082659620333,
"acc_norm": 0.46,
"acc_norm_stderr": 0.05009082659620333
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.5389527458492975,
"acc_stderr": 0.017825621793239012,
"acc_norm": 0.5389527458492975,
"acc_norm_stderr": 0.017825621793239012
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.4884393063583815,
"acc_stderr": 0.026911898686377913,
"acc_norm": 0.4884393063583815,
"acc_norm_stderr": 0.026911898686377913
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.28044692737430166,
"acc_stderr": 0.015024083883322891,
"acc_norm": 0.28044692737430166,
"acc_norm_stderr": 0.015024083883322891
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.4803921568627451,
"acc_stderr": 0.028607893699576066,
"acc_norm": 0.4803921568627451,
"acc_norm_stderr": 0.028607893699576066
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.49517684887459806,
"acc_stderr": 0.028396770444111298,
"acc_norm": 0.49517684887459806,
"acc_norm_stderr": 0.028396770444111298
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.4444444444444444,
"acc_stderr": 0.027648477877413327,
"acc_norm": 0.4444444444444444,
"acc_norm_stderr": 0.027648477877413327
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.3829787234042553,
"acc_stderr": 0.028999080904806178,
"acc_norm": 0.3829787234042553,
"acc_norm_stderr": 0.028999080904806178
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.3272490221642764,
"acc_stderr": 0.01198381980646473,
"acc_norm": 0.3272490221642764,
"acc_norm_stderr": 0.01198381980646473
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.33455882352941174,
"acc_stderr": 0.028661996202335307,
"acc_norm": 0.33455882352941174,
"acc_norm_stderr": 0.028661996202335307
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.41830065359477125,
"acc_stderr": 0.019955975145835542,
"acc_norm": 0.41830065359477125,
"acc_norm_stderr": 0.019955975145835542
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.509090909090909,
"acc_stderr": 0.04788339768702861,
"acc_norm": 0.509090909090909,
"acc_norm_stderr": 0.04788339768702861
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.5346938775510204,
"acc_stderr": 0.03193207024425314,
"acc_norm": 0.5346938775510204,
"acc_norm_stderr": 0.03193207024425314
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.6567164179104478,
"acc_stderr": 0.03357379665433431,
"acc_norm": 0.6567164179104478,
"acc_norm_stderr": 0.03357379665433431
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.6,
"acc_stderr": 0.049236596391733084,
"acc_norm": 0.6,
"acc_norm_stderr": 0.049236596391733084
},
"harness|hendrycksTest-virology|5": {
"acc": 0.4457831325301205,
"acc_stderr": 0.03869543323472101,
"acc_norm": 0.4457831325301205,
"acc_norm_stderr": 0.03869543323472101
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.5263157894736842,
"acc_stderr": 0.038295098689947266,
"acc_norm": 0.5263157894736842,
"acc_norm_stderr": 0.038295098689947266
},
"harness|truthfulqa:mc|0": {
"mc1": 0.2876376988984088,
"mc1_stderr": 0.015846315101394816,
"mc2": 0.44764087589469737,
"mc2_stderr": 0.014703779857331185
},
"harness|winogrande|5": {
"acc": 0.6495659037095501,
"acc_stderr": 0.01340904767667018
},
"harness|gsm8k|5": {
"acc": 0.2759666413949962,
"acc_stderr": 0.012312603010427352
}
}
```
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
autoevaluate/autoeval-eval-lener_br-lener_br-c186f5-1776861659 | ---
type: predictions
tags:
- autotrain
- evaluation
datasets:
- lener_br
eval_info:
task: entity_extraction
model: Luciano/bertimbau-base-lener_br
metrics: []
dataset_name: lener_br
dataset_config: lener_br
dataset_split: train
col_mapping:
tokens: tokens
tags: ner_tags
---
# Dataset Card for AutoTrain Evaluator
This repository contains model predictions generated by [AutoTrain](https://huggingface.co/autotrain) for the following task and dataset:
* Task: Token Classification
* Model: Luciano/bertimbau-base-lener_br
* Dataset: lener_br
* Config: lener_br
* Split: train
To run new evaluation jobs, visit Hugging Face's [automatic model evaluator](https://huggingface.co/spaces/autoevaluate/model-evaluator).
## Contributions
Thanks to [@Luciano](https://huggingface.co/Luciano) for evaluating this model. |
HeTree/MevakerConcSen | ---
license: apache-2.0
language:
- he
---
## MevakerConcSen
A sentence-level dataset for conclusion extraction, providing a conclusion/not-conclusion label (1/0, respectively) for each sentence,
together with the index of each sentence and its document of origin.
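A minimal loading sketch (the split and field names below are assumptions for illustration; check the repository files for the exact schema):
```python
from datasets import load_dataset

ds = load_dataset("HeTree/MevakerConcSen")

# Hypothetical field names ("sentence", "label"); this card does not list
# the exact schema. Label 1 marks a conclusion sentence, 0 a non-conclusion.
conclusions = [row["sentence"] for row in ds["train"] if row["label"] == 1]
```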
### Citing
If you use MevakerConcSen in your research, please cite [Mevaker: Conclusion Extraction and Allocation Resources for the Hebrew Language](https://arxiv.org/abs/2403.09719).
```
@article{shalumov2024mevaker,
title={Mevaker: Conclusion Extraction and Allocation Resources for the Hebrew Language},
author={Vitaly Shalumov and Harel Haskey and Yuval Solaz},
year={2024},
eprint={2403.09719},
archivePrefix={arXiv},
primaryClass={cs.CL}
}
``` |
karukas/arxiv-abstract-matching | ---
dataset_info:
features:
- name: sentence1
dtype: string
- name: sentence2
dtype: string
splits:
- name: train
num_bytes: 7119340064
num_examples: 203037
- name: validation
num_bytes: 216202656
num_examples: 6436
- name: test
num_bytes: 216585242
num_examples: 6440
download_size: 3635681697
dataset_size: 7552127962
---
# Dataset Card for "arxiv-abstract-matching"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
DazMashaly/test_fake_labels | ---
configs:
- config_name: default
data_files:
- split: test
path: data/test-*
dataset_info:
features:
- name: image
dtype: image
splits:
- name: test
num_bytes: 365263950.94
num_examples: 5108
download_size: 354753479
dataset_size: 365263950.94
---
# Dataset Card for "test_fake_labels"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
dukkkk/test | ---
annotations_creators: []
language_creators: []
language:
- zh
license:
- apache-2.0
multilinguality:
- monolingual
pretty_name: Wenetspeech4TTS
source_datasets: []
task_categories:
- automatic-speech-recognition
- text-to-speech
- text-to-audio
extra_gated_prompt: >-
We do not own the copyright of the audio files. For researchers and
educators who wish to use the audio files for non-commercial research and/or
educational purposes, we can provide access through the Hub under certain
conditions and terms. Terms of Access: The Researcher has requested
permission to use the WenetSpeech4TTS database. In exchange for such
permission, Researcher hereby agrees to the following terms and conditions:
1. Researcher shall use the Database only for non-commercial research and
educational purposes.
2. The authors make no representations or warranties
regarding the Database, including but not limited to warranties of
non-infringement or fitness for a particular purpose.
3. Researcher accepts
full responsibility for his or her use of the Database and shall defend and
indemnify the authors of WenetSpeech4TTS, including their employees, Trustees,
officers and agents, against any and all claims arising from Researcher's use
of the Database, including but not limited to Researcher's use of any copies
of copyrighted audio files that he or she may create from the Database.
4. Researcher may provide research associates and colleagues with access to the
Database provided that they first agree to be bound by these terms and
conditions.
5. The authors reserve the right to terminate Researcher's access
to the Database at any time.
6. If Researcher is employed by a for-profit,
commercial entity, Researcher's employer shall also be bound by these terms
and conditions, and Researcher hereby represents that he or she is fully
authorized to enter into this agreement on behalf of such employer.
extra_gated_fields:
Name: text
Email: text
Organization: text
Address: text
I hereby confirm that I have requested access via the Google Form provided above: checkbox
I accept the terms of access: checkbox
size_categories:
- 1M<n<10M
---
# Dataset Card for Dataset Name
<!-- Provide a quick summary of the dataset. -->
This dataset card aims to be a base template for new datasets. It has been generated using [this raw template](https://github.com/huggingface/huggingface_hub/blob/main/src/huggingface_hub/templates/datasetcard_template.md?plain=1).
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
nespc/cnn_dailymail_prompts | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 1354728397
num_examples: 287113
- name: test
num_bytes: 53648492
num_examples: 11490
download_size: 781011544
dataset_size: 1408376889
---
# Dataset Card for "cnn_dailymail_prompts"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Verah/JParaCrawl-Filtered-English-Japanese-Parallel-Corpus | ---
license: other
license_name: ntt-research
license_link: https://www.kecl.ntt.co.jp/icl/lirg/jparacrawl/
task_categories:
- translation
language:
- en
- ja
size_categories:
- 1M<n<10M
---
# Introduction
This is an LLM-filtered set of the first 1M rows of NTT's JParaCrawl v3, a large English-Japanese parallel corpus.
The original JParaCrawl corpus was put together by automated means: aligning Japanese texts with their apparent English translations found in the wild on the internet.
Whilst manually browsing the original data, I noticed obvious quality issues (poorly aligned translations, incomplete translations, etc.) that made me anxious about using the dataset at all.
The goal of this dataset is to split the entire original dataset into its good and bad parts to:
- facilitate further research
- make available a high quality dataset
- investigate the performance of various LLMs at evaluating the dataset.
The new upload includes filtering by an additional LLM; its results are "model2_accepted": https://huggingface.co/Verah/mistral-japanese-stabalelm-merge
I merged Mistral Instruct with Stability AI's new Japanese LLM, and this seems to have resulted in a model with enough knowledge of English and Japanese to be competent at this task.
It is likely that a finetune would further improve results.
This new model accepted only 260,058 rows from the 1M seen, whilst the previous model was around twice as permissive.
Prompt used with the new model:
```python
from inspect import cleandoc
def promptgen_mistral(japanese: str, english: str) -> str:
    system_prompt = cleandoc("""<s>[INST]Your role is to evaluate the accuracy of the provided Japanese to English translation.
    - Translations with parts missing should be rejected.
    - Incomplete translations should be rejected.
    - Inaccurate translations should be rejected.
    - Poor grammar should be rejected.
    - Any kind of mistake should be rejected.
    - Bad spelling should be rejected.
    - Low quality english should be rejected.
    - Low quality japanese should be rejected.
    - high quality translations should be accepted.
    - Respond with only 'ACCEPT' or 'REJECT'.
    """)
    return system_prompt + f"JAPANESE: {japanese}\nENGLISH: {english}[/INST]\n"
```
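For illustration, a hypothetical call (the sentence pair below is made up):
```python
prompt = promptgen_mistral("猫が好きです。", "I like cats.")
# The model's completion is then checked for 'ACCEPT' or 'REJECT'.
print(prompt)
```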
# License
The license is identical to the original JParaCrawl dataset:
```
Terms of Use for Bilingual Data, Monolingual Data and Trained Models
Nippon Telegraph and Telephone Corporation (Hereinafter referred to as "our company".) will provide bilingual data, monolingual data and trained models (Hereinafter referred to as "this data.") subject to your acceptance of these Terms of Use. We assume that you have agreed to these Terms of Use when you start using this data (including downloads).
Article 1 (Use conditions)
This data can only be used for research purposes involving information analysis (Including, but not limited to, replication and distribution. Hereinafter the same in this article.). The same applies to the derived data created based on this data. However, this data is not available for commercial use, including the sale of translators trained using this data.
Article 2 (Disclaimer)
Our company does not warrant the quality, performance or any other aspects of this data. We shall not be liable for any direct or indirect damages caused by the use of this data. Our company shall not be liable for any damage to the system caused by the installation of this data.
Article 3 (Other).
This data may be changed in whole or in part, or provision of this data may be interrupted or stopped at our company’s discretion without prior notice.
==========
対訳データ,単言語データおよび学習済みモデル利用に関する利用規約
日本電信電話株式会社(以下、「当社」という。)は、本利用規約に同意されることを条件として、対訳データ、単言語データおよび学習済みモデル(以下、「本データ」という。)を提供します。なお、本データの利用(ダウンロードも含む)を開始した時点で、本利用規約にご同意頂いたものとみなします。
第1条(利用条件)
本データは、情報解析を伴う研究開発目的にのみご利用(複製および配布を含むが、それに限らない。以下、同じ)頂けます。本データを基に作成された派生データについても同様です。ただし、本データを使って学習したデータを内蔵した翻訳機の販売等を含む商用利用目的には、ご利用頂けません。
第2条(免責)
当社は、本データについて、品質、性能その他一切の保証を行うものではありません。2.直接的損害、間接的損害を問わず、本データの利用によって生ずるいかなる損害についても、一切の責任を負いません。当社は、本データのインストール作業等によって発生するシステムへの影響等、損害についても、一切の責任を負いません。
第3条(その他)
事前通知なしに、当社の判断によって、本データを全部または一部の変更、本データの提供の中断または停止をさせて頂くことがございます。
``` |
maximalmargin/katz | ---
dataset_info:
features:
- name: image
dtype: image
- name: text
dtype: string
splits:
- name: train
num_bytes: 1400704.0
num_examples: 26
download_size: 1402106
dataset_size: 1400704.0
---
# Dataset Card for "katz"
Images from [Alex Katz](https://www.alexkatz.com/)'s Print Archive.
Hand-written image descriptions.
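A minimal loading sketch (the `image` and `text` columns are taken from the dataset_info above):
```python
from datasets import load_dataset

ds = load_dataset("maximalmargin/katz", split="train")
example = ds[0]
example["image"].show()  # PIL image of the print
print(example["text"])   # its hand-written description
```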
Please use responsibly. |
bigbio/bioasq_2021_mesinesp |
---
language:
- es
bigbio_language:
- Spanish
license: cc-by-4.0
multilinguality: monolingual
bigbio_license_shortname: CC_BY_4p0
pretty_name: MESINESP 2021
homepage: https://zenodo.org/record/5602914#.YhSXJ5PMKWt
bigbio_pubmed: False
bigbio_public: True
bigbio_tasks:
- TEXT_CLASSIFICATION
---
# Dataset Card for MESINESP 2021
## Dataset Description
- **Homepage:** https://zenodo.org/record/5602914#.YhSXJ5PMKWt
- **Pubmed:** False
- **Public:** True
- **Tasks:** TXTCLASS
The main aim of MESINESP2 is to promote the development of practically relevant semantic indexing tools for biomedical content in non-English languages. We have generated a manually annotated corpus in which domain experts labeled a set of scientific literature, clinical trials, and patent abstracts. All the documents were labeled with DeCS descriptors, a structured controlled vocabulary created by BIREME to index scientific publications on BvSalud, the largest database of scientific documents in Spanish, which hosts records from the databases LILACS, MEDLINE, and IBECS, among others.
The MESINESP track at BioASQ9 explores the efficiency of systems for assigning DeCS descriptors to different types of biomedical documents. To that end, we divided the task into three subtracks depending on the document type and, for each one, generated an annotated corpus that was provided to the participating teams:
- [Subtrack 1 corpus] MESINESP-L – Scientific Literature: It contains all Spanish records from the LILACS and IBECS databases at the Virtual Health Library (VHL) with a non-empty abstract written in Spanish.
- [Subtrack 2 corpus] MESINESP-T – Clinical Trials: It contains records from the Registro Español de Estudios Clínicos (REEC). REEC does not provide documents with the title/abstract structure needed in BioASQ, so we built artificial abstracts based on the content crawled through the REEC API.
- [Subtrack 3 corpus] MESINESP-P – Patents: This corpus includes patents in Spanish extracted from Google Patents which have the IPC codes “A61P” and “A61K31”.

In addition, we also provide a set of complementary data, such as the DeCS terminology file, a silver standard with the participants' predictions on the task background set, and the entities of medications, diseases, symptoms, and medical procedures extracted from the documents with the BSC NERs.
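A minimal loading sketch (the configuration name below is hypothetical, following the usual BigBIO `<dataset>_source` / `<dataset>_bigbio_text` naming convention; this card does not list the available configurations, so check the repository for the exact names):
```python
from datasets import load_dataset

# Hypothetical config name; script-based BigBIO datasets may also require
# trust_remote_code=True with recent versions of the datasets library.
ds = load_dataset("bigbio/bioasq_2021_mesinesp", name="bioasq_2021_mesinesp_source")
```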
## Citation Information
```
@conference {396,
title = {Overview of BioASQ 2021-MESINESP track. Evaluation of
advance hierarchical classification techniques for scientific
literature, patents and clinical trials.},
booktitle = {Proceedings of the 9th BioASQ Workshop
A challenge on large-scale biomedical semantic indexing
and question answering},
year = {2021},
url = {http://ceur-ws.org/Vol-2936/paper-11.pdf},
author = {Gasco, Luis and Nentidis, Anastasios and Krithara, Anastasia
and Estrada-Zavala, Darryl and Toshiyuki Murasaki, Renato and Primo-Pe{\~n}a,
Elena and Bojo-Canales, Cristina and Paliouras, Georgios and Krallinger, Martin}
}
```
|
Iania/QA_setup | ---
license: apache-2.0
---
|
roa7n/patched_test_p_10_m1_predictions_v2 | ---
dataset_info:
features:
- name: id
dtype: string
- name: sequence_str
dtype: string
- name: label
dtype: int64
- name: m1_preds
dtype: float32
splits:
- name: train
num_bytes: 1566287530
num_examples: 2843834
download_size: 138365947
dataset_size: 1566287530
---
# Dataset Card for "patched_test_p_10_m1_predictions_v2"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
open-llm-leaderboard/details_dillfrescott__amadeus-v0.1 | ---
pretty_name: Evaluation run of dillfrescott/amadeus-v0.1
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [dillfrescott/amadeus-v0.1](https://huggingface.co/dillfrescott/amadeus-v0.1)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run. Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"latest\" split always points to the most recent results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_dillfrescott__amadeus-v0.1\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-01-06T01:28:19.231223](https://huggingface.co/datasets/open-llm-leaderboard/details_dillfrescott__amadeus-v0.1/blob/main/results_2024-01-06T01-28-19.231223.json)(note\
\ that there might be results for other tasks in the repo if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6503116864878898,\n\
\ \"acc_stderr\": 0.03211845742165151,\n \"acc_norm\": 0.6514191578913137,\n\
\ \"acc_norm_stderr\": 0.032765902396781794,\n \"mc1\": 0.46266829865361075,\n\
\ \"mc1_stderr\": 0.017454645150970588,\n \"mc2\": 0.6382334765973684,\n\
\ \"mc2_stderr\": 0.01550846970253108\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.6578498293515358,\n \"acc_stderr\": 0.013864152159177278,\n\
\ \"acc_norm\": 0.689419795221843,\n \"acc_norm_stderr\": 0.013522292098053069\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6957777335192192,\n\
\ \"acc_stderr\": 0.004591369853276529,\n \"acc_norm\": 0.8698466440948018,\n\
\ \"acc_norm_stderr\": 0.0033578442491239546\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.35,\n \"acc_stderr\": 0.0479372485441102,\n \
\ \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.0479372485441102\n },\n\
\ \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6296296296296297,\n\
\ \"acc_stderr\": 0.041716541613545426,\n \"acc_norm\": 0.6296296296296297,\n\
\ \"acc_norm_stderr\": 0.041716541613545426\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.6907894736842105,\n \"acc_stderr\": 0.037610708698674805,\n\
\ \"acc_norm\": 0.6907894736842105,\n \"acc_norm_stderr\": 0.037610708698674805\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.59,\n\
\ \"acc_stderr\": 0.049431107042371025,\n \"acc_norm\": 0.59,\n \
\ \"acc_norm_stderr\": 0.049431107042371025\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.7018867924528301,\n \"acc_stderr\": 0.02815283794249386,\n\
\ \"acc_norm\": 0.7018867924528301,\n \"acc_norm_stderr\": 0.02815283794249386\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7638888888888888,\n\
\ \"acc_stderr\": 0.03551446610810826,\n \"acc_norm\": 0.7638888888888888,\n\
\ \"acc_norm_stderr\": 0.03551446610810826\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.47,\n \"acc_stderr\": 0.050161355804659205,\n \
\ \"acc_norm\": 0.47,\n \"acc_norm_stderr\": 0.050161355804659205\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"\
acc\": 0.5,\n \"acc_stderr\": 0.050251890762960605,\n \"acc_norm\"\
: 0.5,\n \"acc_norm_stderr\": 0.050251890762960605\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.34,\n \"acc_stderr\": 0.047609522856952344,\n \
\ \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.047609522856952344\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6705202312138728,\n\
\ \"acc_stderr\": 0.03583901754736412,\n \"acc_norm\": 0.6705202312138728,\n\
\ \"acc_norm_stderr\": 0.03583901754736412\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.43137254901960786,\n \"acc_stderr\": 0.04928099597287534,\n\
\ \"acc_norm\": 0.43137254901960786,\n \"acc_norm_stderr\": 0.04928099597287534\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.79,\n \"acc_stderr\": 0.04093601807403326,\n \"acc_norm\": 0.79,\n\
\ \"acc_norm_stderr\": 0.04093601807403326\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.5957446808510638,\n \"acc_stderr\": 0.03208115750788684,\n\
\ \"acc_norm\": 0.5957446808510638,\n \"acc_norm_stderr\": 0.03208115750788684\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.5,\n\
\ \"acc_stderr\": 0.047036043419179864,\n \"acc_norm\": 0.5,\n \
\ \"acc_norm_stderr\": 0.047036043419179864\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5517241379310345,\n \"acc_stderr\": 0.04144311810878151,\n\
\ \"acc_norm\": 0.5517241379310345,\n \"acc_norm_stderr\": 0.04144311810878151\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.4021164021164021,\n \"acc_stderr\": 0.025253032554997695,\n \"\
acc_norm\": 0.4021164021164021,\n \"acc_norm_stderr\": 0.025253032554997695\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.48412698412698413,\n\
\ \"acc_stderr\": 0.04469881854072606,\n \"acc_norm\": 0.48412698412698413,\n\
\ \"acc_norm_stderr\": 0.04469881854072606\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.32,\n \"acc_stderr\": 0.046882617226215034,\n \
\ \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.046882617226215034\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\"\
: 0.8,\n \"acc_stderr\": 0.022755204959542943,\n \"acc_norm\": 0.8,\n\
\ \"acc_norm_stderr\": 0.022755204959542943\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.5123152709359606,\n \"acc_stderr\": 0.035169204442208966,\n\
\ \"acc_norm\": 0.5123152709359606,\n \"acc_norm_stderr\": 0.035169204442208966\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.72,\n \"acc_stderr\": 0.04512608598542127,\n \"acc_norm\"\
: 0.72,\n \"acc_norm_stderr\": 0.04512608598542127\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7575757575757576,\n \"acc_stderr\": 0.03346409881055953,\n\
\ \"acc_norm\": 0.7575757575757576,\n \"acc_norm_stderr\": 0.03346409881055953\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.7828282828282829,\n \"acc_stderr\": 0.02937661648494563,\n \"\
acc_norm\": 0.7828282828282829,\n \"acc_norm_stderr\": 0.02937661648494563\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.8911917098445595,\n \"acc_stderr\": 0.022473253332768766,\n\
\ \"acc_norm\": 0.8911917098445595,\n \"acc_norm_stderr\": 0.022473253332768766\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.6743589743589744,\n \"acc_stderr\": 0.02375966576741229,\n \
\ \"acc_norm\": 0.6743589743589744,\n \"acc_norm_stderr\": 0.02375966576741229\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.3333333333333333,\n \"acc_stderr\": 0.02874204090394848,\n \
\ \"acc_norm\": 0.3333333333333333,\n \"acc_norm_stderr\": 0.02874204090394848\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.7100840336134454,\n \"acc_stderr\": 0.029472485833136098,\n\
\ \"acc_norm\": 0.7100840336134454,\n \"acc_norm_stderr\": 0.029472485833136098\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.33774834437086093,\n \"acc_stderr\": 0.03861557546255169,\n \"\
acc_norm\": 0.33774834437086093,\n \"acc_norm_stderr\": 0.03861557546255169\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8477064220183487,\n \"acc_stderr\": 0.015405084393157074,\n \"\
acc_norm\": 0.8477064220183487,\n \"acc_norm_stderr\": 0.015405084393157074\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.5277777777777778,\n \"acc_stderr\": 0.0340470532865388,\n \"acc_norm\"\
: 0.5277777777777778,\n \"acc_norm_stderr\": 0.0340470532865388\n },\n\
\ \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.8284313725490197,\n\
\ \"acc_stderr\": 0.026460569561240644,\n \"acc_norm\": 0.8284313725490197,\n\
\ \"acc_norm_stderr\": 0.026460569561240644\n },\n \"harness|hendrycksTest-high_school_world_history|5\"\
: {\n \"acc\": 0.810126582278481,\n \"acc_stderr\": 0.02553010046023349,\n\
\ \"acc_norm\": 0.810126582278481,\n \"acc_norm_stderr\": 0.02553010046023349\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6995515695067265,\n\
\ \"acc_stderr\": 0.03076935200822914,\n \"acc_norm\": 0.6995515695067265,\n\
\ \"acc_norm_stderr\": 0.03076935200822914\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.7633587786259542,\n \"acc_stderr\": 0.03727673575596913,\n\
\ \"acc_norm\": 0.7633587786259542,\n \"acc_norm_stderr\": 0.03727673575596913\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.7768595041322314,\n \"acc_stderr\": 0.03800754475228733,\n \"\
acc_norm\": 0.7768595041322314,\n \"acc_norm_stderr\": 0.03800754475228733\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7962962962962963,\n\
\ \"acc_stderr\": 0.03893542518824847,\n \"acc_norm\": 0.7962962962962963,\n\
\ \"acc_norm_stderr\": 0.03893542518824847\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7852760736196319,\n \"acc_stderr\": 0.032262193772867744,\n\
\ \"acc_norm\": 0.7852760736196319,\n \"acc_norm_stderr\": 0.032262193772867744\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.45535714285714285,\n\
\ \"acc_stderr\": 0.047268355537191,\n \"acc_norm\": 0.45535714285714285,\n\
\ \"acc_norm_stderr\": 0.047268355537191\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7864077669902912,\n \"acc_stderr\": 0.040580420156460344,\n\
\ \"acc_norm\": 0.7864077669902912,\n \"acc_norm_stderr\": 0.040580420156460344\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8632478632478633,\n\
\ \"acc_stderr\": 0.022509033937077802,\n \"acc_norm\": 0.8632478632478633,\n\
\ \"acc_norm_stderr\": 0.022509033937077802\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.71,\n \"acc_stderr\": 0.045604802157206845,\n \
\ \"acc_norm\": 0.71,\n \"acc_norm_stderr\": 0.045604802157206845\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8314176245210728,\n\
\ \"acc_stderr\": 0.013387895731543604,\n \"acc_norm\": 0.8314176245210728,\n\
\ \"acc_norm_stderr\": 0.013387895731543604\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.7456647398843931,\n \"acc_stderr\": 0.023445826276545543,\n\
\ \"acc_norm\": 0.7456647398843931,\n \"acc_norm_stderr\": 0.023445826276545543\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.43687150837988825,\n\
\ \"acc_stderr\": 0.01658868086453063,\n \"acc_norm\": 0.43687150837988825,\n\
\ \"acc_norm_stderr\": 0.01658868086453063\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.7222222222222222,\n \"acc_stderr\": 0.025646863097137904,\n\
\ \"acc_norm\": 0.7222222222222222,\n \"acc_norm_stderr\": 0.025646863097137904\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6977491961414791,\n\
\ \"acc_stderr\": 0.02608270069539966,\n \"acc_norm\": 0.6977491961414791,\n\
\ \"acc_norm_stderr\": 0.02608270069539966\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.7407407407407407,\n \"acc_stderr\": 0.024383665531035454,\n\
\ \"acc_norm\": 0.7407407407407407,\n \"acc_norm_stderr\": 0.024383665531035454\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.475177304964539,\n \"acc_stderr\": 0.029790719243829727,\n \
\ \"acc_norm\": 0.475177304964539,\n \"acc_norm_stderr\": 0.029790719243829727\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4706649282920469,\n\
\ \"acc_stderr\": 0.012748238397365549,\n \"acc_norm\": 0.4706649282920469,\n\
\ \"acc_norm_stderr\": 0.012748238397365549\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.7022058823529411,\n \"acc_stderr\": 0.027778298701545443,\n\
\ \"acc_norm\": 0.7022058823529411,\n \"acc_norm_stderr\": 0.027778298701545443\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6503267973856209,\n \"acc_stderr\": 0.01929196189506638,\n \
\ \"acc_norm\": 0.6503267973856209,\n \"acc_norm_stderr\": 0.01929196189506638\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6545454545454545,\n\
\ \"acc_stderr\": 0.04554619617541054,\n \"acc_norm\": 0.6545454545454545,\n\
\ \"acc_norm_stderr\": 0.04554619617541054\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.7224489795918367,\n \"acc_stderr\": 0.028666857790274648,\n\
\ \"acc_norm\": 0.7224489795918367,\n \"acc_norm_stderr\": 0.028666857790274648\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8507462686567164,\n\
\ \"acc_stderr\": 0.025196929874827075,\n \"acc_norm\": 0.8507462686567164,\n\
\ \"acc_norm_stderr\": 0.025196929874827075\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.85,\n \"acc_stderr\": 0.0358870281282637,\n \
\ \"acc_norm\": 0.85,\n \"acc_norm_stderr\": 0.0358870281282637\n },\n\
\ \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5240963855421686,\n\
\ \"acc_stderr\": 0.03887971849597264,\n \"acc_norm\": 0.5240963855421686,\n\
\ \"acc_norm_stderr\": 0.03887971849597264\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.847953216374269,\n \"acc_stderr\": 0.027539122889061456,\n\
\ \"acc_norm\": 0.847953216374269,\n \"acc_norm_stderr\": 0.027539122889061456\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.46266829865361075,\n\
\ \"mc1_stderr\": 0.017454645150970588,\n \"mc2\": 0.6382334765973684,\n\
\ \"mc2_stderr\": 0.01550846970253108\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7995264404104183,\n \"acc_stderr\": 0.01125195828120508\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.6413949962092494,\n \
\ \"acc_stderr\": 0.013210317364134031\n }\n}\n```"
repo_url: https://huggingface.co/dillfrescott/amadeus-v0.1
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_01_06T01_28_19.231223
path:
- '**/details_harness|arc:challenge|25_2024-01-06T01-28-19.231223.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-01-06T01-28-19.231223.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_01_06T01_28_19.231223
path:
- '**/details_harness|gsm8k|5_2024-01-06T01-28-19.231223.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-01-06T01-28-19.231223.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_01_06T01_28_19.231223
path:
- '**/details_harness|hellaswag|10_2024-01-06T01-28-19.231223.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-01-06T01-28-19.231223.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_01_06T01_28_19.231223
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-06T01-28-19.231223.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-06T01-28-19.231223.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-06T01-28-19.231223.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-06T01-28-19.231223.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-06T01-28-19.231223.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-06T01-28-19.231223.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-06T01-28-19.231223.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-06T01-28-19.231223.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-06T01-28-19.231223.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-06T01-28-19.231223.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-06T01-28-19.231223.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-06T01-28-19.231223.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-06T01-28-19.231223.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-06T01-28-19.231223.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-06T01-28-19.231223.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-06T01-28-19.231223.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-06T01-28-19.231223.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-06T01-28-19.231223.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-06T01-28-19.231223.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-06T01-28-19.231223.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-06T01-28-19.231223.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-06T01-28-19.231223.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-06T01-28-19.231223.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-06T01-28-19.231223.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-06T01-28-19.231223.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-06T01-28-19.231223.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-06T01-28-19.231223.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-06T01-28-19.231223.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-06T01-28-19.231223.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-06T01-28-19.231223.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-06T01-28-19.231223.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-06T01-28-19.231223.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-06T01-28-19.231223.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-06T01-28-19.231223.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-01-06T01-28-19.231223.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-06T01-28-19.231223.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-06T01-28-19.231223.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-06T01-28-19.231223.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-01-06T01-28-19.231223.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-01-06T01-28-19.231223.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-06T01-28-19.231223.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-06T01-28-19.231223.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-06T01-28-19.231223.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-06T01-28-19.231223.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-06T01-28-19.231223.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-06T01-28-19.231223.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-06T01-28-19.231223.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-06T01-28-19.231223.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-06T01-28-19.231223.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-06T01-28-19.231223.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-06T01-28-19.231223.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-06T01-28-19.231223.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-06T01-28-19.231223.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-01-06T01-28-19.231223.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-06T01-28-19.231223.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-01-06T01-28-19.231223.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-06T01-28-19.231223.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-06T01-28-19.231223.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-06T01-28-19.231223.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-06T01-28-19.231223.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-06T01-28-19.231223.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-06T01-28-19.231223.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-06T01-28-19.231223.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-06T01-28-19.231223.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-06T01-28-19.231223.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-06T01-28-19.231223.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-06T01-28-19.231223.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-06T01-28-19.231223.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-06T01-28-19.231223.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-06T01-28-19.231223.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-06T01-28-19.231223.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-06T01-28-19.231223.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-06T01-28-19.231223.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-06T01-28-19.231223.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-06T01-28-19.231223.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-06T01-28-19.231223.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-06T01-28-19.231223.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-06T01-28-19.231223.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-06T01-28-19.231223.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-06T01-28-19.231223.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-06T01-28-19.231223.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-06T01-28-19.231223.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-06T01-28-19.231223.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-06T01-28-19.231223.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-06T01-28-19.231223.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-06T01-28-19.231223.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-06T01-28-19.231223.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-06T01-28-19.231223.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-06T01-28-19.231223.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-06T01-28-19.231223.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-06T01-28-19.231223.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-01-06T01-28-19.231223.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-06T01-28-19.231223.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-06T01-28-19.231223.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-06T01-28-19.231223.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-01-06T01-28-19.231223.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-01-06T01-28-19.231223.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-06T01-28-19.231223.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-06T01-28-19.231223.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-06T01-28-19.231223.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-06T01-28-19.231223.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-06T01-28-19.231223.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-06T01-28-19.231223.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-06T01-28-19.231223.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-06T01-28-19.231223.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-06T01-28-19.231223.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-06T01-28-19.231223.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-06T01-28-19.231223.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-06T01-28-19.231223.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-06T01-28-19.231223.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-01-06T01-28-19.231223.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-06T01-28-19.231223.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-01-06T01-28-19.231223.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-06T01-28-19.231223.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_01_06T01_28_19.231223
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-06T01-28-19.231223.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-06T01-28-19.231223.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_01_06T01_28_19.231223
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-06T01-28-19.231223.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-06T01-28-19.231223.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_01_06T01_28_19.231223
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-06T01-28-19.231223.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-06T01-28-19.231223.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_01_06T01_28_19.231223
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-06T01-28-19.231223.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-06T01-28-19.231223.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_01_06T01_28_19.231223
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-06T01-28-19.231223.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-06T01-28-19.231223.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_01_06T01_28_19.231223
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-06T01-28-19.231223.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-06T01-28-19.231223.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_01_06T01_28_19.231223
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-06T01-28-19.231223.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-06T01-28-19.231223.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_01_06T01_28_19.231223
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-06T01-28-19.231223.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-06T01-28-19.231223.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_01_06T01_28_19.231223
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-06T01-28-19.231223.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-06T01-28-19.231223.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_01_06T01_28_19.231223
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-06T01-28-19.231223.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-06T01-28-19.231223.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_01_06T01_28_19.231223
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-06T01-28-19.231223.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-06T01-28-19.231223.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_01_06T01_28_19.231223
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-06T01-28-19.231223.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-06T01-28-19.231223.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_01_06T01_28_19.231223
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-06T01-28-19.231223.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-06T01-28-19.231223.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_01_06T01_28_19.231223
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-06T01-28-19.231223.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-06T01-28-19.231223.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_01_06T01_28_19.231223
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-06T01-28-19.231223.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-06T01-28-19.231223.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_01_06T01_28_19.231223
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-06T01-28-19.231223.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-06T01-28-19.231223.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_01_06T01_28_19.231223
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-06T01-28-19.231223.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-06T01-28-19.231223.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_01_06T01_28_19.231223
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-06T01-28-19.231223.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-06T01-28-19.231223.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_01_06T01_28_19.231223
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-06T01-28-19.231223.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-06T01-28-19.231223.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_01_06T01_28_19.231223
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-06T01-28-19.231223.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-06T01-28-19.231223.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_01_06T01_28_19.231223
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-06T01-28-19.231223.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-06T01-28-19.231223.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_01_06T01_28_19.231223
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-06T01-28-19.231223.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-06T01-28-19.231223.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_01_06T01_28_19.231223
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-06T01-28-19.231223.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-06T01-28-19.231223.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_01_06T01_28_19.231223
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-06T01-28-19.231223.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-06T01-28-19.231223.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_01_06T01_28_19.231223
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-06T01-28-19.231223.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-06T01-28-19.231223.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_01_06T01_28_19.231223
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-06T01-28-19.231223.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-06T01-28-19.231223.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_01_06T01_28_19.231223
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-06T01-28-19.231223.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-06T01-28-19.231223.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_01_06T01_28_19.231223
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-06T01-28-19.231223.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-06T01-28-19.231223.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_01_06T01_28_19.231223
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-06T01-28-19.231223.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-06T01-28-19.231223.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_01_06T01_28_19.231223
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-06T01-28-19.231223.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-06T01-28-19.231223.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_01_06T01_28_19.231223
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-06T01-28-19.231223.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-06T01-28-19.231223.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_01_06T01_28_19.231223
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-06T01-28-19.231223.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-06T01-28-19.231223.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_01_06T01_28_19.231223
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-06T01-28-19.231223.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-06T01-28-19.231223.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_01_06T01_28_19.231223
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-06T01-28-19.231223.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-06T01-28-19.231223.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_01_06T01_28_19.231223
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-01-06T01-28-19.231223.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-01-06T01-28-19.231223.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_01_06T01_28_19.231223
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-06T01-28-19.231223.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-06T01-28-19.231223.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_01_06T01_28_19.231223
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-06T01-28-19.231223.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-06T01-28-19.231223.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_01_06T01_28_19.231223
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-06T01-28-19.231223.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-06T01-28-19.231223.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_01_06T01_28_19.231223
path:
- '**/details_harness|hendrycksTest-management|5_2024-01-06T01-28-19.231223.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-01-06T01-28-19.231223.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_01_06T01_28_19.231223
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-01-06T01-28-19.231223.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-01-06T01-28-19.231223.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_01_06T01_28_19.231223
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-06T01-28-19.231223.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-06T01-28-19.231223.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_01_06T01_28_19.231223
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-06T01-28-19.231223.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-06T01-28-19.231223.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_01_06T01_28_19.231223
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-06T01-28-19.231223.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-06T01-28-19.231223.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_01_06T01_28_19.231223
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-06T01-28-19.231223.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-06T01-28-19.231223.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_01_06T01_28_19.231223
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-06T01-28-19.231223.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-06T01-28-19.231223.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_01_06T01_28_19.231223
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-06T01-28-19.231223.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-06T01-28-19.231223.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_01_06T01_28_19.231223
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-06T01-28-19.231223.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-06T01-28-19.231223.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_01_06T01_28_19.231223
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-06T01-28-19.231223.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-06T01-28-19.231223.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_01_06T01_28_19.231223
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-06T01-28-19.231223.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-06T01-28-19.231223.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_01_06T01_28_19.231223
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-06T01-28-19.231223.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-06T01-28-19.231223.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_01_06T01_28_19.231223
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-06T01-28-19.231223.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-06T01-28-19.231223.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_01_06T01_28_19.231223
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-06T01-28-19.231223.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-06T01-28-19.231223.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_01_06T01_28_19.231223
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-06T01-28-19.231223.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-06T01-28-19.231223.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_01_06T01_28_19.231223
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-01-06T01-28-19.231223.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-01-06T01-28-19.231223.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_01_06T01_28_19.231223
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-06T01-28-19.231223.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-06T01-28-19.231223.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_01_06T01_28_19.231223
path:
- '**/details_harness|hendrycksTest-virology|5_2024-01-06T01-28-19.231223.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-01-06T01-28-19.231223.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_01_06T01_28_19.231223
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-06T01-28-19.231223.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-06T01-28-19.231223.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_01_06T01_28_19.231223
path:
- '**/details_harness|truthfulqa:mc|0_2024-01-06T01-28-19.231223.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-01-06T01-28-19.231223.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_01_06T01_28_19.231223
path:
- '**/details_harness|winogrande|5_2024-01-06T01-28-19.231223.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-01-06T01-28-19.231223.parquet'
- config_name: results
data_files:
- split: 2024_01_06T01_28_19.231223
path:
- results_2024-01-06T01-28-19.231223.parquet
- split: latest
path:
- results_2024-01-06T01-28-19.231223.parquet
---
# Dataset Card for Evaluation run of dillfrescott/amadeus-v0.1
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [dillfrescott/amadeus-v0.1](https://huggingface.co/dillfrescott/amadeus-v0.1) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_dillfrescott__amadeus-v0.1",
"harness_winogrande_5",
split="train")
```
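Similarly, the aggregated results of the run can be loaded through the `results` configuration (a minimal sketch, based on the `results` config and its `latest` split declared in the YAML header above):
```python
from datasets import load_dataset

# load the aggregated results of the latest run
results = load_dataset(
    "open-llm-leaderboard/details_dillfrescott__amadeus-v0.1",
    "results",
    split="latest",
)
```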
## Latest results
These are the [latest results from run 2024-01-06T01:28:19.231223](https://huggingface.co/datasets/open-llm-leaderboard/details_dillfrescott__amadeus-v0.1/blob/main/results_2024-01-06T01-28-19.231223.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each in the results and in the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6503116864878898,
"acc_stderr": 0.03211845742165151,
"acc_norm": 0.6514191578913137,
"acc_norm_stderr": 0.032765902396781794,
"mc1": 0.46266829865361075,
"mc1_stderr": 0.017454645150970588,
"mc2": 0.6382334765973684,
"mc2_stderr": 0.01550846970253108
},
"harness|arc:challenge|25": {
"acc": 0.6578498293515358,
"acc_stderr": 0.013864152159177278,
"acc_norm": 0.689419795221843,
"acc_norm_stderr": 0.013522292098053069
},
"harness|hellaswag|10": {
"acc": 0.6957777335192192,
"acc_stderr": 0.004591369853276529,
"acc_norm": 0.8698466440948018,
"acc_norm_stderr": 0.0033578442491239546
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.35,
"acc_stderr": 0.0479372485441102,
"acc_norm": 0.35,
"acc_norm_stderr": 0.0479372485441102
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6296296296296297,
"acc_stderr": 0.041716541613545426,
"acc_norm": 0.6296296296296297,
"acc_norm_stderr": 0.041716541613545426
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6907894736842105,
"acc_stderr": 0.037610708698674805,
"acc_norm": 0.6907894736842105,
"acc_norm_stderr": 0.037610708698674805
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.59,
"acc_stderr": 0.049431107042371025,
"acc_norm": 0.59,
"acc_norm_stderr": 0.049431107042371025
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.7018867924528301,
"acc_stderr": 0.02815283794249386,
"acc_norm": 0.7018867924528301,
"acc_norm_stderr": 0.02815283794249386
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7638888888888888,
"acc_stderr": 0.03551446610810826,
"acc_norm": 0.7638888888888888,
"acc_norm_stderr": 0.03551446610810826
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.47,
"acc_stderr": 0.050161355804659205,
"acc_norm": 0.47,
"acc_norm_stderr": 0.050161355804659205
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.5,
"acc_stderr": 0.050251890762960605,
"acc_norm": 0.5,
"acc_norm_stderr": 0.050251890762960605
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.34,
"acc_stderr": 0.047609522856952344,
"acc_norm": 0.34,
"acc_norm_stderr": 0.047609522856952344
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6705202312138728,
"acc_stderr": 0.03583901754736412,
"acc_norm": 0.6705202312138728,
"acc_norm_stderr": 0.03583901754736412
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.43137254901960786,
"acc_stderr": 0.04928099597287534,
"acc_norm": 0.43137254901960786,
"acc_norm_stderr": 0.04928099597287534
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.79,
"acc_stderr": 0.04093601807403326,
"acc_norm": 0.79,
"acc_norm_stderr": 0.04093601807403326
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5957446808510638,
"acc_stderr": 0.03208115750788684,
"acc_norm": 0.5957446808510638,
"acc_norm_stderr": 0.03208115750788684
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.5,
"acc_stderr": 0.047036043419179864,
"acc_norm": 0.5,
"acc_norm_stderr": 0.047036043419179864
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5517241379310345,
"acc_stderr": 0.04144311810878151,
"acc_norm": 0.5517241379310345,
"acc_norm_stderr": 0.04144311810878151
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.4021164021164021,
"acc_stderr": 0.025253032554997695,
"acc_norm": 0.4021164021164021,
"acc_norm_stderr": 0.025253032554997695
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.48412698412698413,
"acc_stderr": 0.04469881854072606,
"acc_norm": 0.48412698412698413,
"acc_norm_stderr": 0.04469881854072606
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.32,
"acc_stderr": 0.046882617226215034,
"acc_norm": 0.32,
"acc_norm_stderr": 0.046882617226215034
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.8,
"acc_stderr": 0.022755204959542943,
"acc_norm": 0.8,
"acc_norm_stderr": 0.022755204959542943
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5123152709359606,
"acc_stderr": 0.035169204442208966,
"acc_norm": 0.5123152709359606,
"acc_norm_stderr": 0.035169204442208966
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.72,
"acc_stderr": 0.04512608598542127,
"acc_norm": 0.72,
"acc_norm_stderr": 0.04512608598542127
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7575757575757576,
"acc_stderr": 0.03346409881055953,
"acc_norm": 0.7575757575757576,
"acc_norm_stderr": 0.03346409881055953
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7828282828282829,
"acc_stderr": 0.02937661648494563,
"acc_norm": 0.7828282828282829,
"acc_norm_stderr": 0.02937661648494563
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8911917098445595,
"acc_stderr": 0.022473253332768766,
"acc_norm": 0.8911917098445595,
"acc_norm_stderr": 0.022473253332768766
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6743589743589744,
"acc_stderr": 0.02375966576741229,
"acc_norm": 0.6743589743589744,
"acc_norm_stderr": 0.02375966576741229
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.3333333333333333,
"acc_stderr": 0.02874204090394848,
"acc_norm": 0.3333333333333333,
"acc_norm_stderr": 0.02874204090394848
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.7100840336134454,
"acc_stderr": 0.029472485833136098,
"acc_norm": 0.7100840336134454,
"acc_norm_stderr": 0.029472485833136098
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.33774834437086093,
"acc_stderr": 0.03861557546255169,
"acc_norm": 0.33774834437086093,
"acc_norm_stderr": 0.03861557546255169
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8477064220183487,
"acc_stderr": 0.015405084393157074,
"acc_norm": 0.8477064220183487,
"acc_norm_stderr": 0.015405084393157074
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5277777777777778,
"acc_stderr": 0.0340470532865388,
"acc_norm": 0.5277777777777778,
"acc_norm_stderr": 0.0340470532865388
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8284313725490197,
"acc_stderr": 0.026460569561240644,
"acc_norm": 0.8284313725490197,
"acc_norm_stderr": 0.026460569561240644
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.810126582278481,
"acc_stderr": 0.02553010046023349,
"acc_norm": 0.810126582278481,
"acc_norm_stderr": 0.02553010046023349
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6995515695067265,
"acc_stderr": 0.03076935200822914,
"acc_norm": 0.6995515695067265,
"acc_norm_stderr": 0.03076935200822914
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7633587786259542,
"acc_stderr": 0.03727673575596913,
"acc_norm": 0.7633587786259542,
"acc_norm_stderr": 0.03727673575596913
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7768595041322314,
"acc_stderr": 0.03800754475228733,
"acc_norm": 0.7768595041322314,
"acc_norm_stderr": 0.03800754475228733
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7962962962962963,
"acc_stderr": 0.03893542518824847,
"acc_norm": 0.7962962962962963,
"acc_norm_stderr": 0.03893542518824847
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7852760736196319,
"acc_stderr": 0.032262193772867744,
"acc_norm": 0.7852760736196319,
"acc_norm_stderr": 0.032262193772867744
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.45535714285714285,
"acc_stderr": 0.047268355537191,
"acc_norm": 0.45535714285714285,
"acc_norm_stderr": 0.047268355537191
},
"harness|hendrycksTest-management|5": {
"acc": 0.7864077669902912,
"acc_stderr": 0.040580420156460344,
"acc_norm": 0.7864077669902912,
"acc_norm_stderr": 0.040580420156460344
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8632478632478633,
"acc_stderr": 0.022509033937077802,
"acc_norm": 0.8632478632478633,
"acc_norm_stderr": 0.022509033937077802
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.71,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.71,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8314176245210728,
"acc_stderr": 0.013387895731543604,
"acc_norm": 0.8314176245210728,
"acc_norm_stderr": 0.013387895731543604
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7456647398843931,
"acc_stderr": 0.023445826276545543,
"acc_norm": 0.7456647398843931,
"acc_norm_stderr": 0.023445826276545543
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.43687150837988825,
"acc_stderr": 0.01658868086453063,
"acc_norm": 0.43687150837988825,
"acc_norm_stderr": 0.01658868086453063
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7222222222222222,
"acc_stderr": 0.025646863097137904,
"acc_norm": 0.7222222222222222,
"acc_norm_stderr": 0.025646863097137904
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6977491961414791,
"acc_stderr": 0.02608270069539966,
"acc_norm": 0.6977491961414791,
"acc_norm_stderr": 0.02608270069539966
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7407407407407407,
"acc_stderr": 0.024383665531035454,
"acc_norm": 0.7407407407407407,
"acc_norm_stderr": 0.024383665531035454
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.475177304964539,
"acc_stderr": 0.029790719243829727,
"acc_norm": 0.475177304964539,
"acc_norm_stderr": 0.029790719243829727
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4706649282920469,
"acc_stderr": 0.012748238397365549,
"acc_norm": 0.4706649282920469,
"acc_norm_stderr": 0.012748238397365549
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.7022058823529411,
"acc_stderr": 0.027778298701545443,
"acc_norm": 0.7022058823529411,
"acc_norm_stderr": 0.027778298701545443
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6503267973856209,
"acc_stderr": 0.01929196189506638,
"acc_norm": 0.6503267973856209,
"acc_norm_stderr": 0.01929196189506638
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6545454545454545,
"acc_stderr": 0.04554619617541054,
"acc_norm": 0.6545454545454545,
"acc_norm_stderr": 0.04554619617541054
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7224489795918367,
"acc_stderr": 0.028666857790274648,
"acc_norm": 0.7224489795918367,
"acc_norm_stderr": 0.028666857790274648
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8507462686567164,
"acc_stderr": 0.025196929874827075,
"acc_norm": 0.8507462686567164,
"acc_norm_stderr": 0.025196929874827075
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.85,
"acc_stderr": 0.0358870281282637,
"acc_norm": 0.85,
"acc_norm_stderr": 0.0358870281282637
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5240963855421686,
"acc_stderr": 0.03887971849597264,
"acc_norm": 0.5240963855421686,
"acc_norm_stderr": 0.03887971849597264
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.847953216374269,
"acc_stderr": 0.027539122889061456,
"acc_norm": 0.847953216374269,
"acc_norm_stderr": 0.027539122889061456
},
"harness|truthfulqa:mc|0": {
"mc1": 0.46266829865361075,
"mc1_stderr": 0.017454645150970588,
"mc2": 0.6382334765973684,
"mc2_stderr": 0.01550846970253108
},
"harness|winogrande|5": {
"acc": 0.7995264404104183,
"acc_stderr": 0.01125195828120508
},
"harness|gsm8k|5": {
"acc": 0.6413949962092494,
"acc_stderr": 0.013210317364134031
}
}
```
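The raw results JSON linked above can also be fetched as a file (a minimal sketch using `huggingface_hub`; the filename matches the `results` entry in the YAML header):
```python
import json

from huggingface_hub import hf_hub_download

# download the raw results JSON for this run
path = hf_hub_download(
    repo_id="open-llm-leaderboard/details_dillfrescott__amadeus-v0.1",
    repo_type="dataset",
    filename="results_2024-01-06T01-28-19.231223.json",
)

with open(path) as f:
    results = json.load(f)

print(results["all"]["acc"])  # 0.6503116864878898
```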
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
ctu-aic/qacg-sum | ---
dataset_info:
- config_name: balanced
features:
- name: claim
dtype: string
- name: label
dtype: string
- name: evidence
sequence: string
- name: lang
dtype: string
- name: orig_idx
dtype: int64
splits:
- name: train
num_bytes: 130783710
num_examples: 1180836
- name: validation
num_bytes: 13391571
num_examples: 120348
- name: test
num_bytes: 12599211
num_examples: 113760
download_size: 114959179
dataset_size: 156774492
- config_name: balanced_shuf
features:
- name: claim
dtype: string
- name: label
dtype: string
- name: evidence
sequence: string
- name: lang
dtype: string
- name: orig_idx
dtype: int64
splits:
- name: train
num_bytes: 81658504
num_examples: 741542
- name: validation
num_bytes: 8349339
num_examples: 75573
- name: test
num_bytes: 7871047
num_examples: 71607
download_size: 71188503
dataset_size: 97878890
configs:
- config_name: balanced
data_files:
- split: train
path: balanced/train-*
- split: validation
path: balanced/validation-*
- split: test
path: balanced/test-*
- config_name: balanced_shuf
data_files:
- split: train
path: balanced_shuf/train-*
- split: validation
path: balanced_shuf/validation-*
- split: test
path: balanced_shuf/test-*
---
|
Nexdata/Mandarin_Speech_by_Mobile_Phone | ---
YAML tags:
- copy-paste the tags obtained with the tagging app: https://github.com/huggingface/datasets-tagging
---
# Dataset Card for Nexdata/Mandarin_Speech_by_Mobile_Phone
## Table of Contents
- [Table of Contents](#table-of-contents)
- [Dataset Description](#dataset-description)
- [Dataset Summary](#dataset-summary)
- [Supported Tasks and Leaderboards](#supported-tasks-and-leaderboards)
- [Languages](#languages)
- [Dataset Structure](#dataset-structure)
- [Data Instances](#data-instances)
- [Data Fields](#data-fields)
- [Data Splits](#data-splits)
- [Dataset Creation](#dataset-creation)
- [Curation Rationale](#curation-rationale)
- [Source Data](#source-data)
- [Annotations](#annotations)
- [Personal and Sensitive Information](#personal-and-sensitive-information)
- [Considerations for Using the Data](#considerations-for-using-the-data)
- [Social Impact of Dataset](#social-impact-of-dataset)
- [Discussion of Biases](#discussion-of-biases)
- [Other Known Limitations](#other-known-limitations)
- [Additional Information](#additional-information)
- [Dataset Curators](#dataset-curators)
- [Licensing Information](#licensing-information)
- [Citation Information](#citation-information)
- [Contributions](#contributions)
## Dataset Description
- **Homepage:** https://www.nexdata.ai/datasets/35?source=Huggingface
- **Repository:**
- **Paper:**
- **Leaderboard:**
- **Point of Contact:**
### Dataset Summary
It collects data from 6,278 speakers across 33 provinces of China: 2,980 males and 3,298 females. The recording contents are commonly used colloquial sentences, recorded in both quiet and noisy environments. Annotated texts are transcribed and proofread by professional annotators; the accuracy is not less than 98%.
For more details, please refer to the link: https://www.nexdata.ai/datasets/35?source=Huggingface
### Supported Tasks and Leaderboards
automatic-speech-recognition, audio-speaker-identification: The dataset can be used to train models for Automatic Speech Recognition (ASR) and speaker identification.
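For instance, once the audio and transcripts have been obtained, they can be assembled into a `datasets` audio dataset for ASR training (a minimal sketch; the file paths and sentences below are hypothetical):
```python
# a minimal sketch, assuming the purchased audio/transcript pairs
# have been extracted into a local folder (paths are hypothetical)
from datasets import Audio, Dataset

data = Dataset.from_dict(
    {
        "audio": ["clips/0001.wav", "clips/0002.wav"],
        "text": ["你好,很高兴认识你", "谢谢你的帮助"],
    }
).cast_column("audio", Audio(sampling_rate=16_000))

# each row now decodes to a waveform array plus its transcript
print(data[0]["audio"]["sampling_rate"], data[0]["text"])
```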
### Languages
Mandarin
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
Commercial License: https://drive.google.com/file/d/1saDCPm74D4UWfBL17VbkTsZLGfpOQj1J/view?usp=sharing
### Citation Information
[More Information Needed]
### Contributions
|
CyberHarem/ayase_arisa_lovelive | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of ayase_arisa/絢瀬亜里沙 (Love Live!)
This is the dataset of ayase_arisa/絢瀬亜里沙 (Love Live!), containing 163 images and their tags.
The core tags of this character are `blonde_hair, blue_eyes, long_hair, hair_ornament, hairclip`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs)([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:----------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 163 | 120.32 MiB | [Download](https://huggingface.co/datasets/CyberHarem/ayase_arisa_lovelive/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 163 | 95.27 MiB | [Download](https://huggingface.co/datasets/CyberHarem/ayase_arisa_lovelive/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 334 | 174.14 MiB | [Download](https://huggingface.co/datasets/CyberHarem/ayase_arisa_lovelive/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 163 | 115.17 MiB | [Download](https://huggingface.co/datasets/CyberHarem/ayase_arisa_lovelive/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 334 | 204.05 MiB | [Download](https://huggingface.co/datasets/CyberHarem/ayase_arisa_lovelive/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need it, just run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/ayase_arisa_lovelive',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering results; some outfits may be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:---------------------------------------------------------------------------------------------------------------|
| 0 | 5 |  |  |  |  |  | 1girl, blush, looking_at_viewer, open_mouth, serafuku, skirt, solo, simple_background, white_background, smile |
| 1 | 9 |  |  |  |  |  | 2girls, blush, open_mouth, skirt, serafuku, :d, solo_focus |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | blush | looking_at_viewer | open_mouth | serafuku | skirt | solo | simple_background | white_background | smile | 2girls | :d | solo_focus |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:--------|:--------------------|:-------------|:-----------|:--------|:-------|:--------------------|:-------------------|:--------|:---------|:-----|:-------------|
| 0 | 5 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | | | |
| 1 | 9 |  |  |  |  |  | | X | | X | X | X | | | | | X | X | X |
|
nidnoiewoifehw/yocleash | ---
license: gpl-3.0
---
|
nilekhet/Spectrum-Dataset | ---
license: wtfpl
---
# Dataset Card: Spectrum-Dataset 🌈
* 🌐 Source: [nilekhet/Spectrum · Hugging Face](https://huggingface.co/nilekhet/Spectrum)
* 📁 Supplementary Dataset: Spectrum-Dataset 🌟
* 🔗 Associated Model: Spectrum Model 🧬
## 🔍 bengin_generator.py 👨💻
* 📂 Recursively walks through folders
* 🚫 Skips unallowed items
* 🔄 Copies .exe files to destination folder
## 🔍 malfamily.py 👩💻
* 🌐 Scrapes malware family links
* 📥 Downloads and organizes malware samples
* 🗂️ Saves data as a .csv file
## 🔍 Rust code for image generation 🎨
* 🌐 GitHub: https://github.com/nileshkhetrapal/spectrum
* 🖼️ Generates images from the code
## 🎯 Intended Use of the Model 🌟
* 💻🔧 Classify malware based on input images
* 🛡️💻 Improve computer and network security
* 🌐 Help with malware detection and prevention
# 📊 Number of Classes: 1️⃣1️⃣9️⃣
* 🦠 Includes benign class |
joey234/mmlu-conceptual_physics-neg-prepend-verbal | ---
configs:
- config_name: default
data_files:
- split: dev
path: data/dev-*
- split: test
path: data/test-*
dataset_info:
features:
- name: question
dtype: string
- name: choices
sequence: string
- name: answer
dtype:
class_label:
names:
'0': A
'1': B
'2': C
'3': D
- name: negate_openai_prompt
struct:
- name: content
dtype: string
- name: role
dtype: string
- name: neg_question
dtype: string
- name: fewshot_context
dtype: string
- name: ori_prompt
dtype: string
- name: fewshot_context_neg
dtype: string
- name: fewshot_context_ori
dtype: string
- name: neg_prompt
dtype: string
splits:
- name: dev
num_bytes: 5977
num_examples: 5
- name: test
num_bytes: 1347765
num_examples: 235
download_size: 155122
dataset_size: 1353742
---
# Dataset Card for "mmlu-conceptual_physics-neg-prepend-verbal"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
rajendrabaskota/hc3-wiki-cleaned-text-for-domain-classification-roberta-tokenized-max-len-512 | ---
dataset_info:
features:
- name: prompt
dtype: string
- name: text
dtype: string
- name: source
dtype: int64
- name: human/ai
dtype: int64
- name: perplexity
dtype: float64
- name: cleaned_text
dtype: string
- name: input_ids
sequence: int32
- name: attention_mask
sequence: int8
splits:
- name: train
num_bytes: 845606936
num_examples: 330345
- name: test
num_bytes: 44570090
num_examples: 17387
download_size: 499405861
dataset_size: 890177026
---
# Dataset Card for "hc3-wiki-cleaned-text-for-domain-classification-roberta-tokenized-max-len-512"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Isamu136/chess-gpt-data | ---
license: apache-2.0
---
|
dsupa/hack5-IQ-HP | ---
dataset_info:
features:
- name: image
dtype: image
- name: label
dtype:
class_label:
names:
'0': '0'
'1': '1'
'2': '2'
'3': '3'
'4': '4'
'5': '5'
'6': '6'
splits:
- name: train
num_bytes: 2171810.0
num_examples: 647
download_size: 1814705
dataset_size: 2171810.0
---
# Dataset Card for "hack5-IQ-HP"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
BitTranslate/chatgpt-prompts-Ukrainian | ---
license: cc0-1.0
language:
- uk
tags:
- ChatGPT
--- |
Joe02/quinn_refs | ---
license: other
---
|
liuyanchen1015/VALUE_stsb_dey_it | ---
dataset_info:
features:
- name: sentence1
dtype: string
- name: sentence2
dtype: string
- name: label
dtype: float64
- name: idx
dtype: int64
- name: value_score
dtype: int64
splits:
- name: dev
num_bytes: 13139
num_examples: 69
- name: test
num_bytes: 6243
num_examples: 48
- name: train
num_bytes: 7725
num_examples: 40
download_size: 27352
dataset_size: 27107
---
# Dataset Card for "VALUE_stsb_dey_it"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
adamjweintraut/bart-finetuned-eli5_lfqa_best_slice-256_2023-12-10_run | ---
dataset_info:
features:
- name: index
dtype: int64
- name: q_id
dtype: string
- name: question
dtype: string
- name: context
dtype: string
- name: target
dtype: string
- name: predicted
dtype: string
- name: label
dtype: string
- name: rougeL_P
dtype: float64
- name: rougeL_R
dtype: float64
- name: rougeL_F
dtype: float64
- name: Cosine_Sim
dtype: float64
- name: nli-roberta_label
dtype: string
- name: nli-roberta_plot_vals
dtype: int64
- name: nli-roberta-max-score
dtype: float64
- name: sent_sim
dtype: float32
- name: context_answer_sim
dtype: float32
- name: rougeL_min_precision
dtype: float64
- name: rougeL_min_recall
dtype: float64
- name: rougeL_min_fmeasure
dtype: float64
- name: rougeL_median_precision
dtype: float64
- name: rougeL_median_recall
dtype: float64
- name: rougeL_median_fmeasure
dtype: float64
- name: rougeL_max_precision
dtype: float64
- name: rougeL_max_recall
dtype: float64
- name: rougeL_max_fmeasure
dtype: float64
- name: context_predicted_sim
dtype: float32
- name: context_label_sim
dtype: float32
- name: predicted_label_sim
dtype: float32
- name: nli_context_predicted_label
dtype: string
- name: nli_context_predicted_plots
dtype: int64
splits:
- name: train
num_bytes: 8354389
num_examples: 1250
download_size: 5113622
dataset_size: 8354389
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
open-llm-leaderboard/details_jan-hq__trinity-v1 | ---
pretty_name: Evaluation run of jan-hq/trinity-v1
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [jan-hq/trinity-v1](https://huggingface.co/jan-hq/trinity-v1) on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_jan-hq__trinity-v1\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-12-16T19:24:08.553660](https://huggingface.co/datasets/open-llm-leaderboard/details_jan-hq__trinity-v1/blob/main/results_2023-12-16T19-24-08.553660.json)\
\ (note that there might be results for other tasks in the repo if successive evals didn't\
\ cover the same tasks; you can find each in the results and in the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6575877329335247,\n\
\ \"acc_stderr\": 0.031985421208388404,\n \"acc_norm\": 0.6571647268300141,\n\
\ \"acc_norm_stderr\": 0.032648337921958155,\n \"mc1\": 0.5507955936352509,\n\
\ \"mc1_stderr\": 0.01741294198611529,\n \"mc2\": 0.6931209356367747,\n\
\ \"mc2_stderr\": 0.015031530031665238\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.6988054607508533,\n \"acc_stderr\": 0.013406741767847632,\n\
\ \"acc_norm\": 0.7226962457337884,\n \"acc_norm_stderr\": 0.013082095839059376\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.711113324039036,\n\
\ \"acc_stderr\": 0.004523188431142894,\n \"acc_norm\": 0.8835889265086636,\n\
\ \"acc_norm_stderr\": 0.0032006176493464752\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.33,\n \"acc_stderr\": 0.04725815626252605,\n \
\ \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.04725815626252605\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6592592592592592,\n\
\ \"acc_stderr\": 0.040943762699967926,\n \"acc_norm\": 0.6592592592592592,\n\
\ \"acc_norm_stderr\": 0.040943762699967926\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.6907894736842105,\n \"acc_stderr\": 0.037610708698674805,\n\
\ \"acc_norm\": 0.6907894736842105,\n \"acc_norm_stderr\": 0.037610708698674805\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.62,\n\
\ \"acc_stderr\": 0.048783173121456316,\n \"acc_norm\": 0.62,\n \
\ \"acc_norm_stderr\": 0.048783173121456316\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.720754716981132,\n \"acc_stderr\": 0.027611163402399715,\n\
\ \"acc_norm\": 0.720754716981132,\n \"acc_norm_stderr\": 0.027611163402399715\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7708333333333334,\n\
\ \"acc_stderr\": 0.03514697467862388,\n \"acc_norm\": 0.7708333333333334,\n\
\ \"acc_norm_stderr\": 0.03514697467862388\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.47,\n \"acc_stderr\": 0.050161355804659205,\n \
\ \"acc_norm\": 0.47,\n \"acc_norm_stderr\": 0.050161355804659205\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"\
acc\": 0.56,\n \"acc_stderr\": 0.049888765156985884,\n \"acc_norm\"\
: 0.56,\n \"acc_norm_stderr\": 0.049888765156985884\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \
\ \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6994219653179191,\n\
\ \"acc_stderr\": 0.03496101481191179,\n \"acc_norm\": 0.6994219653179191,\n\
\ \"acc_norm_stderr\": 0.03496101481191179\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.46078431372549017,\n \"acc_stderr\": 0.04959859966384181,\n\
\ \"acc_norm\": 0.46078431372549017,\n \"acc_norm_stderr\": 0.04959859966384181\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.77,\n \"acc_stderr\": 0.042295258468165065,\n \"acc_norm\": 0.77,\n\
\ \"acc_norm_stderr\": 0.042295258468165065\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.5787234042553191,\n \"acc_stderr\": 0.03227834510146268,\n\
\ \"acc_norm\": 0.5787234042553191,\n \"acc_norm_stderr\": 0.03227834510146268\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.5087719298245614,\n\
\ \"acc_stderr\": 0.04702880432049615,\n \"acc_norm\": 0.5087719298245614,\n\
\ \"acc_norm_stderr\": 0.04702880432049615\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5655172413793104,\n \"acc_stderr\": 0.04130740879555498,\n\
\ \"acc_norm\": 0.5655172413793104,\n \"acc_norm_stderr\": 0.04130740879555498\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.42328042328042326,\n \"acc_stderr\": 0.02544636563440678,\n \"\
acc_norm\": 0.42328042328042326,\n \"acc_norm_stderr\": 0.02544636563440678\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.46825396825396826,\n\
\ \"acc_stderr\": 0.04463112720677172,\n \"acc_norm\": 0.46825396825396826,\n\
\ \"acc_norm_stderr\": 0.04463112720677172\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.33,\n \"acc_stderr\": 0.04725815626252604,\n \
\ \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.04725815626252604\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7806451612903226,\n\
\ \"acc_stderr\": 0.023540799358723295,\n \"acc_norm\": 0.7806451612903226,\n\
\ \"acc_norm_stderr\": 0.023540799358723295\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.5320197044334976,\n \"acc_stderr\": 0.03510766597959215,\n\
\ \"acc_norm\": 0.5320197044334976,\n \"acc_norm_stderr\": 0.03510766597959215\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.7,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\"\
: 0.7,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7757575757575758,\n \"acc_stderr\": 0.03256866661681102,\n\
\ \"acc_norm\": 0.7757575757575758,\n \"acc_norm_stderr\": 0.03256866661681102\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.7878787878787878,\n \"acc_stderr\": 0.029126522834586815,\n \"\
acc_norm\": 0.7878787878787878,\n \"acc_norm_stderr\": 0.029126522834586815\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.9015544041450777,\n \"acc_stderr\": 0.02150024957603348,\n\
\ \"acc_norm\": 0.9015544041450777,\n \"acc_norm_stderr\": 0.02150024957603348\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.6666666666666666,\n \"acc_stderr\": 0.023901157979402538,\n\
\ \"acc_norm\": 0.6666666666666666,\n \"acc_norm_stderr\": 0.023901157979402538\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.34074074074074073,\n \"acc_stderr\": 0.028897748741131154,\n \
\ \"acc_norm\": 0.34074074074074073,\n \"acc_norm_stderr\": 0.028897748741131154\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.6722689075630253,\n \"acc_stderr\": 0.03048991141767323,\n \
\ \"acc_norm\": 0.6722689075630253,\n \"acc_norm_stderr\": 0.03048991141767323\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.3509933774834437,\n \"acc_stderr\": 0.03896981964257375,\n \"\
acc_norm\": 0.3509933774834437,\n \"acc_norm_stderr\": 0.03896981964257375\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8532110091743119,\n \"acc_stderr\": 0.01517314184512625,\n \"\
acc_norm\": 0.8532110091743119,\n \"acc_norm_stderr\": 0.01517314184512625\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.5416666666666666,\n \"acc_stderr\": 0.03398110890294636,\n \"\
acc_norm\": 0.5416666666666666,\n \"acc_norm_stderr\": 0.03398110890294636\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.8333333333333334,\n \"acc_stderr\": 0.026156867523931045,\n \"\
acc_norm\": 0.8333333333333334,\n \"acc_norm_stderr\": 0.026156867523931045\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.8059071729957806,\n \"acc_stderr\": 0.025744902532290916,\n \
\ \"acc_norm\": 0.8059071729957806,\n \"acc_norm_stderr\": 0.025744902532290916\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6905829596412556,\n\
\ \"acc_stderr\": 0.031024411740572213,\n \"acc_norm\": 0.6905829596412556,\n\
\ \"acc_norm_stderr\": 0.031024411740572213\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.8244274809160306,\n \"acc_stderr\": 0.033368203384760736,\n\
\ \"acc_norm\": 0.8244274809160306,\n \"acc_norm_stderr\": 0.033368203384760736\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.8016528925619835,\n \"acc_stderr\": 0.03640118271990947,\n \"\
acc_norm\": 0.8016528925619835,\n \"acc_norm_stderr\": 0.03640118271990947\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7777777777777778,\n\
\ \"acc_stderr\": 0.040191074725573483,\n \"acc_norm\": 0.7777777777777778,\n\
\ \"acc_norm_stderr\": 0.040191074725573483\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7791411042944786,\n \"acc_stderr\": 0.03259177392742179,\n\
\ \"acc_norm\": 0.7791411042944786,\n \"acc_norm_stderr\": 0.03259177392742179\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.41964285714285715,\n\
\ \"acc_stderr\": 0.046840993210771065,\n \"acc_norm\": 0.41964285714285715,\n\
\ \"acc_norm_stderr\": 0.046840993210771065\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7572815533980582,\n \"acc_stderr\": 0.04245022486384495,\n\
\ \"acc_norm\": 0.7572815533980582,\n \"acc_norm_stderr\": 0.04245022486384495\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8803418803418803,\n\
\ \"acc_stderr\": 0.021262719400406957,\n \"acc_norm\": 0.8803418803418803,\n\
\ \"acc_norm_stderr\": 0.021262719400406957\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.71,\n \"acc_stderr\": 0.045604802157206845,\n \
\ \"acc_norm\": 0.71,\n \"acc_norm_stderr\": 0.045604802157206845\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8352490421455939,\n\
\ \"acc_stderr\": 0.013265346261323797,\n \"acc_norm\": 0.8352490421455939,\n\
\ \"acc_norm_stderr\": 0.013265346261323797\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.7398843930635838,\n \"acc_stderr\": 0.023618678310069356,\n\
\ \"acc_norm\": 0.7398843930635838,\n \"acc_norm_stderr\": 0.023618678310069356\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.4759776536312849,\n\
\ \"acc_stderr\": 0.016703190189300186,\n \"acc_norm\": 0.4759776536312849,\n\
\ \"acc_norm_stderr\": 0.016703190189300186\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.7222222222222222,\n \"acc_stderr\": 0.025646863097137897,\n\
\ \"acc_norm\": 0.7222222222222222,\n \"acc_norm_stderr\": 0.025646863097137897\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7170418006430869,\n\
\ \"acc_stderr\": 0.02558306248998481,\n \"acc_norm\": 0.7170418006430869,\n\
\ \"acc_norm_stderr\": 0.02558306248998481\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.7469135802469136,\n \"acc_stderr\": 0.024191808600712995,\n\
\ \"acc_norm\": 0.7469135802469136,\n \"acc_norm_stderr\": 0.024191808600712995\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.48936170212765956,\n \"acc_stderr\": 0.029820747191422473,\n \
\ \"acc_norm\": 0.48936170212765956,\n \"acc_norm_stderr\": 0.029820747191422473\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.47392438070404175,\n\
\ \"acc_stderr\": 0.012752858346533131,\n \"acc_norm\": 0.47392438070404175,\n\
\ \"acc_norm_stderr\": 0.012752858346533131\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.6875,\n \"acc_stderr\": 0.02815637344037142,\n \
\ \"acc_norm\": 0.6875,\n \"acc_norm_stderr\": 0.02815637344037142\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6715686274509803,\n \"acc_stderr\": 0.018999707383162673,\n \
\ \"acc_norm\": 0.6715686274509803,\n \"acc_norm_stderr\": 0.018999707383162673\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6909090909090909,\n\
\ \"acc_stderr\": 0.044262946482000985,\n \"acc_norm\": 0.6909090909090909,\n\
\ \"acc_norm_stderr\": 0.044262946482000985\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.7224489795918367,\n \"acc_stderr\": 0.028666857790274648,\n\
\ \"acc_norm\": 0.7224489795918367,\n \"acc_norm_stderr\": 0.028666857790274648\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8407960199004975,\n\
\ \"acc_stderr\": 0.025870646766169136,\n \"acc_norm\": 0.8407960199004975,\n\
\ \"acc_norm_stderr\": 0.025870646766169136\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.86,\n \"acc_stderr\": 0.0348735088019777,\n \
\ \"acc_norm\": 0.86,\n \"acc_norm_stderr\": 0.0348735088019777\n },\n\
\ \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5481927710843374,\n\
\ \"acc_stderr\": 0.03874371556587953,\n \"acc_norm\": 0.5481927710843374,\n\
\ \"acc_norm_stderr\": 0.03874371556587953\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8245614035087719,\n \"acc_stderr\": 0.029170885500727665,\n\
\ \"acc_norm\": 0.8245614035087719,\n \"acc_norm_stderr\": 0.029170885500727665\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.5507955936352509,\n\
\ \"mc1_stderr\": 0.01741294198611529,\n \"mc2\": 0.6931209356367747,\n\
\ \"mc2_stderr\": 0.015031530031665238\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.8200473559589582,\n \"acc_stderr\": 0.01079646868806868\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.7164518574677786,\n \
\ \"acc_stderr\": 0.012415070917508124\n }\n}\n```"
repo_url: https://huggingface.co/jan-hq/trinity-v1
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_12_16T19_24_08.553660
path:
- '**/details_harness|arc:challenge|25_2023-12-16T19-24-08.553660.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-12-16T19-24-08.553660.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_12_16T19_24_08.553660
path:
- '**/details_harness|gsm8k|5_2023-12-16T19-24-08.553660.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-12-16T19-24-08.553660.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_12_16T19_24_08.553660
path:
- '**/details_harness|hellaswag|10_2023-12-16T19-24-08.553660.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-12-16T19-24-08.553660.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_12_16T19_24_08.553660
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-16T19-24-08.553660.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-16T19-24-08.553660.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-16T19-24-08.553660.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-16T19-24-08.553660.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-16T19-24-08.553660.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-16T19-24-08.553660.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-16T19-24-08.553660.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-16T19-24-08.553660.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-16T19-24-08.553660.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-16T19-24-08.553660.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-16T19-24-08.553660.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-16T19-24-08.553660.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-16T19-24-08.553660.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-16T19-24-08.553660.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-16T19-24-08.553660.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-16T19-24-08.553660.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-16T19-24-08.553660.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-16T19-24-08.553660.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-16T19-24-08.553660.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-16T19-24-08.553660.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-16T19-24-08.553660.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-16T19-24-08.553660.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-16T19-24-08.553660.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-16T19-24-08.553660.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-16T19-24-08.553660.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-16T19-24-08.553660.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-16T19-24-08.553660.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-16T19-24-08.553660.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-16T19-24-08.553660.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-16T19-24-08.553660.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-16T19-24-08.553660.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-16T19-24-08.553660.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-16T19-24-08.553660.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-16T19-24-08.553660.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-12-16T19-24-08.553660.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-16T19-24-08.553660.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-16T19-24-08.553660.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-16T19-24-08.553660.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-12-16T19-24-08.553660.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-12-16T19-24-08.553660.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-16T19-24-08.553660.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-16T19-24-08.553660.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-16T19-24-08.553660.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-16T19-24-08.553660.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-16T19-24-08.553660.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-16T19-24-08.553660.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-16T19-24-08.553660.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-16T19-24-08.553660.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-16T19-24-08.553660.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-16T19-24-08.553660.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-16T19-24-08.553660.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-16T19-24-08.553660.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-16T19-24-08.553660.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-12-16T19-24-08.553660.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-16T19-24-08.553660.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-12-16T19-24-08.553660.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-16T19-24-08.553660.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-16T19-24-08.553660.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-16T19-24-08.553660.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-16T19-24-08.553660.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-16T19-24-08.553660.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-16T19-24-08.553660.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-16T19-24-08.553660.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-16T19-24-08.553660.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-16T19-24-08.553660.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-16T19-24-08.553660.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-16T19-24-08.553660.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-16T19-24-08.553660.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-16T19-24-08.553660.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-16T19-24-08.553660.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-16T19-24-08.553660.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-16T19-24-08.553660.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-16T19-24-08.553660.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-16T19-24-08.553660.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-16T19-24-08.553660.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-16T19-24-08.553660.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-16T19-24-08.553660.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-16T19-24-08.553660.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-16T19-24-08.553660.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-16T19-24-08.553660.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-16T19-24-08.553660.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-16T19-24-08.553660.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-16T19-24-08.553660.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-16T19-24-08.553660.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-16T19-24-08.553660.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-16T19-24-08.553660.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-16T19-24-08.553660.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-16T19-24-08.553660.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-16T19-24-08.553660.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-16T19-24-08.553660.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-16T19-24-08.553660.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-12-16T19-24-08.553660.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-16T19-24-08.553660.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-16T19-24-08.553660.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-16T19-24-08.553660.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-12-16T19-24-08.553660.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-12-16T19-24-08.553660.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-16T19-24-08.553660.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-16T19-24-08.553660.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-16T19-24-08.553660.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-16T19-24-08.553660.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-16T19-24-08.553660.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-16T19-24-08.553660.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-16T19-24-08.553660.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-16T19-24-08.553660.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-16T19-24-08.553660.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-16T19-24-08.553660.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-16T19-24-08.553660.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-16T19-24-08.553660.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-16T19-24-08.553660.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-12-16T19-24-08.553660.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-16T19-24-08.553660.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-12-16T19-24-08.553660.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-16T19-24-08.553660.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_12_16T19_24_08.553660
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-16T19-24-08.553660.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-16T19-24-08.553660.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_12_16T19_24_08.553660
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-16T19-24-08.553660.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-16T19-24-08.553660.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_12_16T19_24_08.553660
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-16T19-24-08.553660.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-16T19-24-08.553660.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_12_16T19_24_08.553660
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-16T19-24-08.553660.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-16T19-24-08.553660.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_12_16T19_24_08.553660
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-16T19-24-08.553660.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-16T19-24-08.553660.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_12_16T19_24_08.553660
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-16T19-24-08.553660.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-16T19-24-08.553660.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_12_16T19_24_08.553660
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-16T19-24-08.553660.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-16T19-24-08.553660.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_12_16T19_24_08.553660
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-16T19-24-08.553660.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-16T19-24-08.553660.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_12_16T19_24_08.553660
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-16T19-24-08.553660.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-16T19-24-08.553660.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_12_16T19_24_08.553660
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-16T19-24-08.553660.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-16T19-24-08.553660.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_12_16T19_24_08.553660
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-16T19-24-08.553660.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-16T19-24-08.553660.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_12_16T19_24_08.553660
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-16T19-24-08.553660.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-16T19-24-08.553660.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_12_16T19_24_08.553660
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-16T19-24-08.553660.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-16T19-24-08.553660.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_12_16T19_24_08.553660
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-16T19-24-08.553660.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-16T19-24-08.553660.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_12_16T19_24_08.553660
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-16T19-24-08.553660.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-16T19-24-08.553660.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_12_16T19_24_08.553660
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-16T19-24-08.553660.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-16T19-24-08.553660.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_12_16T19_24_08.553660
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-16T19-24-08.553660.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-16T19-24-08.553660.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_12_16T19_24_08.553660
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-16T19-24-08.553660.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-16T19-24-08.553660.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_12_16T19_24_08.553660
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-16T19-24-08.553660.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-16T19-24-08.553660.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_12_16T19_24_08.553660
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-16T19-24-08.553660.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-16T19-24-08.553660.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_12_16T19_24_08.553660
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-16T19-24-08.553660.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-16T19-24-08.553660.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_12_16T19_24_08.553660
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-16T19-24-08.553660.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-16T19-24-08.553660.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_12_16T19_24_08.553660
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-16T19-24-08.553660.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-16T19-24-08.553660.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_12_16T19_24_08.553660
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-16T19-24-08.553660.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-16T19-24-08.553660.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_12_16T19_24_08.553660
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-16T19-24-08.553660.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-16T19-24-08.553660.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_12_16T19_24_08.553660
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-16T19-24-08.553660.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-16T19-24-08.553660.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_12_16T19_24_08.553660
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-16T19-24-08.553660.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-16T19-24-08.553660.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_12_16T19_24_08.553660
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-16T19-24-08.553660.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-16T19-24-08.553660.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_12_16T19_24_08.553660
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-16T19-24-08.553660.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-16T19-24-08.553660.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_12_16T19_24_08.553660
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-16T19-24-08.553660.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-16T19-24-08.553660.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_12_16T19_24_08.553660
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-16T19-24-08.553660.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-16T19-24-08.553660.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_12_16T19_24_08.553660
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-16T19-24-08.553660.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-16T19-24-08.553660.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_12_16T19_24_08.553660
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-16T19-24-08.553660.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-16T19-24-08.553660.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_12_16T19_24_08.553660
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-16T19-24-08.553660.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-16T19-24-08.553660.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_12_16T19_24_08.553660
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-12-16T19-24-08.553660.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-12-16T19-24-08.553660.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_12_16T19_24_08.553660
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-16T19-24-08.553660.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-16T19-24-08.553660.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_12_16T19_24_08.553660
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-16T19-24-08.553660.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-16T19-24-08.553660.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_12_16T19_24_08.553660
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-16T19-24-08.553660.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-16T19-24-08.553660.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_12_16T19_24_08.553660
path:
- '**/details_harness|hendrycksTest-management|5_2023-12-16T19-24-08.553660.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-12-16T19-24-08.553660.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_12_16T19_24_08.553660
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-12-16T19-24-08.553660.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-12-16T19-24-08.553660.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_12_16T19_24_08.553660
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-16T19-24-08.553660.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-16T19-24-08.553660.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_12_16T19_24_08.553660
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-16T19-24-08.553660.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-16T19-24-08.553660.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_12_16T19_24_08.553660
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-16T19-24-08.553660.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-16T19-24-08.553660.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_12_16T19_24_08.553660
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-16T19-24-08.553660.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-16T19-24-08.553660.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_12_16T19_24_08.553660
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-16T19-24-08.553660.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-16T19-24-08.553660.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_12_16T19_24_08.553660
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-16T19-24-08.553660.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-16T19-24-08.553660.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_12_16T19_24_08.553660
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-16T19-24-08.553660.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-16T19-24-08.553660.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_12_16T19_24_08.553660
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-16T19-24-08.553660.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-16T19-24-08.553660.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_12_16T19_24_08.553660
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-16T19-24-08.553660.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-16T19-24-08.553660.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_12_16T19_24_08.553660
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-16T19-24-08.553660.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-16T19-24-08.553660.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_12_16T19_24_08.553660
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-16T19-24-08.553660.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-16T19-24-08.553660.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_12_16T19_24_08.553660
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-16T19-24-08.553660.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-16T19-24-08.553660.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_12_16T19_24_08.553660
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-16T19-24-08.553660.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-16T19-24-08.553660.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_12_16T19_24_08.553660
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-12-16T19-24-08.553660.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-12-16T19-24-08.553660.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_12_16T19_24_08.553660
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-16T19-24-08.553660.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-16T19-24-08.553660.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_12_16T19_24_08.553660
path:
- '**/details_harness|hendrycksTest-virology|5_2023-12-16T19-24-08.553660.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-12-16T19-24-08.553660.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_12_16T19_24_08.553660
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-16T19-24-08.553660.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-16T19-24-08.553660.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_12_16T19_24_08.553660
path:
- '**/details_harness|truthfulqa:mc|0_2023-12-16T19-24-08.553660.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-12-16T19-24-08.553660.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_12_16T19_24_08.553660
path:
- '**/details_harness|winogrande|5_2023-12-16T19-24-08.553660.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-12-16T19-24-08.553660.parquet'
- config_name: results
data_files:
- split: 2023_12_16T19_24_08.553660
path:
- results_2023-12-16T19-24-08.553660.parquet
- split: latest
path:
- results_2023-12-16T19-24-08.553660.parquet
---
# Dataset Card for Evaluation run of jan-hq/trinity-v1
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [jan-hq/trinity-v1](https://huggingface.co/jan-hq/trinity-v1) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_jan-hq__trinity-v1",
"harness_winogrande_5",
split="train")
```
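The aggregated metrics live in the separate "results" configuration described above. The following is a minimal sketch for pulling them from the most recent run (both the "results" config and the "latest" split appear in the YAML header, so only the printed field layout is an assumption):

```python
from datasets import load_dataset

# Load the aggregated results of the latest evaluation run.
results = load_dataset(
    "open-llm-leaderboard/details_jan-hq__trinity-v1",
    "results",
    split="latest",
)
print(results[0])
```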
## Latest results
These are the [latest results from run 2023-12-16T19:24:08.553660](https://huggingface.co/datasets/open-llm-leaderboard/details_jan-hq__trinity-v1/blob/main/results_2023-12-16T19-24-08.553660.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6575877329335247,
"acc_stderr": 0.031985421208388404,
"acc_norm": 0.6571647268300141,
"acc_norm_stderr": 0.032648337921958155,
"mc1": 0.5507955936352509,
"mc1_stderr": 0.01741294198611529,
"mc2": 0.6931209356367747,
"mc2_stderr": 0.015031530031665238
},
"harness|arc:challenge|25": {
"acc": 0.6988054607508533,
"acc_stderr": 0.013406741767847632,
"acc_norm": 0.7226962457337884,
"acc_norm_stderr": 0.013082095839059376
},
"harness|hellaswag|10": {
"acc": 0.711113324039036,
"acc_stderr": 0.004523188431142894,
"acc_norm": 0.8835889265086636,
"acc_norm_stderr": 0.0032006176493464752
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.33,
"acc_stderr": 0.04725815626252605,
"acc_norm": 0.33,
"acc_norm_stderr": 0.04725815626252605
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6592592592592592,
"acc_stderr": 0.040943762699967926,
"acc_norm": 0.6592592592592592,
"acc_norm_stderr": 0.040943762699967926
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6907894736842105,
"acc_stderr": 0.037610708698674805,
"acc_norm": 0.6907894736842105,
"acc_norm_stderr": 0.037610708698674805
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.62,
"acc_stderr": 0.048783173121456316,
"acc_norm": 0.62,
"acc_norm_stderr": 0.048783173121456316
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.720754716981132,
"acc_stderr": 0.027611163402399715,
"acc_norm": 0.720754716981132,
"acc_norm_stderr": 0.027611163402399715
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7708333333333334,
"acc_stderr": 0.03514697467862388,
"acc_norm": 0.7708333333333334,
"acc_norm_stderr": 0.03514697467862388
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.47,
"acc_stderr": 0.050161355804659205,
"acc_norm": 0.47,
"acc_norm_stderr": 0.050161355804659205
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.56,
"acc_stderr": 0.049888765156985884,
"acc_norm": 0.56,
"acc_norm_stderr": 0.049888765156985884
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6994219653179191,
"acc_stderr": 0.03496101481191179,
"acc_norm": 0.6994219653179191,
"acc_norm_stderr": 0.03496101481191179
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.46078431372549017,
"acc_stderr": 0.04959859966384181,
"acc_norm": 0.46078431372549017,
"acc_norm_stderr": 0.04959859966384181
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.77,
"acc_stderr": 0.042295258468165065,
"acc_norm": 0.77,
"acc_norm_stderr": 0.042295258468165065
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5787234042553191,
"acc_stderr": 0.03227834510146268,
"acc_norm": 0.5787234042553191,
"acc_norm_stderr": 0.03227834510146268
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.5087719298245614,
"acc_stderr": 0.04702880432049615,
"acc_norm": 0.5087719298245614,
"acc_norm_stderr": 0.04702880432049615
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5655172413793104,
"acc_stderr": 0.04130740879555498,
"acc_norm": 0.5655172413793104,
"acc_norm_stderr": 0.04130740879555498
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.42328042328042326,
"acc_stderr": 0.02544636563440678,
"acc_norm": 0.42328042328042326,
"acc_norm_stderr": 0.02544636563440678
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.46825396825396826,
"acc_stderr": 0.04463112720677172,
"acc_norm": 0.46825396825396826,
"acc_norm_stderr": 0.04463112720677172
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.33,
"acc_stderr": 0.04725815626252604,
"acc_norm": 0.33,
"acc_norm_stderr": 0.04725815626252604
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7806451612903226,
"acc_stderr": 0.023540799358723295,
"acc_norm": 0.7806451612903226,
"acc_norm_stderr": 0.023540799358723295
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5320197044334976,
"acc_stderr": 0.03510766597959215,
"acc_norm": 0.5320197044334976,
"acc_norm_stderr": 0.03510766597959215
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.7,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.7,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7757575757575758,
"acc_stderr": 0.03256866661681102,
"acc_norm": 0.7757575757575758,
"acc_norm_stderr": 0.03256866661681102
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7878787878787878,
"acc_stderr": 0.029126522834586815,
"acc_norm": 0.7878787878787878,
"acc_norm_stderr": 0.029126522834586815
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.9015544041450777,
"acc_stderr": 0.02150024957603348,
"acc_norm": 0.9015544041450777,
"acc_norm_stderr": 0.02150024957603348
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6666666666666666,
"acc_stderr": 0.023901157979402538,
"acc_norm": 0.6666666666666666,
"acc_norm_stderr": 0.023901157979402538
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.34074074074074073,
"acc_stderr": 0.028897748741131154,
"acc_norm": 0.34074074074074073,
"acc_norm_stderr": 0.028897748741131154
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6722689075630253,
"acc_stderr": 0.03048991141767323,
"acc_norm": 0.6722689075630253,
"acc_norm_stderr": 0.03048991141767323
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.3509933774834437,
"acc_stderr": 0.03896981964257375,
"acc_norm": 0.3509933774834437,
"acc_norm_stderr": 0.03896981964257375
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8532110091743119,
"acc_stderr": 0.01517314184512625,
"acc_norm": 0.8532110091743119,
"acc_norm_stderr": 0.01517314184512625
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5416666666666666,
"acc_stderr": 0.03398110890294636,
"acc_norm": 0.5416666666666666,
"acc_norm_stderr": 0.03398110890294636
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8333333333333334,
"acc_stderr": 0.026156867523931045,
"acc_norm": 0.8333333333333334,
"acc_norm_stderr": 0.026156867523931045
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.8059071729957806,
"acc_stderr": 0.025744902532290916,
"acc_norm": 0.8059071729957806,
"acc_norm_stderr": 0.025744902532290916
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6905829596412556,
"acc_stderr": 0.031024411740572213,
"acc_norm": 0.6905829596412556,
"acc_norm_stderr": 0.031024411740572213
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.8244274809160306,
"acc_stderr": 0.033368203384760736,
"acc_norm": 0.8244274809160306,
"acc_norm_stderr": 0.033368203384760736
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.8016528925619835,
"acc_stderr": 0.03640118271990947,
"acc_norm": 0.8016528925619835,
"acc_norm_stderr": 0.03640118271990947
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7777777777777778,
"acc_stderr": 0.040191074725573483,
"acc_norm": 0.7777777777777778,
"acc_norm_stderr": 0.040191074725573483
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7791411042944786,
"acc_stderr": 0.03259177392742179,
"acc_norm": 0.7791411042944786,
"acc_norm_stderr": 0.03259177392742179
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.41964285714285715,
"acc_stderr": 0.046840993210771065,
"acc_norm": 0.41964285714285715,
"acc_norm_stderr": 0.046840993210771065
},
"harness|hendrycksTest-management|5": {
"acc": 0.7572815533980582,
"acc_stderr": 0.04245022486384495,
"acc_norm": 0.7572815533980582,
"acc_norm_stderr": 0.04245022486384495
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8803418803418803,
"acc_stderr": 0.021262719400406957,
"acc_norm": 0.8803418803418803,
"acc_norm_stderr": 0.021262719400406957
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.71,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.71,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8352490421455939,
"acc_stderr": 0.013265346261323797,
"acc_norm": 0.8352490421455939,
"acc_norm_stderr": 0.013265346261323797
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7398843930635838,
"acc_stderr": 0.023618678310069356,
"acc_norm": 0.7398843930635838,
"acc_norm_stderr": 0.023618678310069356
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.4759776536312849,
"acc_stderr": 0.016703190189300186,
"acc_norm": 0.4759776536312849,
"acc_norm_stderr": 0.016703190189300186
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7222222222222222,
"acc_stderr": 0.025646863097137897,
"acc_norm": 0.7222222222222222,
"acc_norm_stderr": 0.025646863097137897
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7170418006430869,
"acc_stderr": 0.02558306248998481,
"acc_norm": 0.7170418006430869,
"acc_norm_stderr": 0.02558306248998481
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7469135802469136,
"acc_stderr": 0.024191808600712995,
"acc_norm": 0.7469135802469136,
"acc_norm_stderr": 0.024191808600712995
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.48936170212765956,
"acc_stderr": 0.029820747191422473,
"acc_norm": 0.48936170212765956,
"acc_norm_stderr": 0.029820747191422473
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.47392438070404175,
"acc_stderr": 0.012752858346533131,
"acc_norm": 0.47392438070404175,
"acc_norm_stderr": 0.012752858346533131
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6875,
"acc_stderr": 0.02815637344037142,
"acc_norm": 0.6875,
"acc_norm_stderr": 0.02815637344037142
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6715686274509803,
"acc_stderr": 0.018999707383162673,
"acc_norm": 0.6715686274509803,
"acc_norm_stderr": 0.018999707383162673
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6909090909090909,
"acc_stderr": 0.044262946482000985,
"acc_norm": 0.6909090909090909,
"acc_norm_stderr": 0.044262946482000985
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7224489795918367,
"acc_stderr": 0.028666857790274648,
"acc_norm": 0.7224489795918367,
"acc_norm_stderr": 0.028666857790274648
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8407960199004975,
"acc_stderr": 0.025870646766169136,
"acc_norm": 0.8407960199004975,
"acc_norm_stderr": 0.025870646766169136
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.86,
"acc_stderr": 0.0348735088019777,
"acc_norm": 0.86,
"acc_norm_stderr": 0.0348735088019777
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5481927710843374,
"acc_stderr": 0.03874371556587953,
"acc_norm": 0.5481927710843374,
"acc_norm_stderr": 0.03874371556587953
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8245614035087719,
"acc_stderr": 0.029170885500727665,
"acc_norm": 0.8245614035087719,
"acc_norm_stderr": 0.029170885500727665
},
"harness|truthfulqa:mc|0": {
"mc1": 0.5507955936352509,
"mc1_stderr": 0.01741294198611529,
"mc2": 0.6931209356367747,
"mc2_stderr": 0.015031530031665238
},
"harness|winogrande|5": {
"acc": 0.8200473559589582,
"acc_stderr": 0.01079646868806868
},
"harness|gsm8k|5": {
"acc": 0.7164518574677786,
"acc_stderr": 0.012415070917508124
}
}
```
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
MexIvanov/CodeExercise-Python-27k-ru | ---
license: cc-by-nc-sa-4.0
language:
- ru
tags:
- Python
- code
---
A machine-translated (Russian) version of the codefuse-ai/CodeExercise-Python-27k dataset.
It consists of synthetically generated code together with code-related data and natural-language instructions.
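A minimal loading sketch with the `datasets` library (assuming the usual default `train` split; the field names are whatever the translated records carry):
```python
from datasets import load_dataset

# Load the machine-translated dataset from the Hugging Face Hub
ds = load_dataset("MexIvanov/CodeExercise-Python-27k-ru")

# Inspect the available splits, then the first record of the (assumed) "train" split
print(ds)
print(ds["train"][0])
```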
Released under the same license as the original dataset; it is provided as-is for research purposes, so use and read at your own risk. |
distilabel-internal-testing/airoboros-3.2-writing-oai-style-tiny | ---
dataset_info:
features:
- name: id
dtype: string
- name: messages
list:
- name: content
dtype: string
- name: role
dtype: string
splits:
- name: train
num_bytes: 56595.216657287565
num_examples: 10
download_size: 37556
dataset_size: 56595.216657287565
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
FlyingFishzzz/source_test | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
dataset_info:
features:
- name: image
dtype: image
- name: image_seg
dtype: image
- name: landmarks
dtype: string
- name: spiga
sequence:
sequence: float64
- name: spiga_seg
dtype: image
- name: image_caption
dtype: string
splits:
- name: train
num_bytes: 488488715.0
num_examples: 1588
download_size: 487390223
dataset_size: 488488715.0
---
# Dataset Card for "source_test"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
thrisha15/Content_Generation_dataset | ---
language:
- en
--- |
HydraLM/partitioned_v2_standardized_014 | ---
dataset_info:
features:
- name: message
dtype: string
- name: message_type
dtype: string
- name: message_id
dtype: int64
- name: conversation_id
dtype: int64
- name: dataset_id
dtype: string
- name: unique_conversation_id
dtype: string
splits:
- name: train
num_bytes: 64103762.64755713
num_examples: 125409
download_size: 18947842
dataset_size: 64103762.64755713
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "partitioned_v2_standardized_014"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
AdapterOcean/Open_Platypus_standardized_cluster_1_alpaca | ---
dataset_info:
features:
- name: input
dtype: string
- name: output
dtype: string
splits:
- name: train
num_bytes: 7549475
num_examples: 7230
download_size: 0
dataset_size: 7549475
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "Open_Platypus_standardized_cluster_1_alpaca"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
lachieandmitch/hugging | ---
license: apache-2.0
---
|
Someman/hindi-summarization | ---
license: mit
task_categories:
- summarization
language: hi
original_source: >-
https://www.kaggle.com/datasets/disisbig/hindi-text-short-and-large-summarization-corpus
dataset_info:
features:
- name: headline
dtype: string
- name: summary
dtype: string
- name: article
dtype: string
splits:
- name: train
num_bytes: 410722079.5542422
num_examples: 55226
- name: test
num_bytes: 102684238.44575782
num_examples: 13807
- name: valid
num_bytes: 128376473
num_examples: 17265
download_size: 150571314
dataset_size: 641782791
pretty_name: hindi summarization
size_categories:
- 10K<n<100K
---
# Dataset Card for "hindi-summarization"
## Dataset Description
- Homepage: https://www.kaggle.com/datasets/disisbig/hindi-text-short-and-large-summarization-corpus?select=test.csv
### Dataset Summary
The Hindi Text Short and Large Summarization Corpus is a collection of ~180k articles, with their headlines and summaries, collected from Hindi news websites.
It is a first-of-its-kind dataset in Hindi that can be used to benchmark Hindi text-summarization models. It does not contain the articles of the Hindi Text Short Summarization Corpus, which is being released in parallel with this dataset.
The dataset retains the original punctuation, numbers, etc. in the articles.
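A minimal loading sketch with the `datasets` library; the split and column names below come directly from this card's metadata:
```python
from datasets import load_dataset

# Load the train/test/valid splits declared in this card's metadata
ds = load_dataset("Someman/hindi-summarization")

# Each record has "headline", "summary", and "article" columns
example = ds["train"][0]
print(example["headline"])
print(example["summary"])
print(example["article"][:200])
```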
### Languages
The language is Hindi.
### Licensing Information
MIT
### Citation Information
https://www.kaggle.com/datasets/disisbig/hindi-text-short-and-large-summarization-corpus?select=test.csv
### Contributions
|
4eJIoBek/Old-audios-11k | ---
license: unknown
---
Unsorted audio files in MOD, WAV, and other legacy audio formats. |
PurCL/marinda-type-inference-debuginfo-only-O1-shuffle | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
dataset_info:
features:
- name: metadata
struct:
- name: binary_name
dtype: string
- name: function_addr
dtype: int64
- name: function_name
dtype: string
- name: project_name
dtype: string
- name: code_w_type
dtype: string
- name: code
dtype: string
- name: data_dep
dtype: string
splits:
- name: train
num_bytes: 201535867.70075417
num_examples: 37113
- name: test
num_bytes: 22394684.299245823
num_examples: 4124
download_size: 52386440
dataset_size: 223930552.0
---
# Dataset Card for "marinda-type-inference-debuginfo-only-O1-shuffle"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
open-llm-leaderboard/details_ajibawa-2023__Code-Mistral-7B | ---
pretty_name: Evaluation run of ajibawa-2023/Code-Mistral-7B
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [ajibawa-2023/Code-Mistral-7B](https://huggingface.co/ajibawa-2023/Code-Mistral-7B)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of\
\ the evaluated tasks.\n\nThe dataset has been created from 2 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_ajibawa-2023__Code-Mistral-7B\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-03-25T07:53:45.933606](https://huggingface.co/datasets/open-llm-leaderboard/details_ajibawa-2023__Code-Mistral-7B/blob/main/results_2024-03-25T07-53-45.933606.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6527492220016492,\n\
\ \"acc_stderr\": 0.031870603059274874,\n \"acc_norm\": 0.6533709217123561,\n\
\ \"acc_norm_stderr\": 0.0325259522142822,\n \"mc1\": 0.3684210526315789,\n\
\ \"mc1_stderr\": 0.016886551261046046,\n \"mc2\": 0.5463721037747236,\n\
\ \"mc2_stderr\": 0.015046435516843176\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.6083617747440273,\n \"acc_stderr\": 0.014264122124938213,\n\
\ \"acc_norm\": 0.6459044368600683,\n \"acc_norm_stderr\": 0.01397545412275656\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6560446126269668,\n\
\ \"acc_stderr\": 0.0047405557821421735,\n \"acc_norm\": 0.8529177454690301,\n\
\ \"acc_norm_stderr\": 0.0035346403488166773\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.33,\n \"acc_stderr\": 0.047258156262526045,\n \
\ \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.047258156262526045\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6666666666666666,\n\
\ \"acc_stderr\": 0.04072314811876837,\n \"acc_norm\": 0.6666666666666666,\n\
\ \"acc_norm_stderr\": 0.04072314811876837\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.7236842105263158,\n \"acc_stderr\": 0.03639057569952929,\n\
\ \"acc_norm\": 0.7236842105263158,\n \"acc_norm_stderr\": 0.03639057569952929\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.63,\n\
\ \"acc_stderr\": 0.04852365870939099,\n \"acc_norm\": 0.63,\n \
\ \"acc_norm_stderr\": 0.04852365870939099\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.7245283018867924,\n \"acc_stderr\": 0.02749566368372406,\n\
\ \"acc_norm\": 0.7245283018867924,\n \"acc_norm_stderr\": 0.02749566368372406\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7708333333333334,\n\
\ \"acc_stderr\": 0.03514697467862388,\n \"acc_norm\": 0.7708333333333334,\n\
\ \"acc_norm_stderr\": 0.03514697467862388\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.52,\n \"acc_stderr\": 0.050211673156867795,\n \
\ \"acc_norm\": 0.52,\n \"acc_norm_stderr\": 0.050211673156867795\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"\
acc\": 0.51,\n \"acc_stderr\": 0.05024183937956912,\n \"acc_norm\"\
: 0.51,\n \"acc_norm_stderr\": 0.05024183937956912\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.38,\n \"acc_stderr\": 0.048783173121456316,\n \
\ \"acc_norm\": 0.38,\n \"acc_norm_stderr\": 0.048783173121456316\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6878612716763006,\n\
\ \"acc_stderr\": 0.03533133389323657,\n \"acc_norm\": 0.6878612716763006,\n\
\ \"acc_norm_stderr\": 0.03533133389323657\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.39215686274509803,\n \"acc_stderr\": 0.04858083574266345,\n\
\ \"acc_norm\": 0.39215686274509803,\n \"acc_norm_stderr\": 0.04858083574266345\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.77,\n \"acc_stderr\": 0.04229525846816506,\n \"acc_norm\": 0.77,\n\
\ \"acc_norm_stderr\": 0.04229525846816506\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.5787234042553191,\n \"acc_stderr\": 0.03227834510146268,\n\
\ \"acc_norm\": 0.5787234042553191,\n \"acc_norm_stderr\": 0.03227834510146268\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.49122807017543857,\n\
\ \"acc_stderr\": 0.04702880432049615,\n \"acc_norm\": 0.49122807017543857,\n\
\ \"acc_norm_stderr\": 0.04702880432049615\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5310344827586206,\n \"acc_stderr\": 0.04158632762097828,\n\
\ \"acc_norm\": 0.5310344827586206,\n \"acc_norm_stderr\": 0.04158632762097828\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.4074074074074074,\n \"acc_stderr\": 0.02530590624159063,\n \"\
acc_norm\": 0.4074074074074074,\n \"acc_norm_stderr\": 0.02530590624159063\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.5158730158730159,\n\
\ \"acc_stderr\": 0.044698818540726076,\n \"acc_norm\": 0.5158730158730159,\n\
\ \"acc_norm_stderr\": 0.044698818540726076\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.29,\n \"acc_stderr\": 0.045604802157206845,\n \
\ \"acc_norm\": 0.29,\n \"acc_norm_stderr\": 0.045604802157206845\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\"\
: 0.7806451612903226,\n \"acc_stderr\": 0.023540799358723295,\n \"\
acc_norm\": 0.7806451612903226,\n \"acc_norm_stderr\": 0.023540799358723295\n\
\ },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\"\
: 0.4975369458128079,\n \"acc_stderr\": 0.03517945038691063,\n \"\
acc_norm\": 0.4975369458128079,\n \"acc_norm_stderr\": 0.03517945038691063\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.67,\n \"acc_stderr\": 0.04725815626252609,\n \"acc_norm\"\
: 0.67,\n \"acc_norm_stderr\": 0.04725815626252609\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7696969696969697,\n \"acc_stderr\": 0.0328766675860349,\n\
\ \"acc_norm\": 0.7696969696969697,\n \"acc_norm_stderr\": 0.0328766675860349\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.8131313131313131,\n \"acc_stderr\": 0.027772533334218967,\n \"\
acc_norm\": 0.8131313131313131,\n \"acc_norm_stderr\": 0.027772533334218967\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.917098445595855,\n \"acc_stderr\": 0.01989934131572178,\n\
\ \"acc_norm\": 0.917098445595855,\n \"acc_norm_stderr\": 0.01989934131572178\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.6743589743589744,\n \"acc_stderr\": 0.02375966576741229,\n \
\ \"acc_norm\": 0.6743589743589744,\n \"acc_norm_stderr\": 0.02375966576741229\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.34814814814814815,\n \"acc_stderr\": 0.029045600290616255,\n \
\ \"acc_norm\": 0.34814814814814815,\n \"acc_norm_stderr\": 0.029045600290616255\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.680672268907563,\n \"acc_stderr\": 0.0302839955258844,\n \
\ \"acc_norm\": 0.680672268907563,\n \"acc_norm_stderr\": 0.0302839955258844\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.33112582781456956,\n \"acc_stderr\": 0.038425817186598696,\n \"\
acc_norm\": 0.33112582781456956,\n \"acc_norm_stderr\": 0.038425817186598696\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8495412844036697,\n \"acc_stderr\": 0.015328563932669237,\n \"\
acc_norm\": 0.8495412844036697,\n \"acc_norm_stderr\": 0.015328563932669237\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.5185185185185185,\n \"acc_stderr\": 0.034076320938540516,\n \"\
acc_norm\": 0.5185185185185185,\n \"acc_norm_stderr\": 0.034076320938540516\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.8480392156862745,\n \"acc_stderr\": 0.025195658428931796,\n \"\
acc_norm\": 0.8480392156862745,\n \"acc_norm_stderr\": 0.025195658428931796\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.8185654008438819,\n \"acc_stderr\": 0.02508596114457966,\n \
\ \"acc_norm\": 0.8185654008438819,\n \"acc_norm_stderr\": 0.02508596114457966\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.7219730941704036,\n\
\ \"acc_stderr\": 0.030069584874494043,\n \"acc_norm\": 0.7219730941704036,\n\
\ \"acc_norm_stderr\": 0.030069584874494043\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.8015267175572519,\n \"acc_stderr\": 0.03498149385462472,\n\
\ \"acc_norm\": 0.8015267175572519,\n \"acc_norm_stderr\": 0.03498149385462472\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.7933884297520661,\n \"acc_stderr\": 0.03695980128098824,\n \"\
acc_norm\": 0.7933884297520661,\n \"acc_norm_stderr\": 0.03695980128098824\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7962962962962963,\n\
\ \"acc_stderr\": 0.03893542518824847,\n \"acc_norm\": 0.7962962962962963,\n\
\ \"acc_norm_stderr\": 0.03893542518824847\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7361963190184049,\n \"acc_stderr\": 0.034624199316156234,\n\
\ \"acc_norm\": 0.7361963190184049,\n \"acc_norm_stderr\": 0.034624199316156234\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.4375,\n\
\ \"acc_stderr\": 0.04708567521880525,\n \"acc_norm\": 0.4375,\n \
\ \"acc_norm_stderr\": 0.04708567521880525\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7864077669902912,\n \"acc_stderr\": 0.040580420156460344,\n\
\ \"acc_norm\": 0.7864077669902912,\n \"acc_norm_stderr\": 0.040580420156460344\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.9017094017094017,\n\
\ \"acc_stderr\": 0.019503444900757567,\n \"acc_norm\": 0.9017094017094017,\n\
\ \"acc_norm_stderr\": 0.019503444900757567\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.7,\n \"acc_stderr\": 0.046056618647183814,\n \
\ \"acc_norm\": 0.7,\n \"acc_norm_stderr\": 0.046056618647183814\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8352490421455939,\n\
\ \"acc_stderr\": 0.013265346261323786,\n \"acc_norm\": 0.8352490421455939,\n\
\ \"acc_norm_stderr\": 0.013265346261323786\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.7427745664739884,\n \"acc_stderr\": 0.023532925431044287,\n\
\ \"acc_norm\": 0.7427745664739884,\n \"acc_norm_stderr\": 0.023532925431044287\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.3195530726256983,\n\
\ \"acc_stderr\": 0.01559552029414741,\n \"acc_norm\": 0.3195530726256983,\n\
\ \"acc_norm_stderr\": 0.01559552029414741\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.7189542483660131,\n \"acc_stderr\": 0.025738854797818733,\n\
\ \"acc_norm\": 0.7189542483660131,\n \"acc_norm_stderr\": 0.025738854797818733\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7138263665594855,\n\
\ \"acc_stderr\": 0.02567025924218893,\n \"acc_norm\": 0.7138263665594855,\n\
\ \"acc_norm_stderr\": 0.02567025924218893\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.75,\n \"acc_stderr\": 0.02409347123262133,\n \
\ \"acc_norm\": 0.75,\n \"acc_norm_stderr\": 0.02409347123262133\n \
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\"\
: 0.5,\n \"acc_stderr\": 0.029827499313594685,\n \"acc_norm\": 0.5,\n\
\ \"acc_norm_stderr\": 0.029827499313594685\n },\n \"harness|hendrycksTest-professional_law|5\"\
: {\n \"acc\": 0.47783572359843546,\n \"acc_stderr\": 0.012757683047716175,\n\
\ \"acc_norm\": 0.47783572359843546,\n \"acc_norm_stderr\": 0.012757683047716175\n\
\ },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\"\
: 0.6948529411764706,\n \"acc_stderr\": 0.027971541370170598,\n \"\
acc_norm\": 0.6948529411764706,\n \"acc_norm_stderr\": 0.027971541370170598\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.684640522875817,\n \"acc_stderr\": 0.01879808628488689,\n \
\ \"acc_norm\": 0.684640522875817,\n \"acc_norm_stderr\": 0.01879808628488689\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6727272727272727,\n\
\ \"acc_stderr\": 0.0449429086625209,\n \"acc_norm\": 0.6727272727272727,\n\
\ \"acc_norm_stderr\": 0.0449429086625209\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.726530612244898,\n \"acc_stderr\": 0.02853556033712844,\n\
\ \"acc_norm\": 0.726530612244898,\n \"acc_norm_stderr\": 0.02853556033712844\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8507462686567164,\n\
\ \"acc_stderr\": 0.02519692987482707,\n \"acc_norm\": 0.8507462686567164,\n\
\ \"acc_norm_stderr\": 0.02519692987482707\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.86,\n \"acc_stderr\": 0.03487350880197769,\n \
\ \"acc_norm\": 0.86,\n \"acc_norm_stderr\": 0.03487350880197769\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5481927710843374,\n\
\ \"acc_stderr\": 0.03874371556587953,\n \"acc_norm\": 0.5481927710843374,\n\
\ \"acc_norm_stderr\": 0.03874371556587953\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8421052631578947,\n \"acc_stderr\": 0.027966785859160893,\n\
\ \"acc_norm\": 0.8421052631578947,\n \"acc_norm_stderr\": 0.027966785859160893\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.3684210526315789,\n\
\ \"mc1_stderr\": 0.016886551261046046,\n \"mc2\": 0.5463721037747236,\n\
\ \"mc2_stderr\": 0.015046435516843176\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.8224151539068666,\n \"acc_stderr\": 0.010740676861359226\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.6808188021228203,\n \
\ \"acc_stderr\": 0.012840345676251648\n }\n}\n```"
repo_url: https://huggingface.co/ajibawa-2023/Code-Mistral-7B
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_03_25T05_45_49.471582
path:
- '**/details_harness|arc:challenge|25_2024-03-25T05-45-49.471582.parquet'
- split: 2024_03_25T07_53_45.933606
path:
- '**/details_harness|arc:challenge|25_2024-03-25T07-53-45.933606.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-03-25T07-53-45.933606.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_03_25T05_45_49.471582
path:
- '**/details_harness|gsm8k|5_2024-03-25T05-45-49.471582.parquet'
- split: 2024_03_25T07_53_45.933606
path:
- '**/details_harness|gsm8k|5_2024-03-25T07-53-45.933606.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-03-25T07-53-45.933606.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_03_25T05_45_49.471582
path:
- '**/details_harness|hellaswag|10_2024-03-25T05-45-49.471582.parquet'
- split: 2024_03_25T07_53_45.933606
path:
- '**/details_harness|hellaswag|10_2024-03-25T07-53-45.933606.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-03-25T07-53-45.933606.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_03_25T05_45_49.471582
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-25T05-45-49.471582.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-25T05-45-49.471582.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-25T05-45-49.471582.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-25T05-45-49.471582.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-25T05-45-49.471582.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-25T05-45-49.471582.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-25T05-45-49.471582.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-25T05-45-49.471582.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-25T05-45-49.471582.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-25T05-45-49.471582.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-25T05-45-49.471582.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-25T05-45-49.471582.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-25T05-45-49.471582.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-25T05-45-49.471582.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-25T05-45-49.471582.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-25T05-45-49.471582.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-25T05-45-49.471582.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-25T05-45-49.471582.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-25T05-45-49.471582.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-25T05-45-49.471582.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-25T05-45-49.471582.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-25T05-45-49.471582.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-25T05-45-49.471582.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-25T05-45-49.471582.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-25T05-45-49.471582.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-25T05-45-49.471582.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-25T05-45-49.471582.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-25T05-45-49.471582.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-25T05-45-49.471582.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-25T05-45-49.471582.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-25T05-45-49.471582.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-25T05-45-49.471582.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-25T05-45-49.471582.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-25T05-45-49.471582.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-25T05-45-49.471582.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-25T05-45-49.471582.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-25T05-45-49.471582.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-25T05-45-49.471582.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-25T05-45-49.471582.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-25T05-45-49.471582.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-25T05-45-49.471582.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-25T05-45-49.471582.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-25T05-45-49.471582.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-25T05-45-49.471582.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-25T05-45-49.471582.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-25T05-45-49.471582.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-25T05-45-49.471582.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-25T05-45-49.471582.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-25T05-45-49.471582.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-25T05-45-49.471582.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-25T05-45-49.471582.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-25T05-45-49.471582.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-25T05-45-49.471582.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-25T05-45-49.471582.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-25T05-45-49.471582.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-25T05-45-49.471582.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-25T05-45-49.471582.parquet'
- split: 2024_03_25T07_53_45.933606
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-25T07-53-45.933606.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-25T07-53-45.933606.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-25T07-53-45.933606.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-25T07-53-45.933606.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-25T07-53-45.933606.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-25T07-53-45.933606.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-25T07-53-45.933606.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-25T07-53-45.933606.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-25T07-53-45.933606.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-25T07-53-45.933606.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-25T07-53-45.933606.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-25T07-53-45.933606.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-25T07-53-45.933606.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-25T07-53-45.933606.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-25T07-53-45.933606.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-25T07-53-45.933606.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-25T07-53-45.933606.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-25T07-53-45.933606.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-25T07-53-45.933606.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-25T07-53-45.933606.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-25T07-53-45.933606.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-25T07-53-45.933606.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-25T07-53-45.933606.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-25T07-53-45.933606.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-25T07-53-45.933606.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-25T07-53-45.933606.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-25T07-53-45.933606.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-25T07-53-45.933606.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-25T07-53-45.933606.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-25T07-53-45.933606.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-25T07-53-45.933606.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-25T07-53-45.933606.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-25T07-53-45.933606.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-25T07-53-45.933606.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-25T07-53-45.933606.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-25T07-53-45.933606.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-25T07-53-45.933606.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-25T07-53-45.933606.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-25T07-53-45.933606.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-25T07-53-45.933606.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-25T07-53-45.933606.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-25T07-53-45.933606.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-25T07-53-45.933606.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-25T07-53-45.933606.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-25T07-53-45.933606.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-25T07-53-45.933606.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-25T07-53-45.933606.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-25T07-53-45.933606.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-25T07-53-45.933606.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-25T07-53-45.933606.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-25T07-53-45.933606.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-25T07-53-45.933606.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-25T07-53-45.933606.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-25T07-53-45.933606.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-25T07-53-45.933606.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-25T07-53-45.933606.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-25T07-53-45.933606.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-25T07-53-45.933606.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-25T07-53-45.933606.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-25T07-53-45.933606.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-25T07-53-45.933606.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-25T07-53-45.933606.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-25T07-53-45.933606.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-25T07-53-45.933606.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-25T07-53-45.933606.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-25T07-53-45.933606.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-25T07-53-45.933606.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-25T07-53-45.933606.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-25T07-53-45.933606.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-25T07-53-45.933606.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-25T07-53-45.933606.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-25T07-53-45.933606.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-25T07-53-45.933606.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-25T07-53-45.933606.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-25T07-53-45.933606.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-25T07-53-45.933606.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-25T07-53-45.933606.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-25T07-53-45.933606.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-25T07-53-45.933606.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-25T07-53-45.933606.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-25T07-53-45.933606.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-25T07-53-45.933606.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-25T07-53-45.933606.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-25T07-53-45.933606.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-25T07-53-45.933606.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-25T07-53-45.933606.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-25T07-53-45.933606.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-25T07-53-45.933606.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-25T07-53-45.933606.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-25T07-53-45.933606.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-25T07-53-45.933606.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-25T07-53-45.933606.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-25T07-53-45.933606.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-25T07-53-45.933606.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-25T07-53-45.933606.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-25T07-53-45.933606.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-25T07-53-45.933606.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-25T07-53-45.933606.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-25T07-53-45.933606.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-25T07-53-45.933606.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-25T07-53-45.933606.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-25T07-53-45.933606.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-25T07-53-45.933606.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-25T07-53-45.933606.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-25T07-53-45.933606.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-25T07-53-45.933606.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-25T07-53-45.933606.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-25T07-53-45.933606.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-25T07-53-45.933606.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-25T07-53-45.933606.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-25T07-53-45.933606.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-25T07-53-45.933606.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-25T07-53-45.933606.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-25T07-53-45.933606.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_03_25T05_45_49.471582
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-25T05-45-49.471582.parquet'
- split: 2024_03_25T07_53_45.933606
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-25T07-53-45.933606.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-25T07-53-45.933606.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_03_25T05_45_49.471582
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-25T05-45-49.471582.parquet'
- split: 2024_03_25T07_53_45.933606
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-25T07-53-45.933606.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-25T07-53-45.933606.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_03_25T05_45_49.471582
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-25T05-45-49.471582.parquet'
- split: 2024_03_25T07_53_45.933606
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-25T07-53-45.933606.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-25T07-53-45.933606.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_03_25T05_45_49.471582
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-25T05-45-49.471582.parquet'
- split: 2024_03_25T07_53_45.933606
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-25T07-53-45.933606.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-25T07-53-45.933606.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_03_25T05_45_49.471582
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-25T05-45-49.471582.parquet'
- split: 2024_03_25T07_53_45.933606
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-25T07-53-45.933606.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-25T07-53-45.933606.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_03_25T05_45_49.471582
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-25T05-45-49.471582.parquet'
- split: 2024_03_25T07_53_45.933606
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-25T07-53-45.933606.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-25T07-53-45.933606.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_03_25T05_45_49.471582
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-25T05-45-49.471582.parquet'
- split: 2024_03_25T07_53_45.933606
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-25T07-53-45.933606.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-25T07-53-45.933606.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_03_25T05_45_49.471582
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-25T05-45-49.471582.parquet'
- split: 2024_03_25T07_53_45.933606
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-25T07-53-45.933606.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-25T07-53-45.933606.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_03_25T05_45_49.471582
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-25T05-45-49.471582.parquet'
- split: 2024_03_25T07_53_45.933606
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-25T07-53-45.933606.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-25T07-53-45.933606.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_03_25T05_45_49.471582
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-25T05-45-49.471582.parquet'
- split: 2024_03_25T07_53_45.933606
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-25T07-53-45.933606.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-25T07-53-45.933606.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_03_25T05_45_49.471582
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-25T05-45-49.471582.parquet'
- split: 2024_03_25T07_53_45.933606
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-25T07-53-45.933606.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-25T07-53-45.933606.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_03_25T05_45_49.471582
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-25T05-45-49.471582.parquet'
- split: 2024_03_25T07_53_45.933606
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-25T07-53-45.933606.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-25T07-53-45.933606.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_03_25T05_45_49.471582
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-25T05-45-49.471582.parquet'
- split: 2024_03_25T07_53_45.933606
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-25T07-53-45.933606.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-25T07-53-45.933606.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_03_25T05_45_49.471582
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-25T05-45-49.471582.parquet'
- split: 2024_03_25T07_53_45.933606
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-25T07-53-45.933606.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-25T07-53-45.933606.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_03_25T05_45_49.471582
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-25T05-45-49.471582.parquet'
- split: 2024_03_25T07_53_45.933606
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-25T07-53-45.933606.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-25T07-53-45.933606.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_03_25T05_45_49.471582
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-25T05-45-49.471582.parquet'
- split: 2024_03_25T07_53_45.933606
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-25T07-53-45.933606.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-25T07-53-45.933606.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_03_25T05_45_49.471582
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-25T05-45-49.471582.parquet'
- split: 2024_03_25T07_53_45.933606
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-25T07-53-45.933606.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-25T07-53-45.933606.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_03_25T05_45_49.471582
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-25T05-45-49.471582.parquet'
- split: 2024_03_25T07_53_45.933606
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-25T07-53-45.933606.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-25T07-53-45.933606.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_03_25T05_45_49.471582
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-25T05-45-49.471582.parquet'
- split: 2024_03_25T07_53_45.933606
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-25T07-53-45.933606.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-25T07-53-45.933606.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_03_25T05_45_49.471582
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-25T05-45-49.471582.parquet'
- split: 2024_03_25T07_53_45.933606
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-25T07-53-45.933606.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-25T07-53-45.933606.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_03_25T05_45_49.471582
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-25T05-45-49.471582.parquet'
- split: 2024_03_25T07_53_45.933606
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-25T07-53-45.933606.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-25T07-53-45.933606.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_03_25T05_45_49.471582
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-25T05-45-49.471582.parquet'
- split: 2024_03_25T07_53_45.933606
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-25T07-53-45.933606.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-25T07-53-45.933606.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_03_25T05_45_49.471582
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-25T05-45-49.471582.parquet'
- split: 2024_03_25T07_53_45.933606
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-25T07-53-45.933606.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-25T07-53-45.933606.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_03_25T05_45_49.471582
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-25T05-45-49.471582.parquet'
- split: 2024_03_25T07_53_45.933606
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-25T07-53-45.933606.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-25T07-53-45.933606.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_03_25T05_45_49.471582
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-25T05-45-49.471582.parquet'
- split: 2024_03_25T07_53_45.933606
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-25T07-53-45.933606.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-25T07-53-45.933606.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_03_25T05_45_49.471582
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-25T05-45-49.471582.parquet'
- split: 2024_03_25T07_53_45.933606
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-25T07-53-45.933606.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-25T07-53-45.933606.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_03_25T05_45_49.471582
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-25T05-45-49.471582.parquet'
- split: 2024_03_25T07_53_45.933606
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-25T07-53-45.933606.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-25T07-53-45.933606.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_03_25T05_45_49.471582
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-25T05-45-49.471582.parquet'
- split: 2024_03_25T07_53_45.933606
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-25T07-53-45.933606.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-25T07-53-45.933606.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_03_25T05_45_49.471582
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-25T05-45-49.471582.parquet'
- split: 2024_03_25T07_53_45.933606
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-25T07-53-45.933606.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-25T07-53-45.933606.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_03_25T05_45_49.471582
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-25T05-45-49.471582.parquet'
- split: 2024_03_25T07_53_45.933606
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-25T07-53-45.933606.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-25T07-53-45.933606.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_03_25T05_45_49.471582
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-25T05-45-49.471582.parquet'
- split: 2024_03_25T07_53_45.933606
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-25T07-53-45.933606.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-25T07-53-45.933606.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_03_25T05_45_49.471582
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-25T05-45-49.471582.parquet'
- split: 2024_03_25T07_53_45.933606
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-25T07-53-45.933606.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-25T07-53-45.933606.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_03_25T05_45_49.471582
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-25T05-45-49.471582.parquet'
- split: 2024_03_25T07_53_45.933606
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-25T07-53-45.933606.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-25T07-53-45.933606.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_03_25T05_45_49.471582
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-25T05-45-49.471582.parquet'
- split: 2024_03_25T07_53_45.933606
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-25T07-53-45.933606.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-25T07-53-45.933606.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_03_25T05_45_49.471582
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-25T05-45-49.471582.parquet'
- split: 2024_03_25T07_53_45.933606
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-25T07-53-45.933606.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-25T07-53-45.933606.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_03_25T05_45_49.471582
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-25T05-45-49.471582.parquet'
- split: 2024_03_25T07_53_45.933606
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-25T07-53-45.933606.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-25T07-53-45.933606.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_03_25T05_45_49.471582
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-25T05-45-49.471582.parquet'
- split: 2024_03_25T07_53_45.933606
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-25T07-53-45.933606.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-25T07-53-45.933606.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_03_25T05_45_49.471582
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-25T05-45-49.471582.parquet'
- split: 2024_03_25T07_53_45.933606
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-25T07-53-45.933606.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-25T07-53-45.933606.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_03_25T05_45_49.471582
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-25T05-45-49.471582.parquet'
- split: 2024_03_25T07_53_45.933606
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-25T07-53-45.933606.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-25T07-53-45.933606.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_03_25T05_45_49.471582
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-25T05-45-49.471582.parquet'
- split: 2024_03_25T07_53_45.933606
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-25T07-53-45.933606.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-25T07-53-45.933606.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_03_25T05_45_49.471582
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-25T05-45-49.471582.parquet'
- split: 2024_03_25T07_53_45.933606
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-25T07-53-45.933606.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-25T07-53-45.933606.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_03_25T05_45_49.471582
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-25T05-45-49.471582.parquet'
- split: 2024_03_25T07_53_45.933606
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-25T07-53-45.933606.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-25T07-53-45.933606.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_03_25T05_45_49.471582
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-25T05-45-49.471582.parquet'
- split: 2024_03_25T07_53_45.933606
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-25T07-53-45.933606.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-25T07-53-45.933606.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_03_25T05_45_49.471582
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-25T05-45-49.471582.parquet'
- split: 2024_03_25T07_53_45.933606
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-25T07-53-45.933606.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-25T07-53-45.933606.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_03_25T05_45_49.471582
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-25T05-45-49.471582.parquet'
- split: 2024_03_25T07_53_45.933606
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-25T07-53-45.933606.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-25T07-53-45.933606.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_03_25T05_45_49.471582
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-25T05-45-49.471582.parquet'
- split: 2024_03_25T07_53_45.933606
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-25T07-53-45.933606.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-25T07-53-45.933606.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_03_25T05_45_49.471582
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-25T05-45-49.471582.parquet'
- split: 2024_03_25T07_53_45.933606
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-25T07-53-45.933606.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-25T07-53-45.933606.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_03_25T05_45_49.471582
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-25T05-45-49.471582.parquet'
- split: 2024_03_25T07_53_45.933606
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-25T07-53-45.933606.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-25T07-53-45.933606.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_03_25T05_45_49.471582
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-25T05-45-49.471582.parquet'
- split: 2024_03_25T07_53_45.933606
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-25T07-53-45.933606.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-25T07-53-45.933606.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_03_25T05_45_49.471582
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-25T05-45-49.471582.parquet'
- split: 2024_03_25T07_53_45.933606
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-25T07-53-45.933606.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-25T07-53-45.933606.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_03_25T05_45_49.471582
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-25T05-45-49.471582.parquet'
- split: 2024_03_25T07_53_45.933606
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-25T07-53-45.933606.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-25T07-53-45.933606.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_03_25T05_45_49.471582
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-25T05-45-49.471582.parquet'
- split: 2024_03_25T07_53_45.933606
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-25T07-53-45.933606.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-25T07-53-45.933606.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_03_25T05_45_49.471582
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-25T05-45-49.471582.parquet'
- split: 2024_03_25T07_53_45.933606
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-25T07-53-45.933606.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-25T07-53-45.933606.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_03_25T05_45_49.471582
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-25T05-45-49.471582.parquet'
- split: 2024_03_25T07_53_45.933606
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-25T07-53-45.933606.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-25T07-53-45.933606.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_03_25T05_45_49.471582
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-25T05-45-49.471582.parquet'
- split: 2024_03_25T07_53_45.933606
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-25T07-53-45.933606.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-25T07-53-45.933606.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_03_25T05_45_49.471582
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-25T05-45-49.471582.parquet'
- split: 2024_03_25T07_53_45.933606
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-25T07-53-45.933606.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-25T07-53-45.933606.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_03_25T05_45_49.471582
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-25T05-45-49.471582.parquet'
- split: 2024_03_25T07_53_45.933606
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-25T07-53-45.933606.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-25T07-53-45.933606.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_03_25T05_45_49.471582
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-25T05-45-49.471582.parquet'
- split: 2024_03_25T07_53_45.933606
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-25T07-53-45.933606.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-25T07-53-45.933606.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_03_25T05_45_49.471582
path:
- '**/details_harness|winogrande|5_2024-03-25T05-45-49.471582.parquet'
- split: 2024_03_25T07_53_45.933606
path:
- '**/details_harness|winogrande|5_2024-03-25T07-53-45.933606.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-03-25T07-53-45.933606.parquet'
- config_name: results
data_files:
- split: 2024_03_25T05_45_49.471582
path:
- results_2024-03-25T05-45-49.471582.parquet
- split: 2024_03_25T07_53_45.933606
path:
- results_2024-03-25T07-53-45.933606.parquet
- split: latest
path:
- results_2024-03-25T07-53-45.933606.parquet
---
# Dataset Card for Evaluation run of ajibawa-2023/Code-Mistral-7B
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [ajibawa-2023/Code-Mistral-7B](https://huggingface.co/ajibawa-2023/Code-Mistral-7B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_ajibawa-2023__Code-Mistral-7B",
"harness_winogrande_5",
split="train")
```
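The aggregated metrics live in the `results` configuration defined above; as a minimal sketch (using the split names from the metadata), they can be loaded the same way:
```python
from datasets import load_dataset

# load the aggregated results from the "latest" split of the "results" config
results = load_dataset("open-llm-leaderboard/details_ajibawa-2023__Code-Mistral-7B",
	"results",
	split="latest")
```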
## Latest results
These are the [latest results from run 2024-03-25T07:53:45.933606](https://huggingface.co/datasets/open-llm-leaderboard/details_ajibawa-2023__Code-Mistral-7B/blob/main/results_2024-03-25T07-53-45.933606.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6527492220016492,
"acc_stderr": 0.031870603059274874,
"acc_norm": 0.6533709217123561,
"acc_norm_stderr": 0.0325259522142822,
"mc1": 0.3684210526315789,
"mc1_stderr": 0.016886551261046046,
"mc2": 0.5463721037747236,
"mc2_stderr": 0.015046435516843176
},
"harness|arc:challenge|25": {
"acc": 0.6083617747440273,
"acc_stderr": 0.014264122124938213,
"acc_norm": 0.6459044368600683,
"acc_norm_stderr": 0.01397545412275656
},
"harness|hellaswag|10": {
"acc": 0.6560446126269668,
"acc_stderr": 0.0047405557821421735,
"acc_norm": 0.8529177454690301,
"acc_norm_stderr": 0.0035346403488166773
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.33,
"acc_stderr": 0.047258156262526045,
"acc_norm": 0.33,
"acc_norm_stderr": 0.047258156262526045
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6666666666666666,
"acc_stderr": 0.04072314811876837,
"acc_norm": 0.6666666666666666,
"acc_norm_stderr": 0.04072314811876837
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.7236842105263158,
"acc_stderr": 0.03639057569952929,
"acc_norm": 0.7236842105263158,
"acc_norm_stderr": 0.03639057569952929
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.63,
"acc_stderr": 0.04852365870939099,
"acc_norm": 0.63,
"acc_norm_stderr": 0.04852365870939099
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.7245283018867924,
"acc_stderr": 0.02749566368372406,
"acc_norm": 0.7245283018867924,
"acc_norm_stderr": 0.02749566368372406
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7708333333333334,
"acc_stderr": 0.03514697467862388,
"acc_norm": 0.7708333333333334,
"acc_norm_stderr": 0.03514697467862388
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.52,
"acc_stderr": 0.050211673156867795,
"acc_norm": 0.52,
"acc_norm_stderr": 0.050211673156867795
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.51,
"acc_stderr": 0.05024183937956912,
"acc_norm": 0.51,
"acc_norm_stderr": 0.05024183937956912
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.38,
"acc_stderr": 0.048783173121456316,
"acc_norm": 0.38,
"acc_norm_stderr": 0.048783173121456316
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6878612716763006,
"acc_stderr": 0.03533133389323657,
"acc_norm": 0.6878612716763006,
"acc_norm_stderr": 0.03533133389323657
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.39215686274509803,
"acc_stderr": 0.04858083574266345,
"acc_norm": 0.39215686274509803,
"acc_norm_stderr": 0.04858083574266345
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.77,
"acc_stderr": 0.04229525846816506,
"acc_norm": 0.77,
"acc_norm_stderr": 0.04229525846816506
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5787234042553191,
"acc_stderr": 0.03227834510146268,
"acc_norm": 0.5787234042553191,
"acc_norm_stderr": 0.03227834510146268
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.49122807017543857,
"acc_stderr": 0.04702880432049615,
"acc_norm": 0.49122807017543857,
"acc_norm_stderr": 0.04702880432049615
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5310344827586206,
"acc_stderr": 0.04158632762097828,
"acc_norm": 0.5310344827586206,
"acc_norm_stderr": 0.04158632762097828
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.4074074074074074,
"acc_stderr": 0.02530590624159063,
"acc_norm": 0.4074074074074074,
"acc_norm_stderr": 0.02530590624159063
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.5158730158730159,
"acc_stderr": 0.044698818540726076,
"acc_norm": 0.5158730158730159,
"acc_norm_stderr": 0.044698818540726076
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.29,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.29,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7806451612903226,
"acc_stderr": 0.023540799358723295,
"acc_norm": 0.7806451612903226,
"acc_norm_stderr": 0.023540799358723295
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.4975369458128079,
"acc_stderr": 0.03517945038691063,
"acc_norm": 0.4975369458128079,
"acc_norm_stderr": 0.03517945038691063
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.67,
"acc_stderr": 0.04725815626252609,
"acc_norm": 0.67,
"acc_norm_stderr": 0.04725815626252609
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7696969696969697,
"acc_stderr": 0.0328766675860349,
"acc_norm": 0.7696969696969697,
"acc_norm_stderr": 0.0328766675860349
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.8131313131313131,
"acc_stderr": 0.027772533334218967,
"acc_norm": 0.8131313131313131,
"acc_norm_stderr": 0.027772533334218967
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.917098445595855,
"acc_stderr": 0.01989934131572178,
"acc_norm": 0.917098445595855,
"acc_norm_stderr": 0.01989934131572178
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6743589743589744,
"acc_stderr": 0.02375966576741229,
"acc_norm": 0.6743589743589744,
"acc_norm_stderr": 0.02375966576741229
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.34814814814814815,
"acc_stderr": 0.029045600290616255,
"acc_norm": 0.34814814814814815,
"acc_norm_stderr": 0.029045600290616255
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.680672268907563,
"acc_stderr": 0.0302839955258844,
"acc_norm": 0.680672268907563,
"acc_norm_stderr": 0.0302839955258844
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.33112582781456956,
"acc_stderr": 0.038425817186598696,
"acc_norm": 0.33112582781456956,
"acc_norm_stderr": 0.038425817186598696
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8495412844036697,
"acc_stderr": 0.015328563932669237,
"acc_norm": 0.8495412844036697,
"acc_norm_stderr": 0.015328563932669237
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5185185185185185,
"acc_stderr": 0.034076320938540516,
"acc_norm": 0.5185185185185185,
"acc_norm_stderr": 0.034076320938540516
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8480392156862745,
"acc_stderr": 0.025195658428931796,
"acc_norm": 0.8480392156862745,
"acc_norm_stderr": 0.025195658428931796
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.8185654008438819,
"acc_stderr": 0.02508596114457966,
"acc_norm": 0.8185654008438819,
"acc_norm_stderr": 0.02508596114457966
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.7219730941704036,
"acc_stderr": 0.030069584874494043,
"acc_norm": 0.7219730941704036,
"acc_norm_stderr": 0.030069584874494043
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.8015267175572519,
"acc_stderr": 0.03498149385462472,
"acc_norm": 0.8015267175572519,
"acc_norm_stderr": 0.03498149385462472
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7933884297520661,
"acc_stderr": 0.03695980128098824,
"acc_norm": 0.7933884297520661,
"acc_norm_stderr": 0.03695980128098824
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7962962962962963,
"acc_stderr": 0.03893542518824847,
"acc_norm": 0.7962962962962963,
"acc_norm_stderr": 0.03893542518824847
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7361963190184049,
"acc_stderr": 0.034624199316156234,
"acc_norm": 0.7361963190184049,
"acc_norm_stderr": 0.034624199316156234
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.4375,
"acc_stderr": 0.04708567521880525,
"acc_norm": 0.4375,
"acc_norm_stderr": 0.04708567521880525
},
"harness|hendrycksTest-management|5": {
"acc": 0.7864077669902912,
"acc_stderr": 0.040580420156460344,
"acc_norm": 0.7864077669902912,
"acc_norm_stderr": 0.040580420156460344
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.9017094017094017,
"acc_stderr": 0.019503444900757567,
"acc_norm": 0.9017094017094017,
"acc_norm_stderr": 0.019503444900757567
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.7,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.7,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8352490421455939,
"acc_stderr": 0.013265346261323786,
"acc_norm": 0.8352490421455939,
"acc_norm_stderr": 0.013265346261323786
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7427745664739884,
"acc_stderr": 0.023532925431044287,
"acc_norm": 0.7427745664739884,
"acc_norm_stderr": 0.023532925431044287
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.3195530726256983,
"acc_stderr": 0.01559552029414741,
"acc_norm": 0.3195530726256983,
"acc_norm_stderr": 0.01559552029414741
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7189542483660131,
"acc_stderr": 0.025738854797818733,
"acc_norm": 0.7189542483660131,
"acc_norm_stderr": 0.025738854797818733
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7138263665594855,
"acc_stderr": 0.02567025924218893,
"acc_norm": 0.7138263665594855,
"acc_norm_stderr": 0.02567025924218893
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.75,
"acc_stderr": 0.02409347123262133,
"acc_norm": 0.75,
"acc_norm_stderr": 0.02409347123262133
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.5,
"acc_stderr": 0.029827499313594685,
"acc_norm": 0.5,
"acc_norm_stderr": 0.029827499313594685
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.47783572359843546,
"acc_stderr": 0.012757683047716175,
"acc_norm": 0.47783572359843546,
"acc_norm_stderr": 0.012757683047716175
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6948529411764706,
"acc_stderr": 0.027971541370170598,
"acc_norm": 0.6948529411764706,
"acc_norm_stderr": 0.027971541370170598
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.684640522875817,
"acc_stderr": 0.01879808628488689,
"acc_norm": 0.684640522875817,
"acc_norm_stderr": 0.01879808628488689
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6727272727272727,
"acc_stderr": 0.0449429086625209,
"acc_norm": 0.6727272727272727,
"acc_norm_stderr": 0.0449429086625209
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.726530612244898,
"acc_stderr": 0.02853556033712844,
"acc_norm": 0.726530612244898,
"acc_norm_stderr": 0.02853556033712844
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8507462686567164,
"acc_stderr": 0.02519692987482707,
"acc_norm": 0.8507462686567164,
"acc_norm_stderr": 0.02519692987482707
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.86,
"acc_stderr": 0.03487350880197769,
"acc_norm": 0.86,
"acc_norm_stderr": 0.03487350880197769
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5481927710843374,
"acc_stderr": 0.03874371556587953,
"acc_norm": 0.5481927710843374,
"acc_norm_stderr": 0.03874371556587953
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8421052631578947,
"acc_stderr": 0.027966785859160893,
"acc_norm": 0.8421052631578947,
"acc_norm_stderr": 0.027966785859160893
},
"harness|truthfulqa:mc|0": {
"mc1": 0.3684210526315789,
"mc1_stderr": 0.016886551261046046,
"mc2": 0.5463721037747236,
"mc2_stderr": 0.015046435516843176
},
"harness|winogrande|5": {
"acc": 0.8224151539068666,
"acc_stderr": 0.010740676861359226
},
"harness|gsm8k|5": {
"acc": 0.6808188021228203,
"acc_stderr": 0.012840345676251648
}
}
```
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
KETI-AIR/kor_ropes | ---
pretty_name: ROPES
language:
- ko
license:
- cc-by-4.0
size_categories:
- 10K<n<100K
task_categories:
- question-answering
task_ids:
- extractive-qa
dataset_info:
features:
- name: data_index_by_user
dtype: int32
- name: background
dtype: string
- name: situation
dtype: string
- name: question
dtype: string
- name: answers
sequence:
- name: text
dtype: string
splits:
- name: train
num_bytes: 13608462
num_examples: 10924
- name: validation
num_bytes: 1864822
num_examples: 1688
- name: test
num_bytes: 2158508
num_examples: 1710
download_size: 1465973
dataset_size: 17631792
---
# Dataset Card for ROPES
## Licensing Information
The data is distributed under the [CC BY 4.0](https://creativecommons.org/licenses/by/4.0/) license.
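As a minimal usage sketch (split names taken from the dataset metadata above), the data can be loaded with the `datasets` library:
```python
from datasets import load_dataset

# loads the train/validation/test splits declared in the metadata
ropes = load_dataset("KETI-AIR/kor_ropes")
print(ropes["train"][0]["question"])
```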
## Source Data Citation Information
```
@inproceedings{Lin2019ReasoningOP,
title={Reasoning Over Paragraph Effects in Situations},
author={Kevin Lin and Oyvind Tafjord and Peter Clark and Matt Gardner},
booktitle={MRQA@EMNLP},
year={2019}
}
``` |
aditijha/instruct_v1_1k_and_lima | ---
dataset_info:
features:
- name: prompt
dtype: string
- name: response
dtype: string
- name: source
dtype: string
splits:
- name: train
num_bytes: 3698244
num_examples: 2000
download_size: 2042056
dataset_size: 3698244
---
# Dataset Card for "instruct_v1_1k_and_lima"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
arunrajuamrutha3/martin_valen_dataset | ---
dataset_info:
features:
- name: image
dtype: image
- name: text
dtype: string
splits:
- name: train
num_bytes: 82739.0
num_examples: 10
download_size: 82646
dataset_size: 82739.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
gowitheflow/wiki-span | ---
dataset_info:
features:
- name: sentence1
dtype: string
- name: sentence2
dtype: string
splits:
- name: train
num_bytes: 14498836027
num_examples: 6458670
download_size: 8956015300
dataset_size: 14498836027
---
# Dataset Card for "wiki-span"
This dataset is constructed by sampling 25%-50% of each Wikipedia record twice to form positive pairs. It can be used to train unsupervised sentence representation models.
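As a minimal loading sketch (streaming is used here only because the archive is roughly 9 GB; it is not required):
```python
from datasets import load_dataset

# stream the train split to avoid downloading the full archive up front
ds = load_dataset("gowitheflow/wiki-span", split="train", streaming=True)
pair = next(iter(ds))
print(pair["sentence1"], pair["sentence2"])
``` |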
Seanxh/twitter_dataset_1713212801 | ---
dataset_info:
features:
- name: id
dtype: string
- name: tweet_content
dtype: string
- name: user_name
dtype: string
- name: user_id
dtype: string
- name: created_at
dtype: string
- name: url
dtype: string
- name: favourite_count
dtype: int64
- name: scraped_at
dtype: string
- name: image_urls
dtype: string
splits:
- name: train
num_bytes: 176063
num_examples: 412
download_size: 62997
dataset_size: 176063
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
nlpso/m2m3_fine_tuning_ocr_ptrn_cmbert_iob2 | ---
language:
- fr
multilinguality:
- monolingual
task_categories:
- token-classification
---
# m2m3_fine_tuning_ocr_ptrn_cmbert_iob2
## Introduction
This dataset was used to fine-tune [HueyNemud/das22-10-camembert_pretrained](https://huggingface.co/HueyNemud/das22-10-camembert_pretrained) for the **nested NER task** using the independent NER layers approach [M1].
It contains Paris trade directory entries from the 19th century.
## Dataset parameters
* Approach : M2 and M3
* Dataset type : noisy (Pero OCR)
* Tokenizer : [HueyNemud/das22-10-camembert_pretrained](https://huggingface.co/HueyNemud/das22-10-camembert_pretrained)
* Tagging format : IOB2
* Counts :
* Train : 6084
* Dev : 676
* Test : 1685
* Associated fine-tuned models :
* M2 : [nlpso/m2_joint_label_ocr_ptrn_cmbert_iob2](https://huggingface.co/nlpso/m2_joint_label_ocr_ptrn_cmbert_iob2)
* M3 : [nlpso/m3_hierarchical_ner_ocr_ptrn_cmbert_iob2](https://huggingface.co/nlpso/m3_hierarchical_ner_ocr_ptrn_cmbert_iob2)
## Entity types
Abbreviation|Entity group (level)|Description
-|-|-
O |1 & 2|Outside of a named entity
PER |1|Person or company name
ACT |1 & 2|Person or company professional activity
TITREH |2|Military or civil distinction
DESC |1|Entry full description
TITREP |2|Professional reward
SPAT |1|Address
LOC |2|Street name
CARDINAL |2|Street number
FT |2|Geographical feature
## How to use this dataset
```python
from datasets import load_dataset
train_dev_test = load_dataset("nlpso/m2m3_fine_tuning_ocr_ptrn_cmbert_iob2")
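# the result is a DatasetDict; the split names are assumed here to match the
# train/dev/test counts listed above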
``` |
Fsoft-AIC/the-vault-function | ---
language:
- code
- en
multilinguality:
- multiprogramming languages
task_categories:
- text-generation
license: mit
dataset_info:
features:
- name: identifier
dtype: string
- name: return_type
dtype: string
- name: repo
dtype: string
- name: path
dtype: string
- name: language
dtype: string
- name: code
dtype: string
- name: code_tokens
dtype: string
- name: original_docstring
dtype: string
- name: comment
dtype: string
- name: docstring_tokens
dtype: string
- name: docstring
dtype: string
- name: original_string
dtype: string
pretty_name: The Vault Function
viewer: true
---
## Table of Contents
- [Dataset Description](#dataset-description)
- [Dataset Summary](#dataset-summary)
- [Supported Tasks](#supported-tasks)
- [Languages](#languages)
- [Dataset Structure](#dataset-structure)
- [Data Instances](#data-instances)
- [Data Fields](#data-fields)
- [Data Splits](#data-splits)
- [Dataset Statistics](#dataset-statistics)
- [Usage](#usage)
- [Additional Information](#additional-information)
- [Licensing Information](#licensing-information)
- [Citation Information](#citation-information)
- [Contributions](#contributions)
## Dataset Description
- **Repository:** [FSoft-AI4Code/TheVault](https://github.com/FSoft-AI4Code/TheVault)
- **Paper:** [The Vault: A Comprehensive Multilingual Dataset for Advancing Code Understanding and Generation](https://arxiv.org/abs/2305.06156)
- **Contact:** support.ailab@fpt.com
- **Website:** https://www.fpt-aicenter.com/ai-residency/
<p align="center">
<img src="https://raw.githubusercontent.com/FSoft-AI4Code/TheVault/main/assets/the-vault-4-logo-png.png" width="300px" alt="logo">
</p>
<div align="center">
# The Vault: A Comprehensive Multilingual Dataset for Advancing Code Understanding and Generation
</div>
## Dataset Summary
The Vault dataset is a comprehensive, large-scale, multilingual parallel dataset that features high-quality code-text pairs derived from The Stack, the largest permissively-licensed source code dataset.
We provide The Vault, which contains code snippets from 10 popular programming languages: Java, JavaScript, Python, Ruby, Rust, Golang, C#, C++, C, and PHP. This dataset provides multiple code-snippet levels, metadata, and 11 docstring styles for enhanced usability and versatility.
## Supported Tasks
The Vault can be used for pretraining LLMs or for downstream code-text interaction tasks. A number of tasks related to code understanding and generation can be constructed using The Vault, such as *code summarization*, *text-to-code generation*, and *code search*.
## Languages
The natural language text (docstring) is in English.
10 programming languages are supported in The Vault: `Python`, `Java`, `JavaScript`, `PHP`, `C`, `C#`, `C++`, `Go`, `Ruby`, `Rust`
## Dataset Structure
### Data Instances
```
{
"hexsha": "5c47f0b4c173a8fd03e4e633d9b3dd8211e67ad0",
"repo": "neumanna94/beepboop",
"path": "js/scripts.js",
"license": [
"MIT"
],
"language": "JavaScript",
"identifier": "beepBoopSelector",
"return_type": "<not_specific>",
"original_string": "function beepBoopSelector(inputString, bbFunction){\n if(bbFunction==1){\n return beepBoop(inputString);\n } else if(bbFunction==2){\n return beepBoop2(inputString);\n } else if(bbFunction==3){\n return beepBoop3(inputString);\n } else {\n }\n}",
"original_docstring": "//Determines what beepBoop function to use",
"docstring": "Determines what beepBoop function to use",
"docstring_tokens": [
"Determines",
"what",
"beepBoop",
"function",
"to",
"use"
],
"code": "function beepBoopSelector(inputString, bbFunction){\n if(bbFunction==1){\n return beepBoop(inputString);\n } else if(bbFunction==2){\n return beepBoop2(inputString);\n } else if(bbFunction==3){\n return beepBoop3(inputString);\n } else {\n }\n}",
"code_tokens": [
"function",
"beepBoopSelector",
"(",
"inputString",
",",
"bbFunction",
")",
"{",
"if",
"(",
"bbFunction",
"==",
"1",
")",
"{",
"return",
"beepBoop",
"(",
"inputString",
")",
";",
"}",
"else",
"if",
"(",
"bbFunction",
"==",
"2",
")",
"{",
"return",
"beepBoop2",
"(",
"inputString",
")",
";",
"}",
"else",
"if",
"(",
"bbFunction",
"==",
"3",
")",
"{",
"return",
"beepBoop3",
"(",
"inputString",
")",
";",
"}",
"else",
"{",
"}",
"}"
],
"short_docstring": "Determines what beepBoop function to use",
"short_docstring_tokens": [
"Determines",
"what",
"beepBoop",
"function",
"to",
"use"
],
"comment": [],
"parameters": [
{
"param": "inputString",
"type": null
},
{
"param": "bbFunction",
"type": null
}
],
"docstring_params": {
"returns": [],
"raises": [],
"params": [
{
"identifier": "inputString",
"type": null,
"docstring": null,
"docstring_tokens": [],
"default": null,
"is_optional": null
},
{
"identifier": "bbFunction",
"type": null,
"docstring": null,
"docstring_tokens": [],
"default": null,
"is_optional": null
}
],
"outlier_params": [],
"others": []
}
}
```
### Data Fields
Data fields for function level:
- **hexsha** (string): the unique git hash of file
- **repo** (string): the owner/repo
- **path** (string): the full path to the original file
- **license** (list): licenses in the repo
- **language** (string): the programming language
- **identifier** (string): the function or method name
- **return_type** (string): the type returned by the function
- **original_string** (string): original version of function/class node
- **original_docstring** (string): the raw string before tokenization or parsing
- **code** (string): the part of the original that is code
- **code_tokens** (list): tokenized version of `code`
- **short_docstring** (string): short, brief summarization (first line of the docstring)
- **short_docstring_tokens** (list): tokenized version of `short_docstring`
- **docstring** (string): the top-level comment or docstring (docstring version without parameter docs, return, exception fields, etc.)
- **docstring_tokens** (list): tokenized version of `docstring`
- **comment** (list): list of comments (line) inside the function/class
- **parameters** (list): List of parameters and its type (type can be None)
- **docstring_params** (dict): Dictionary of the parsed information from docstring
See [here](https://github.com/FSoft-AI4Code/TheVault/blob/main/data/README.md) for more details and examples.
### Data Splits
In this repo, The Vault is divided into 5 subsets: three training versions split by size from the full training set, plus a validation set and a test set (approximately 20,000 samples each). The per-language statistics for each split are given in the following section.
Before splitting, the dataset is deduplicated. The three training versions are small (5%), medium (20%), and full (100%).
## Dataset Statistics
- Comparison to other benchmarks
| Dataset | #Language | #Code-text pair |
|:--------------------------|----------:|-----------------:|
| PyMT5 | 1 | ≈ 7,700,000 |
| CoDesc | 1 | 4,211,516 |
| CodeSearchNet | 6 | 2,326,976 |
| CodeSearchNet (CodeXGLUE) | 6 | 1,005,474 |
| Deepcom | 1 | 424,028 |
| CONCODE | 1 | 2,184,310 |
| Funcom | 1 | 2,149,121 |
| CodeT5 | 8 | 3,158,313 |
| **The Vault** | **10** | **34,098,775** |
- Statistics for split sets
| | train/small | train/medium | train/full | validation | test | total |
|:-----------|------------:|-------------:|-----------:|-----------:|-------:|--------------:|
|Python | 370,657 | 1,952,110 | 7,772,647 | 30,992 | 21,652 | 7,825,291 |
|Java | 351,213 | 1,612,366 | 6,629,193 | 22,677 | 15,552 | 6,667,422 |
|JavaScript | 82,931 | 404,729 | 1,640,416 | 22,044 | 21,108 | 1,683,568 |
|PHP | 236,638 | 1,155,476 | 4,656,371 | 21,375 | 19,010 | 4,696,756 |
|C | 105,978 | 381,207 | 1,639,319 | 27,525 | 19,122 | 1,685,966 |
|C# | 141,090 | 783,166 | 3,305,891 | 24,787 | 19,638 | 3,350,316 |
|C++ | 87,420 | 410,907 | 1,671,268 | 20,011 | 18,169 | 1,709,448 |
|Go | 267,535 | 1,319,547 | 5,109,020 | 19,102 | 25,314 | 5,153,436 |
|Ruby | 23,921 | 112,574 | 424,339 | 17,338 | 19,908 | 461,585 |
|Rust | 35,367 | 224,015 | 825,130 | 16,716 | 23,141 | 864,987 |
|TOTAL | 1,702,750 | 8,356,097 |33,673,594 |222,567 |202,614 |**34,098,775** |
## Usage
You can load The Vault dataset using the `datasets` library: ```pip install datasets```
```python
from datasets import load_dataset
# Load full function level dataset (34M samples)
dataset = load_dataset("Fsoft-AIC/the-vault-function")
# Load function level train/validation/test set
dataset = load_dataset("Fsoft-AIC/the-vault-function", split_set=["train"])
# Load "small" (or "medium", "full") version of function level training set
dataset = load_dataset("Fsoft-AIC/the-vault-function", split_set=["train/small"])
# specific language (e.g. Python)
dataset = load_dataset("Fsoft-AIC/the-vault-function", split_set=["train"], languages=['Python'])
# dataset streaming
data = load_dataset("Fsoft-AIC/the-vault-function", split_set= ["train"], streaming= True)
for sample in iter(data['train']):
print(sample)
```
A backup of the dataset can be downloaded from Azure blob storage. See [Download The Vault from Azure blob storage](https://github.com/FSoft-AI4Code/TheVault#download-via-link).
## Additional Information
### Licensing Information
MIT License
### Citation Information
```
@article{manh2023vault,
title={The Vault: A Comprehensive Multilingual Dataset for Advancing Code Understanding and Generation},
author={Manh, Dung Nguyen and Hai, Nam Le and Dau, Anh TV and Nguyen, Anh Minh and Nghiem, Khanh and Guo, Jin and Bui, Nghi DQ},
journal={arXiv preprint arXiv:2305.06156},
year={2023}
}
```
### Contributions
This dataset is developed by [FSOFT AI4Code team](https://github.com/FSoft-AI4Code). |
ahishamm/isic_binary_augmented | ---
dataset_info:
features:
- name: image
dtype: image
- name: label
dtype:
class_label:
names:
'0': benign
'1': malignant
splits:
- name: train
num_bytes: 97831304.566
num_examples: 17214
- name: test
num_bytes: 44333792.176
num_examples: 7804
download_size: 152665521
dataset_size: 142165096.74199998
---
# Dataset Card for "isic_binary_augmented"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
pkufool/libriheavy | ---
license: apache-2.0
---
# Libriheavy: a 50,000 hours ASR corpus with punctuation casing and context
Libriheavy is a labeled version of [Librilight](https://github.com/facebookresearch/libri-light); read our [paper](https://arxiv.org/abs/2309.08105) for more details.
See https://github.com/k2-fsa/libriheavy for more information.
## Citation
```
@misc{kang2023libriheavy,
title={Libriheavy: a 50,000 hours ASR corpus with punctuation casing and context},
author={Wei Kang and Xiaoyu Yang and Zengwei Yao and Fangjun Kuang and Yifan Yang and Liyong Guo and Long Lin and Daniel Povey},
year={2023},
eprint={2309.08105},
archivePrefix={arXiv},
primaryClass={eess.AS}
}
```
|
arthurneuron/USDC-WETH-Uniswap-V3-2021-to-2023 | ---
license: mit
---
|
qgiaohc/twitter_dataset_1713183929 | ---
dataset_info:
features:
- name: id
dtype: string
- name: tweet_content
dtype: string
- name: user_name
dtype: string
- name: user_id
dtype: string
- name: created_at
dtype: string
- name: url
dtype: string
- name: favourite_count
dtype: int64
- name: scraped_at
dtype: string
- name: image_urls
dtype: string
splits:
- name: train
num_bytes: 24781
num_examples: 55
download_size: 13092
dataset_size: 24781
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
dim/SlimOrcaRU | ---
dataset_info:
features:
- name: conversations
list:
- name: from
dtype: string
- name: value
dtype: string
- name: value_ru
dtype: string
- name: weight
dtype: float64
- name: key
dtype: int64
splits:
- name: train
num_bytes: 183635644
num_examples: 47536
download_size: 83293621
dataset_size: 183635644
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "SlimOrcaRU"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
BitTranslate/indonesiantest | ---
license: cc0-1.0
---
|
ricardo-lsantos/my_cool_dataset | ---
license: mit
language:
- pt
pretty_name: My Cool Dataset
--- |
Atipico1/nq_test_adversary | ---
dataset_info:
features:
- name: question
dtype: string
- name: answers
sequence: string
- name: ctxs
list:
- name: hasanswer
dtype: bool
- name: score
dtype: float64
- name: text
dtype: string
- name: title
dtype: string
- name: gpt_answer_sentence
dtype: string
- name: gpt_adv_sentence
dtype: string
- name: is_valid_sentence
dtype: bool
- name: gpt_adv_passage
dtype: string
- name: is_valid_passage
dtype: bool
splits:
- name: train
num_bytes: 14371495
num_examples: 3610
download_size: 8513525
dataset_size: 14371495
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
alvations/c4p0-v1-de-en | ---
dataset_info:
features:
- name: source
dtype: string
- name: target
dtype: string
- name: target_backto_source
dtype: string
- name: raw_target
list:
- name: generated_text
dtype: string
- name: raw_target_backto_source
list:
- name: generated_text
dtype: string
- name: prompt
dtype: string
- name: reverse_prompt
dtype: string
- name: source_langid
dtype: string
- name: target_langid
dtype: string
- name: target_backto_source_langid
dtype: string
- name: doc_id
dtype: int64
- name: sent_id
dtype: int64
- name: timestamp
dtype: string
- name: url
dtype: string
- name: doc_hash
dtype: string
- name: dataset
dtype: string
- name: source_lang
dtype: string
- name: target_lang
dtype: string
splits:
- name: train
num_bytes: 18411738
num_examples: 15146
download_size: 7768565
dataset_size: 18411738
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
LNTANOooo/tulu_v3 | ---
dataset_info:
features:
- name: conversation
list:
- name: content
dtype: string
- name: role
dtype: string
splits:
- name: science.scierc_ner
num_bytes: 634623.0
num_examples: 349
- name: sharegpt
num_bytes: 776319873.4338813
num_examples: 72413
- name: science.scifact_json
num_bytes: 2350372.0
num_examples: 919
- name: lima
num_bytes: 2815967.0
num_examples: 1012
- name: gpt4_alpaca
num_bytes: 16091564.0
num_examples: 19834
- name: science.evidence_inference
num_bytes: 6620099.0
num_examples: 1673
- name: oasst1
num_bytes: 11027612.499452954
num_examples: 7046
- name: science.scitldr_aic
num_bytes: 13392412.0
num_examples: 1957
- name: science.scierc_relation
num_bytes: 735295.0
num_examples: 349
- name: science.qasper_truncated_4000
num_bytes: 34952831.0
num_examples: 2204
- name: hard_coded
num_bytes: 44940.0
num_examples: 90
- name: code_alpaca
num_bytes: 7102581.0
num_examples: 19992
- name: cot
num_bytes: 56091350.817187
num_examples: 49709
- name: wizardlm
num_bytes: 69442958.16317087
num_examples: 29597
- name: open_orca
num_bytes: 52677835.20356853
num_examples: 29581
- name: flan_v2
num_bytes: 105654005.53780366
num_examples: 49108
download_size: 518036574
dataset_size: 1155954319.6550643
configs:
- config_name: default
data_files:
- split: science.scierc_ner
path: data/science.scierc_ner-*
- split: sharegpt
path: data/sharegpt-*
- split: science.scifact_json
path: data/science.scifact_json-*
- split: lima
path: data/lima-*
- split: gpt4_alpaca
path: data/gpt4_alpaca-*
- split: science.evidence_inference
path: data/science.evidence_inference-*
- split: oasst1
path: data/oasst1-*
- split: science.scitldr_aic
path: data/science.scitldr_aic-*
- split: science.scierc_relation
path: data/science.scierc_relation-*
- split: science.qasper_truncated_4000
path: data/science.qasper_truncated_4000-*
- split: hard_coded
path: data/hard_coded-*
- split: code_alpaca
path: data/code_alpaca-*
- split: cot
path: data/cot-*
- split: wizardlm
path: data/wizardlm-*
- split: open_orca
path: data/open_orca-*
- split: flan_v2
path: data/flan_v2-*
---
|
mozci/tinysketch | ---
license: cc-by-nc-sa-4.0
language:
- en
language_creators:
- machine-generated
multilinguality:
- monolingual
pretty_name: 'Sketch Scene Descriptions'
size_categories:
- n<10K
source_datasets:
- FS-COCO
tags: []
task_categories:
- text-to-image
task_ids: []
---
# Dataset Card for Sketch Scene Descriptions
_Dataset used to train [Sketch Scene text to image model]()_
We advance sketch research to scenes with FS-COCO, the first dataset of freehand scene sketches. With practical applications in mind, we collect sketches that convey scene content well but can be drawn within a few minutes by a person with any level of sketching skill. Our dataset comprises around 10,000 freehand scene vector sketches with per-point space-time information, produced by 100 non-expert individuals and offering both object- and scene-level abstraction. Each sketch is augmented with its text description.
For each row, the dataset contains `image` and `text` keys. `image` is a varying-size PIL JPEG, and `text` is the accompanying text caption. Only a train split is provided.
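A minimal sketch of loading and inspecting a row (repository id and keys as described above):
```python
from datasets import load_dataset

ds = load_dataset("mozci/tinysketch", split="train")
sample = ds[0]
sketch = sample["image"]   # PIL image of the sketch
caption = sample["text"]   # accompanying text caption
print(caption)
```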
## Citation
If you use this dataset, please cite it as:
```
@inproceedings{fscoco,
title={FS-COCO: Towards Understanding of Freehand Sketches of Common Objects in Context.},
author={Chowdhury, Pinaki Nath and Sain, Aneeshan and Bhunia, Ayan Kumar and Xiang, Tao and Gryaditskaya, Yulia and Song, Yi-Zhe},
booktitle={ECCV},
year={2022}
}
``` |
mugithi/ubuntu_question_answer_jsonl | ---
dataset_info:
features:
- name: question
dtype: string
- name: answer
dtype: string
splits:
- name: train
num_bytes: 2073677
num_examples: 12100
- name: test
num_bytes: 882250
num_examples: 5186
download_size: 0
dataset_size: 2955927
---
# Dataset Card for "ubuntu_question_answer_jsonl"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
multi_re_qa | ---
annotations_creators:
- expert-generated
- found
language_creators:
- expert-generated
- found
language:
- en
license:
- unknown
multilinguality:
- monolingual
size_categories:
- 100K<n<1M
- 10K<n<100K
- 1K<n<10K
- 1M<n<10M
source_datasets:
- extended|other-BioASQ
- extended|other-DuoRC
- extended|other-HotpotQA
- extended|other-Natural-Questions
- extended|other-Relation-Extraction
- extended|other-SQuAD
- extended|other-SearchQA
- extended|other-TextbookQA
- extended|other-TriviaQA
task_categories:
- question-answering
task_ids:
- extractive-qa
- open-domain-qa
paperswithcode_id: multireqa
pretty_name: MultiReQA
dataset_info:
- config_name: SearchQA
features:
- name: candidate_id
dtype: string
- name: response_start
dtype: int32
- name: response_end
dtype: int32
splits:
- name: train
num_bytes: 183902877
num_examples: 3163801
- name: validation
num_bytes: 26439174
num_examples: 454836
download_size: 36991959
dataset_size: 210342051
- config_name: TriviaQA
features:
- name: candidate_id
dtype: string
- name: response_start
dtype: int32
- name: response_end
dtype: int32
splits:
- name: train
num_bytes: 107326326
num_examples: 1893674
- name: validation
num_bytes: 13508062
num_examples: 238339
download_size: 21750402
dataset_size: 120834388
- config_name: HotpotQA
features:
- name: candidate_id
dtype: string
- name: response_start
dtype: int32
- name: response_end
dtype: int32
splits:
- name: train
num_bytes: 29516866
num_examples: 508879
- name: validation
num_bytes: 3027229
num_examples: 52191
download_size: 6343389
dataset_size: 32544095
- config_name: SQuAD
features:
- name: candidate_id
dtype: string
- name: response_start
dtype: int32
- name: response_end
dtype: int32
splits:
- name: train
num_bytes: 16828974
num_examples: 95659
- name: validation
num_bytes: 2012997
num_examples: 10642
download_size: 3003646
dataset_size: 18841971
- config_name: NaturalQuestions
features:
- name: candidate_id
dtype: string
- name: response_start
dtype: int32
- name: response_end
dtype: int32
splits:
- name: train
num_bytes: 28732767
num_examples: 448355
- name: validation
num_bytes: 1418124
num_examples: 22118
download_size: 6124487
dataset_size: 30150891
- config_name: BioASQ
features:
- name: candidate_id
dtype: string
- name: response_start
dtype: int32
- name: response_end
dtype: int32
splits:
- name: test
num_bytes: 766190
num_examples: 14158
download_size: 156649
dataset_size: 766190
- config_name: RelationExtraction
features:
- name: candidate_id
dtype: string
- name: response_start
dtype: int32
- name: response_end
dtype: int32
splits:
- name: test
num_bytes: 217870
num_examples: 3301
download_size: 73019
dataset_size: 217870
- config_name: TextbookQA
features:
- name: candidate_id
dtype: string
- name: response_start
dtype: int32
- name: response_end
dtype: int32
splits:
- name: test
num_bytes: 4182675
num_examples: 71147
download_size: 704602
dataset_size: 4182675
- config_name: DuoRC
features:
- name: candidate_id
dtype: string
- name: response_start
dtype: int32
- name: response_end
dtype: int32
splits:
- name: test
num_bytes: 1483518
num_examples: 5525
download_size: 97625
dataset_size: 1483518
config_names:
- BioASQ
- DuoRC
- HotpotQA
- NaturalQuestions
- RelationExtraction
- SQuAD
- SearchQA
- TextbookQA
- TriviaQA
---
# Dataset Card for MultiReQA
## Table of Contents
- [Dataset Description](#dataset-description)
- [Dataset Summary](#dataset-summary)
- [Supported Tasks and Leaderboards](#supported-tasks-and-leaderboards)
- [Languages](#languages)
- [Dataset Structure](#dataset-structure)
- [Data Instances](#data-instances)
- [Data Fields](#data-fields)
- [Data Splits](#data-splits)
- [Dataset Creation](#dataset-creation)
- [Curation Rationale](#curation-rationale)
- [Source Data](#source-data)
- [Annotations](#annotations)
- [Personal and Sensitive Information](#personal-and-sensitive-information)
- [Considerations for Using the Data](#considerations-for-using-the-data)
- [Social Impact of Dataset](#social-impact-of-dataset)
- [Discussion of Biases](#discussion-of-biases)
- [Other Known Limitations](#other-known-limitations)
- [Additional Information](#additional-information)
- [Dataset Curators](#dataset-curators)
- [Licensing Information](#licensing-information)
- [Citation Information](#citation-information)
- [Contributions](#contributions)
## Dataset Description
- **Homepage:** https://github.com/google-research-datasets/MultiReQA
- **Repository:** https://github.com/google-research-datasets/MultiReQA
- **Paper:** https://arxiv.org/pdf/2005.02507.pdf
- **Leaderboard:**
- **Point of Contact:**
### Dataset Summary
MultiReQA contains the sentence boundary annotations from eight publicly available QA datasets: SearchQA, TriviaQA, HotpotQA, NaturalQuestions, SQuAD, BioASQ, RelationExtraction, and TextbookQA. Five of these datasets (SearchQA, TriviaQA, HotpotQA, NaturalQuestions, and SQuAD) contain both training and test data, and three (BioASQ, RelationExtraction, and TextbookQA) contain only test data (DuoRC is also included but is not specified in the official documentation).
### Supported Tasks and Leaderboards
- Question answering (QA)
- Retrieval question answering (ReQA)
### Languages
Sentence boundary annotations are provided for SearchQA, TriviaQA, HotpotQA, NaturalQuestions, SQuAD, BioASQ, RelationExtraction, TextbookQA, and DuoRC; the underlying QA data is in English.
## Dataset Structure
### Data Instances
The general format is:
```
{
  "candidate_id": <candidate_id>,
  "response_start": <response_start>,
  "response_end": <response_end>
}
...
```
An example from SearchQA:
```
{'candidate_id': 'SearchQA_000077f3912049dfb4511db271697bad/_0_1',
 'response_end': 306,
 'response_start': 243}
```
### Data Fields
```
{
  "candidate_id": <STRING>,
  "response_start": <INT>,
  "response_end": <INT>
}
...
```
- **candidate_id:** The candidate id of the candidate sentence. It consists of the original qid from the MRQA shared task.
- **response_start:** The start index of the sentence with respect to its original context.
- **response_end:** The end index of the sentence with respect to its original context.
### Data Splits
Train and Dev splits are available only for the following datasets:
- SearchQA
- TriviaQA
- HotpotQA
- SQuAD
- NaturalQuestions
Test splits are available only for the following datasets:
- BioASQ
- RelationExtraction
- TextbookQA
The number of candidate sentences for each dataset is shown in the table below.
| MultiReQA | train | test |
|--------------------|-----------|---------|
| SearchQA | 629,160 | 454,836 |
| TriviaQA | 335,659 | 238,339 |
| HotpotQA | 104,973 | 52,191 |
| SQuAD | 87,133 | 10,642 |
| NaturalQuestions | 106,521 | 22,118 |
| BioASQ | - | 14,158 |
| RelationExtraction | - | 3,301 |
| TextbookQA | - | 3,701 |
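As a minimal loading sketch (configuration names follow the lists above; note that BioASQ, RelationExtraction, and TextbookQA expose only a test split):
```python
from datasets import load_dataset

# load one of the configurations with train/validation splits
searchqa = load_dataset("multi_re_qa", "SearchQA")
print(searchqa["train"][0])
```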
## Dataset Creation
### Curation Rationale
MultiReQA is a new multi-domain ReQA evaluation suite composed of eight retrieval QA tasks drawn from publicly available QA datasets from the [MRQA shared task](https://mrqa.github.io/). The dataset was curated by converting existing QA datasets from the [MRQA shared task](https://mrqa.github.io/) to the format of the MultiReQA benchmark.
### Source Data
#### Initial Data Collection and Normalization
The initial data collection was performed by converting existing QA datasets from the MRQA shared task to the format of the MultiReQA benchmark.
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
The annotators/curators of the dataset are [mandyguo-xyguo](https://github.com/mandyguo-xyguo) and [mwurts4google](https://github.com/mwurts4google), the contributors of the official MultiReQA github repository
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
The annotators/curators of the dataset are [mandyguo-xyguo](https://github.com/mandyguo-xyguo) and [mwurts4google](https://github.com/mwurts4google), the contributors of the official MultiReQA github repository
### Licensing Information
[More Information Needed]
### Citation Information
```
@misc{m2020multireqa,
title={MultiReQA: A Cross-Domain Evaluation for Retrieval Question Answering Models},
author={Mandy Guo and Yinfei Yang and Daniel Cer and Qinlan Shen and Noah Constant},
year={2020},
eprint={2005.02507},
archivePrefix={arXiv},
primaryClass={cs.CL}
}
```
### Contributions
Thanks to [@Karthik-Bhaskar](https://github.com/Karthik-Bhaskar) for adding this dataset. |
open-llm-leaderboard/details_alnrg2arg__test_wanda_240109 | ---
pretty_name: Evaluation run of alnrg2arg/test_wanda_240109
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [alnrg2arg/test_wanda_240109](https://huggingface.co/alnrg2arg/test_wanda_240109)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configuration, each one coresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_alnrg2arg__test_wanda_240109\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-01-13T17:19:19.094893](https://huggingface.co/datasets/open-llm-leaderboard/details_alnrg2arg__test_wanda_240109/blob/main/results_2024-01-13T17-19-19.094893.json)(note\
\ that their might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.23401038489636644,\n\
\ \"acc_stderr\": 0.029968361313724278,\n \"acc_norm\": 0.23351347966222002,\n\
\ \"acc_norm_stderr\": 0.0307471687800331,\n \"mc1\": 1.0,\n \
\ \"mc1_stderr\": 0.0,\n \"mc2\": NaN,\n \"mc2_stderr\": NaN\n \
\ },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.22525597269624573,\n\
\ \"acc_stderr\": 0.012207839995407305,\n \"acc_norm\": 0.2295221843003413,\n\
\ \"acc_norm_stderr\": 0.012288926760890797\n },\n \"harness|hellaswag|10\"\
: {\n \"acc\": 0.25542720573590916,\n \"acc_stderr\": 0.004352098082984431,\n\
\ \"acc_norm\": 0.2526389165504879,\n \"acc_norm_stderr\": 0.004336375492801798\n\
\ },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.22,\n\
\ \"acc_stderr\": 0.04163331998932268,\n \"acc_norm\": 0.22,\n \
\ \"acc_norm_stderr\": 0.04163331998932268\n },\n \"harness|hendrycksTest-anatomy|5\"\
: {\n \"acc\": 0.18518518518518517,\n \"acc_stderr\": 0.03355677216313142,\n\
\ \"acc_norm\": 0.18518518518518517,\n \"acc_norm_stderr\": 0.03355677216313142\n\
\ },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.29605263157894735,\n\
\ \"acc_stderr\": 0.03715062154998904,\n \"acc_norm\": 0.29605263157894735,\n\
\ \"acc_norm_stderr\": 0.03715062154998904\n },\n \"harness|hendrycksTest-business_ethics|5\"\
: {\n \"acc\": 0.26,\n \"acc_stderr\": 0.04408440022768077,\n \
\ \"acc_norm\": 0.26,\n \"acc_norm_stderr\": 0.04408440022768077\n \
\ },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.21509433962264152,\n\
\ \"acc_stderr\": 0.02528839450289137,\n \"acc_norm\": 0.21509433962264152,\n\
\ \"acc_norm_stderr\": 0.02528839450289137\n },\n \"harness|hendrycksTest-college_biology|5\"\
: {\n \"acc\": 0.2569444444444444,\n \"acc_stderr\": 0.03653946969442099,\n\
\ \"acc_norm\": 0.2569444444444444,\n \"acc_norm_stderr\": 0.03653946969442099\n\
\ },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\":\
\ 0.2,\n \"acc_stderr\": 0.04020151261036845,\n \"acc_norm\": 0.2,\n\
\ \"acc_norm_stderr\": 0.04020151261036845\n },\n \"harness|hendrycksTest-college_computer_science|5\"\
: {\n \"acc\": 0.26,\n \"acc_stderr\": 0.0440844002276808,\n \
\ \"acc_norm\": 0.26,\n \"acc_norm_stderr\": 0.0440844002276808\n },\n\
\ \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.18,\n\
\ \"acc_stderr\": 0.038612291966536934,\n \"acc_norm\": 0.18,\n \
\ \"acc_norm_stderr\": 0.038612291966536934\n },\n \"harness|hendrycksTest-college_medicine|5\"\
: {\n \"acc\": 0.20809248554913296,\n \"acc_stderr\": 0.030952890217749874,\n\
\ \"acc_norm\": 0.20809248554913296,\n \"acc_norm_stderr\": 0.030952890217749874\n\
\ },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.21568627450980393,\n\
\ \"acc_stderr\": 0.04092563958237654,\n \"acc_norm\": 0.21568627450980393,\n\
\ \"acc_norm_stderr\": 0.04092563958237654\n },\n \"harness|hendrycksTest-computer_security|5\"\
: {\n \"acc\": 0.28,\n \"acc_stderr\": 0.045126085985421276,\n \
\ \"acc_norm\": 0.28,\n \"acc_norm_stderr\": 0.045126085985421276\n \
\ },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\":\
\ 0.26382978723404255,\n \"acc_stderr\": 0.028809989854102973,\n \"\
acc_norm\": 0.26382978723404255,\n \"acc_norm_stderr\": 0.028809989854102973\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.23684210526315788,\n\
\ \"acc_stderr\": 0.039994238792813365,\n \"acc_norm\": 0.23684210526315788,\n\
\ \"acc_norm_stderr\": 0.039994238792813365\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.2413793103448276,\n \"acc_stderr\": 0.03565998174135302,\n\
\ \"acc_norm\": 0.2413793103448276,\n \"acc_norm_stderr\": 0.03565998174135302\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.20899470899470898,\n \"acc_stderr\": 0.02094048156533486,\n \"\
acc_norm\": 0.20899470899470898,\n \"acc_norm_stderr\": 0.02094048156533486\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.2698412698412698,\n\
\ \"acc_stderr\": 0.03970158273235173,\n \"acc_norm\": 0.2698412698412698,\n\
\ \"acc_norm_stderr\": 0.03970158273235173\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.18,\n \"acc_stderr\": 0.038612291966536934,\n \
\ \"acc_norm\": 0.18,\n \"acc_norm_stderr\": 0.038612291966536934\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\"\
: 0.1774193548387097,\n \"acc_stderr\": 0.02173254068932927,\n \"\
acc_norm\": 0.1774193548387097,\n \"acc_norm_stderr\": 0.02173254068932927\n\
\ },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\"\
: 0.15763546798029557,\n \"acc_stderr\": 0.025639014131172404,\n \"\
acc_norm\": 0.15763546798029557,\n \"acc_norm_stderr\": 0.025639014131172404\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.17,\n \"acc_stderr\": 0.037752516806863715,\n \"acc_norm\"\
: 0.17,\n \"acc_norm_stderr\": 0.037752516806863715\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.21818181818181817,\n \"acc_stderr\": 0.03225078108306289,\n\
\ \"acc_norm\": 0.21818181818181817,\n \"acc_norm_stderr\": 0.03225078108306289\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.17676767676767677,\n \"acc_stderr\": 0.027178752639044915,\n \"\
acc_norm\": 0.17676767676767677,\n \"acc_norm_stderr\": 0.027178752639044915\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.19689119170984457,\n \"acc_stderr\": 0.028697873971860664,\n\
\ \"acc_norm\": 0.19689119170984457,\n \"acc_norm_stderr\": 0.028697873971860664\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.20256410256410257,\n \"acc_stderr\": 0.020377660970371372,\n\
\ \"acc_norm\": 0.20256410256410257,\n \"acc_norm_stderr\": 0.020377660970371372\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.2111111111111111,\n \"acc_stderr\": 0.024882116857655075,\n \
\ \"acc_norm\": 0.2111111111111111,\n \"acc_norm_stderr\": 0.024882116857655075\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.24789915966386555,\n \"acc_stderr\": 0.028047967224176892,\n\
\ \"acc_norm\": 0.24789915966386555,\n \"acc_norm_stderr\": 0.028047967224176892\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.1986754966887417,\n \"acc_stderr\": 0.03257847384436776,\n \"\
acc_norm\": 0.1986754966887417,\n \"acc_norm_stderr\": 0.03257847384436776\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.1926605504587156,\n \"acc_stderr\": 0.016909276884936094,\n \"\
acc_norm\": 0.1926605504587156,\n \"acc_norm_stderr\": 0.016909276884936094\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.1527777777777778,\n \"acc_stderr\": 0.024536326026134224,\n \"\
acc_norm\": 0.1527777777777778,\n \"acc_norm_stderr\": 0.024536326026134224\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.25,\n \"acc_stderr\": 0.03039153369274154,\n \"acc_norm\": 0.25,\n\
\ \"acc_norm_stderr\": 0.03039153369274154\n },\n \"harness|hendrycksTest-high_school_world_history|5\"\
: {\n \"acc\": 0.270042194092827,\n \"acc_stderr\": 0.028900721906293426,\n\
\ \"acc_norm\": 0.270042194092827,\n \"acc_norm_stderr\": 0.028900721906293426\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.31390134529147984,\n\
\ \"acc_stderr\": 0.031146796482972465,\n \"acc_norm\": 0.31390134529147984,\n\
\ \"acc_norm_stderr\": 0.031146796482972465\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.2900763358778626,\n \"acc_stderr\": 0.03980066246467765,\n\
\ \"acc_norm\": 0.2900763358778626,\n \"acc_norm_stderr\": 0.03980066246467765\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.2892561983471074,\n \"acc_stderr\": 0.041391127276354626,\n \"\
acc_norm\": 0.2892561983471074,\n \"acc_norm_stderr\": 0.041391127276354626\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.2962962962962963,\n\
\ \"acc_stderr\": 0.044143436668549335,\n \"acc_norm\": 0.2962962962962963,\n\
\ \"acc_norm_stderr\": 0.044143436668549335\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.22085889570552147,\n \"acc_stderr\": 0.032591773927421776,\n\
\ \"acc_norm\": 0.22085889570552147,\n \"acc_norm_stderr\": 0.032591773927421776\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.3125,\n\
\ \"acc_stderr\": 0.043994650575715215,\n \"acc_norm\": 0.3125,\n\
\ \"acc_norm_stderr\": 0.043994650575715215\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.17475728155339806,\n \"acc_stderr\": 0.037601780060266224,\n\
\ \"acc_norm\": 0.17475728155339806,\n \"acc_norm_stderr\": 0.037601780060266224\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.2905982905982906,\n\
\ \"acc_stderr\": 0.02974504857267404,\n \"acc_norm\": 0.2905982905982906,\n\
\ \"acc_norm_stderr\": 0.02974504857267404\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.26,\n \"acc_stderr\": 0.04408440022768079,\n \
\ \"acc_norm\": 0.26,\n \"acc_norm_stderr\": 0.04408440022768079\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.22349936143039592,\n\
\ \"acc_stderr\": 0.014897235229450707,\n \"acc_norm\": 0.22349936143039592,\n\
\ \"acc_norm_stderr\": 0.014897235229450707\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.26878612716763006,\n \"acc_stderr\": 0.023868003262500114,\n\
\ \"acc_norm\": 0.26878612716763006,\n \"acc_norm_stderr\": 0.023868003262500114\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.2435754189944134,\n\
\ \"acc_stderr\": 0.014355911964767864,\n \"acc_norm\": 0.2435754189944134,\n\
\ \"acc_norm_stderr\": 0.014355911964767864\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.22549019607843138,\n \"acc_stderr\": 0.023929155517351284,\n\
\ \"acc_norm\": 0.22549019607843138,\n \"acc_norm_stderr\": 0.023929155517351284\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.2508038585209003,\n\
\ \"acc_stderr\": 0.024619771956697165,\n \"acc_norm\": 0.2508038585209003,\n\
\ \"acc_norm_stderr\": 0.024619771956697165\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.21604938271604937,\n \"acc_stderr\": 0.022899162918445806,\n\
\ \"acc_norm\": 0.21604938271604937,\n \"acc_norm_stderr\": 0.022899162918445806\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.23404255319148937,\n \"acc_stderr\": 0.025257861359432417,\n \
\ \"acc_norm\": 0.23404255319148937,\n \"acc_norm_stderr\": 0.025257861359432417\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.2457627118644068,\n\
\ \"acc_stderr\": 0.010996156635142692,\n \"acc_norm\": 0.2457627118644068,\n\
\ \"acc_norm_stderr\": 0.010996156635142692\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.18382352941176472,\n \"acc_stderr\": 0.023529242185193106,\n\
\ \"acc_norm\": 0.18382352941176472,\n \"acc_norm_stderr\": 0.023529242185193106\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.25,\n \"acc_stderr\": 0.01751781884501444,\n \"acc_norm\"\
: 0.25,\n \"acc_norm_stderr\": 0.01751781884501444\n },\n \"harness|hendrycksTest-public_relations|5\"\
: {\n \"acc\": 0.21818181818181817,\n \"acc_stderr\": 0.03955932861795833,\n\
\ \"acc_norm\": 0.21818181818181817,\n \"acc_norm_stderr\": 0.03955932861795833\n\
\ },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.18775510204081633,\n\
\ \"acc_stderr\": 0.02500025603954621,\n \"acc_norm\": 0.18775510204081633,\n\
\ \"acc_norm_stderr\": 0.02500025603954621\n },\n \"harness|hendrycksTest-sociology|5\"\
: {\n \"acc\": 0.24378109452736318,\n \"acc_stderr\": 0.03036049015401465,\n\
\ \"acc_norm\": 0.24378109452736318,\n \"acc_norm_stderr\": 0.03036049015401465\n\
\ },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\":\
\ 0.25,\n \"acc_stderr\": 0.04351941398892446,\n \"acc_norm\": 0.25,\n\
\ \"acc_norm_stderr\": 0.04351941398892446\n },\n \"harness|hendrycksTest-virology|5\"\
: {\n \"acc\": 0.28313253012048195,\n \"acc_stderr\": 0.03507295431370518,\n\
\ \"acc_norm\": 0.28313253012048195,\n \"acc_norm_stderr\": 0.03507295431370518\n\
\ },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.3216374269005848,\n\
\ \"acc_stderr\": 0.03582529442573122,\n \"acc_norm\": 0.3216374269005848,\n\
\ \"acc_norm_stderr\": 0.03582529442573122\n },\n \"harness|truthfulqa:mc|0\"\
: {\n \"mc1\": 1.0,\n \"mc1_stderr\": 0.0,\n \"mc2\": NaN,\n\
\ \"mc2_stderr\": NaN\n },\n \"harness|winogrande|5\": {\n \"\
acc\": 0.4988161010260458,\n \"acc_stderr\": 0.014052446290529019\n },\n\
\ \"harness|gsm8k|5\": {\n \"acc\": 0.0,\n \"acc_stderr\": 0.0\n\
\ }\n}\n```"
repo_url: https://huggingface.co/alnrg2arg/test_wanda_240109
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_01_13T17_14_43.764095
path:
- '**/details_harness|arc:challenge|25_2024-01-13T17-14-43.764095.parquet'
- split: 2024_01_13T17_19_19.094893
path:
- '**/details_harness|arc:challenge|25_2024-01-13T17-19-19.094893.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-01-13T17-19-19.094893.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_01_13T17_14_43.764095
path:
- '**/details_harness|gsm8k|5_2024-01-13T17-14-43.764095.parquet'
- split: 2024_01_13T17_19_19.094893
path:
- '**/details_harness|gsm8k|5_2024-01-13T17-19-19.094893.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-01-13T17-19-19.094893.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_01_13T17_14_43.764095
path:
- '**/details_harness|hellaswag|10_2024-01-13T17-14-43.764095.parquet'
- split: 2024_01_13T17_19_19.094893
path:
- '**/details_harness|hellaswag|10_2024-01-13T17-19-19.094893.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-01-13T17-19-19.094893.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_01_13T17_14_43.764095
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-13T17-14-43.764095.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-13T17-14-43.764095.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-13T17-14-43.764095.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-13T17-14-43.764095.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-13T17-14-43.764095.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-13T17-14-43.764095.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-13T17-14-43.764095.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-13T17-14-43.764095.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-13T17-14-43.764095.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-13T17-14-43.764095.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-13T17-14-43.764095.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-13T17-14-43.764095.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-13T17-14-43.764095.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-13T17-14-43.764095.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-13T17-14-43.764095.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-13T17-14-43.764095.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-13T17-14-43.764095.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-13T17-14-43.764095.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-13T17-14-43.764095.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-13T17-14-43.764095.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-13T17-14-43.764095.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-13T17-14-43.764095.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-13T17-14-43.764095.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-13T17-14-43.764095.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-13T17-14-43.764095.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-13T17-14-43.764095.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-13T17-14-43.764095.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-13T17-14-43.764095.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-13T17-14-43.764095.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-13T17-14-43.764095.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-13T17-14-43.764095.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-13T17-14-43.764095.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-13T17-14-43.764095.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-13T17-14-43.764095.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-01-13T17-14-43.764095.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-13T17-14-43.764095.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-13T17-14-43.764095.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-13T17-14-43.764095.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-01-13T17-14-43.764095.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-01-13T17-14-43.764095.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-13T17-14-43.764095.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-13T17-14-43.764095.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-13T17-14-43.764095.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-13T17-14-43.764095.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-13T17-14-43.764095.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-13T17-14-43.764095.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-13T17-14-43.764095.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-13T17-14-43.764095.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-13T17-14-43.764095.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-13T17-14-43.764095.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-13T17-14-43.764095.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-13T17-14-43.764095.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-13T17-14-43.764095.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-01-13T17-14-43.764095.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-13T17-14-43.764095.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-01-13T17-14-43.764095.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-13T17-14-43.764095.parquet'
- split: 2024_01_13T17_19_19.094893
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-13T17-19-19.094893.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-13T17-19-19.094893.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-13T17-19-19.094893.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-13T17-19-19.094893.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-13T17-19-19.094893.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-13T17-19-19.094893.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-13T17-19-19.094893.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-13T17-19-19.094893.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-13T17-19-19.094893.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-13T17-19-19.094893.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-13T17-19-19.094893.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-13T17-19-19.094893.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-13T17-19-19.094893.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-13T17-19-19.094893.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-13T17-19-19.094893.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-13T17-19-19.094893.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-13T17-19-19.094893.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-13T17-19-19.094893.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-13T17-19-19.094893.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-13T17-19-19.094893.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-13T17-19-19.094893.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-13T17-19-19.094893.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-13T17-19-19.094893.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-13T17-19-19.094893.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-13T17-19-19.094893.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-13T17-19-19.094893.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-13T17-19-19.094893.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-13T17-19-19.094893.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-13T17-19-19.094893.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-13T17-19-19.094893.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-13T17-19-19.094893.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-13T17-19-19.094893.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-13T17-19-19.094893.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-13T17-19-19.094893.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-01-13T17-19-19.094893.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-13T17-19-19.094893.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-13T17-19-19.094893.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-13T17-19-19.094893.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-01-13T17-19-19.094893.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-01-13T17-19-19.094893.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-13T17-19-19.094893.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-13T17-19-19.094893.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-13T17-19-19.094893.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-13T17-19-19.094893.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-13T17-19-19.094893.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-13T17-19-19.094893.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-13T17-19-19.094893.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-13T17-19-19.094893.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-13T17-19-19.094893.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-13T17-19-19.094893.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-13T17-19-19.094893.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-13T17-19-19.094893.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-13T17-19-19.094893.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-01-13T17-19-19.094893.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-13T17-19-19.094893.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-01-13T17-19-19.094893.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-13T17-19-19.094893.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-13T17-19-19.094893.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-13T17-19-19.094893.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-13T17-19-19.094893.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-13T17-19-19.094893.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-13T17-19-19.094893.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-13T17-19-19.094893.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-13T17-19-19.094893.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-13T17-19-19.094893.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-13T17-19-19.094893.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-13T17-19-19.094893.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-13T17-19-19.094893.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-13T17-19-19.094893.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-13T17-19-19.094893.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-13T17-19-19.094893.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-13T17-19-19.094893.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-13T17-19-19.094893.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-13T17-19-19.094893.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-13T17-19-19.094893.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-13T17-19-19.094893.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-13T17-19-19.094893.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-13T17-19-19.094893.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-13T17-19-19.094893.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-13T17-19-19.094893.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-13T17-19-19.094893.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-13T17-19-19.094893.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-13T17-19-19.094893.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-13T17-19-19.094893.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-13T17-19-19.094893.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-13T17-19-19.094893.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-13T17-19-19.094893.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-13T17-19-19.094893.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-13T17-19-19.094893.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-13T17-19-19.094893.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-13T17-19-19.094893.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-01-13T17-19-19.094893.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-13T17-19-19.094893.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-13T17-19-19.094893.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-13T17-19-19.094893.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-01-13T17-19-19.094893.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-01-13T17-19-19.094893.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-13T17-19-19.094893.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-13T17-19-19.094893.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-13T17-19-19.094893.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-13T17-19-19.094893.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-13T17-19-19.094893.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-13T17-19-19.094893.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-13T17-19-19.094893.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-13T17-19-19.094893.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-13T17-19-19.094893.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-13T17-19-19.094893.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-13T17-19-19.094893.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-13T17-19-19.094893.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-13T17-19-19.094893.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-01-13T17-19-19.094893.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-13T17-19-19.094893.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-01-13T17-19-19.094893.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-13T17-19-19.094893.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_01_13T17_14_43.764095
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-13T17-14-43.764095.parquet'
- split: 2024_01_13T17_19_19.094893
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-13T17-19-19.094893.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-13T17-19-19.094893.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_01_13T17_14_43.764095
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-13T17-14-43.764095.parquet'
- split: 2024_01_13T17_19_19.094893
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-13T17-19-19.094893.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-13T17-19-19.094893.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_01_13T17_14_43.764095
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-13T17-14-43.764095.parquet'
- split: 2024_01_13T17_19_19.094893
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-13T17-19-19.094893.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-13T17-19-19.094893.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_01_13T17_14_43.764095
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-13T17-14-43.764095.parquet'
- split: 2024_01_13T17_19_19.094893
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-13T17-19-19.094893.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-13T17-19-19.094893.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_01_13T17_14_43.764095
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-13T17-14-43.764095.parquet'
- split: 2024_01_13T17_19_19.094893
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-13T17-19-19.094893.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-13T17-19-19.094893.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_01_13T17_14_43.764095
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-13T17-14-43.764095.parquet'
- split: 2024_01_13T17_19_19.094893
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-13T17-19-19.094893.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-13T17-19-19.094893.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_01_13T17_14_43.764095
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-13T17-14-43.764095.parquet'
- split: 2024_01_13T17_19_19.094893
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-13T17-19-19.094893.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-13T17-19-19.094893.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_01_13T17_14_43.764095
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-13T17-14-43.764095.parquet'
- split: 2024_01_13T17_19_19.094893
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-13T17-19-19.094893.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-13T17-19-19.094893.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_01_13T17_14_43.764095
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-13T17-14-43.764095.parquet'
- split: 2024_01_13T17_19_19.094893
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-13T17-19-19.094893.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-13T17-19-19.094893.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_01_13T17_14_43.764095
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-13T17-14-43.764095.parquet'
- split: 2024_01_13T17_19_19.094893
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-13T17-19-19.094893.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-13T17-19-19.094893.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_01_13T17_14_43.764095
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-13T17-14-43.764095.parquet'
- split: 2024_01_13T17_19_19.094893
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-13T17-19-19.094893.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-13T17-19-19.094893.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_01_13T17_14_43.764095
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-13T17-14-43.764095.parquet'
- split: 2024_01_13T17_19_19.094893
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-13T17-19-19.094893.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-13T17-19-19.094893.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_01_13T17_14_43.764095
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-13T17-14-43.764095.parquet'
- split: 2024_01_13T17_19_19.094893
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-13T17-19-19.094893.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-13T17-19-19.094893.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_01_13T17_14_43.764095
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-13T17-14-43.764095.parquet'
- split: 2024_01_13T17_19_19.094893
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-13T17-19-19.094893.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-13T17-19-19.094893.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_01_13T17_14_43.764095
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-13T17-14-43.764095.parquet'
- split: 2024_01_13T17_19_19.094893
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-13T17-19-19.094893.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-13T17-19-19.094893.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_01_13T17_14_43.764095
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-13T17-14-43.764095.parquet'
- split: 2024_01_13T17_19_19.094893
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-13T17-19-19.094893.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-13T17-19-19.094893.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_01_13T17_14_43.764095
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-13T17-14-43.764095.parquet'
- split: 2024_01_13T17_19_19.094893
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-13T17-19-19.094893.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-13T17-19-19.094893.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_01_13T17_14_43.764095
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-13T17-14-43.764095.parquet'
- split: 2024_01_13T17_19_19.094893
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-13T17-19-19.094893.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-13T17-19-19.094893.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_01_13T17_14_43.764095
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-13T17-14-43.764095.parquet'
- split: 2024_01_13T17_19_19.094893
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-13T17-19-19.094893.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-13T17-19-19.094893.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_01_13T17_14_43.764095
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-13T17-14-43.764095.parquet'
- split: 2024_01_13T17_19_19.094893
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-13T17-19-19.094893.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-13T17-19-19.094893.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_01_13T17_14_43.764095
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-13T17-14-43.764095.parquet'
- split: 2024_01_13T17_19_19.094893
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-13T17-19-19.094893.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-13T17-19-19.094893.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_01_13T17_14_43.764095
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-13T17-14-43.764095.parquet'
- split: 2024_01_13T17_19_19.094893
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-13T17-19-19.094893.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-13T17-19-19.094893.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_01_13T17_14_43.764095
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-13T17-14-43.764095.parquet'
- split: 2024_01_13T17_19_19.094893
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-13T17-19-19.094893.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-13T17-19-19.094893.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_01_13T17_14_43.764095
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-13T17-14-43.764095.parquet'
- split: 2024_01_13T17_19_19.094893
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-13T17-19-19.094893.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-13T17-19-19.094893.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_01_13T17_14_43.764095
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-13T17-14-43.764095.parquet'
- split: 2024_01_13T17_19_19.094893
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-13T17-19-19.094893.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-13T17-19-19.094893.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_01_13T17_14_43.764095
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-13T17-14-43.764095.parquet'
- split: 2024_01_13T17_19_19.094893
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-13T17-19-19.094893.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-13T17-19-19.094893.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_01_13T17_14_43.764095
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-13T17-14-43.764095.parquet'
- split: 2024_01_13T17_19_19.094893
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-13T17-19-19.094893.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-13T17-19-19.094893.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_01_13T17_14_43.764095
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-13T17-14-43.764095.parquet'
- split: 2024_01_13T17_19_19.094893
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-13T17-19-19.094893.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-13T17-19-19.094893.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_01_13T17_14_43.764095
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-13T17-14-43.764095.parquet'
- split: 2024_01_13T17_19_19.094893
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-13T17-19-19.094893.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-13T17-19-19.094893.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_01_13T17_14_43.764095
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-13T17-14-43.764095.parquet'
- split: 2024_01_13T17_19_19.094893
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-13T17-19-19.094893.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-13T17-19-19.094893.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_01_13T17_14_43.764095
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-13T17-14-43.764095.parquet'
- split: 2024_01_13T17_19_19.094893
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-13T17-19-19.094893.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-13T17-19-19.094893.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_01_13T17_14_43.764095
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-13T17-14-43.764095.parquet'
- split: 2024_01_13T17_19_19.094893
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-13T17-19-19.094893.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-13T17-19-19.094893.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_01_13T17_14_43.764095
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-13T17-14-43.764095.parquet'
- split: 2024_01_13T17_19_19.094893
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-13T17-19-19.094893.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-13T17-19-19.094893.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_01_13T17_14_43.764095
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-13T17-14-43.764095.parquet'
- split: 2024_01_13T17_19_19.094893
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-13T17-19-19.094893.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-13T17-19-19.094893.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_01_13T17_14_43.764095
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-01-13T17-14-43.764095.parquet'
- split: 2024_01_13T17_19_19.094893
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-01-13T17-19-19.094893.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-01-13T17-19-19.094893.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_01_13T17_14_43.764095
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-13T17-14-43.764095.parquet'
- split: 2024_01_13T17_19_19.094893
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-13T17-19-19.094893.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-13T17-19-19.094893.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_01_13T17_14_43.764095
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-13T17-14-43.764095.parquet'
- split: 2024_01_13T17_19_19.094893
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-13T17-19-19.094893.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-13T17-19-19.094893.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_01_13T17_14_43.764095
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-13T17-14-43.764095.parquet'
- split: 2024_01_13T17_19_19.094893
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-13T17-19-19.094893.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-13T17-19-19.094893.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_01_13T17_14_43.764095
path:
- '**/details_harness|hendrycksTest-management|5_2024-01-13T17-14-43.764095.parquet'
- split: 2024_01_13T17_19_19.094893
path:
- '**/details_harness|hendrycksTest-management|5_2024-01-13T17-19-19.094893.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-01-13T17-19-19.094893.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_01_13T17_14_43.764095
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-01-13T17-14-43.764095.parquet'
- split: 2024_01_13T17_19_19.094893
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-01-13T17-19-19.094893.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-01-13T17-19-19.094893.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_01_13T17_14_43.764095
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-13T17-14-43.764095.parquet'
- split: 2024_01_13T17_19_19.094893
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-13T17-19-19.094893.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-13T17-19-19.094893.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_01_13T17_14_43.764095
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-13T17-14-43.764095.parquet'
- split: 2024_01_13T17_19_19.094893
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-13T17-19-19.094893.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-13T17-19-19.094893.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_01_13T17_14_43.764095
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-13T17-14-43.764095.parquet'
- split: 2024_01_13T17_19_19.094893
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-13T17-19-19.094893.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-13T17-19-19.094893.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_01_13T17_14_43.764095
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-13T17-14-43.764095.parquet'
- split: 2024_01_13T17_19_19.094893
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-13T17-19-19.094893.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-13T17-19-19.094893.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_01_13T17_14_43.764095
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-13T17-14-43.764095.parquet'
- split: 2024_01_13T17_19_19.094893
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-13T17-19-19.094893.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-13T17-19-19.094893.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_01_13T17_14_43.764095
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-13T17-14-43.764095.parquet'
- split: 2024_01_13T17_19_19.094893
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-13T17-19-19.094893.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-13T17-19-19.094893.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_01_13T17_14_43.764095
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-13T17-14-43.764095.parquet'
- split: 2024_01_13T17_19_19.094893
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-13T17-19-19.094893.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-13T17-19-19.094893.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_01_13T17_14_43.764095
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-13T17-14-43.764095.parquet'
- split: 2024_01_13T17_19_19.094893
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-13T17-19-19.094893.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-13T17-19-19.094893.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_01_13T17_14_43.764095
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-13T17-14-43.764095.parquet'
- split: 2024_01_13T17_19_19.094893
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-13T17-19-19.094893.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-13T17-19-19.094893.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_01_13T17_14_43.764095
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-13T17-14-43.764095.parquet'
- split: 2024_01_13T17_19_19.094893
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-13T17-19-19.094893.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-13T17-19-19.094893.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_01_13T17_14_43.764095
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-13T17-14-43.764095.parquet'
- split: 2024_01_13T17_19_19.094893
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-13T17-19-19.094893.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-13T17-19-19.094893.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_01_13T17_14_43.764095
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-13T17-14-43.764095.parquet'
- split: 2024_01_13T17_19_19.094893
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-13T17-19-19.094893.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-13T17-19-19.094893.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_01_13T17_14_43.764095
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-13T17-14-43.764095.parquet'
- split: 2024_01_13T17_19_19.094893
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-13T17-19-19.094893.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-13T17-19-19.094893.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_01_13T17_14_43.764095
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-01-13T17-14-43.764095.parquet'
- split: 2024_01_13T17_19_19.094893
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-01-13T17-19-19.094893.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-01-13T17-19-19.094893.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_01_13T17_14_43.764095
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-13T17-14-43.764095.parquet'
- split: 2024_01_13T17_19_19.094893
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-13T17-19-19.094893.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-13T17-19-19.094893.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_01_13T17_14_43.764095
path:
- '**/details_harness|hendrycksTest-virology|5_2024-01-13T17-14-43.764095.parquet'
- split: 2024_01_13T17_19_19.094893
path:
- '**/details_harness|hendrycksTest-virology|5_2024-01-13T17-19-19.094893.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-01-13T17-19-19.094893.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_01_13T17_14_43.764095
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-13T17-14-43.764095.parquet'
- split: 2024_01_13T17_19_19.094893
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-13T17-19-19.094893.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-13T17-19-19.094893.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_01_13T17_14_43.764095
path:
- '**/details_harness|truthfulqa:mc|0_2024-01-13T17-14-43.764095.parquet'
- split: 2024_01_13T17_19_19.094893
path:
- '**/details_harness|truthfulqa:mc|0_2024-01-13T17-19-19.094893.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-01-13T17-19-19.094893.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_01_13T17_14_43.764095
path:
- '**/details_harness|winogrande|5_2024-01-13T17-14-43.764095.parquet'
- split: 2024_01_13T17_19_19.094893
path:
- '**/details_harness|winogrande|5_2024-01-13T17-19-19.094893.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-01-13T17-19-19.094893.parquet'
- config_name: results
data_files:
- split: 2024_01_13T17_14_43.764095
path:
- results_2024-01-13T17-14-43.764095.parquet
- split: 2024_01_13T17_19_19.094893
path:
- results_2024-01-13T17-19-19.094893.parquet
- split: latest
path:
- results_2024-01-13T17-19-19.094893.parquet
---
# Dataset Card for Evaluation run of alnrg2arg/test_wanda_240109
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [alnrg2arg/test_wanda_240109](https://huggingface.co/alnrg2arg/test_wanda_240109) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the most recent results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_alnrg2arg__test_wanda_240109",
"harness_winogrande_5",
split="train")
```
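To enumerate the 63 task configurations programmatically, a minimal sketch using the `datasets` utilities (nothing here is specific to this repo beyond its id):
```python
from datasets import get_dataset_config_names

# List every evaluation config available in this dataset repository
configs = get_dataset_config_names(
    "open-llm-leaderboard/details_alnrg2arg__test_wanda_240109"
)
print(len(configs))
print(configs[:5])
```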
## Latest results
These are the [latest results from run 2024-01-13T17:19:19.094893](https://huggingface.co/datasets/open-llm-leaderboard/details_alnrg2arg__test_wanda_240109/blob/main/results_2024-01-13T17-19-19.094893.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.23401038489636644,
"acc_stderr": 0.029968361313724278,
"acc_norm": 0.23351347966222002,
"acc_norm_stderr": 0.0307471687800331,
"mc1": 1.0,
"mc1_stderr": 0.0,
"mc2": NaN,
"mc2_stderr": NaN
},
"harness|arc:challenge|25": {
"acc": 0.22525597269624573,
"acc_stderr": 0.012207839995407305,
"acc_norm": 0.2295221843003413,
"acc_norm_stderr": 0.012288926760890797
},
"harness|hellaswag|10": {
"acc": 0.25542720573590916,
"acc_stderr": 0.004352098082984431,
"acc_norm": 0.2526389165504879,
"acc_norm_stderr": 0.004336375492801798
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.22,
"acc_stderr": 0.04163331998932268,
"acc_norm": 0.22,
"acc_norm_stderr": 0.04163331998932268
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.18518518518518517,
"acc_stderr": 0.03355677216313142,
"acc_norm": 0.18518518518518517,
"acc_norm_stderr": 0.03355677216313142
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.29605263157894735,
"acc_stderr": 0.03715062154998904,
"acc_norm": 0.29605263157894735,
"acc_norm_stderr": 0.03715062154998904
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.26,
"acc_stderr": 0.04408440022768077,
"acc_norm": 0.26,
"acc_norm_stderr": 0.04408440022768077
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.21509433962264152,
"acc_stderr": 0.02528839450289137,
"acc_norm": 0.21509433962264152,
"acc_norm_stderr": 0.02528839450289137
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.2569444444444444,
"acc_stderr": 0.03653946969442099,
"acc_norm": 0.2569444444444444,
"acc_norm_stderr": 0.03653946969442099
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.2,
"acc_stderr": 0.04020151261036845,
"acc_norm": 0.2,
"acc_norm_stderr": 0.04020151261036845
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.26,
"acc_stderr": 0.0440844002276808,
"acc_norm": 0.26,
"acc_norm_stderr": 0.0440844002276808
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.18,
"acc_stderr": 0.038612291966536934,
"acc_norm": 0.18,
"acc_norm_stderr": 0.038612291966536934
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.20809248554913296,
"acc_stderr": 0.030952890217749874,
"acc_norm": 0.20809248554913296,
"acc_norm_stderr": 0.030952890217749874
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.21568627450980393,
"acc_stderr": 0.04092563958237654,
"acc_norm": 0.21568627450980393,
"acc_norm_stderr": 0.04092563958237654
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.28,
"acc_stderr": 0.045126085985421276,
"acc_norm": 0.28,
"acc_norm_stderr": 0.045126085985421276
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.26382978723404255,
"acc_stderr": 0.028809989854102973,
"acc_norm": 0.26382978723404255,
"acc_norm_stderr": 0.028809989854102973
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.23684210526315788,
"acc_stderr": 0.039994238792813365,
"acc_norm": 0.23684210526315788,
"acc_norm_stderr": 0.039994238792813365
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.2413793103448276,
"acc_stderr": 0.03565998174135302,
"acc_norm": 0.2413793103448276,
"acc_norm_stderr": 0.03565998174135302
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.20899470899470898,
"acc_stderr": 0.02094048156533486,
"acc_norm": 0.20899470899470898,
"acc_norm_stderr": 0.02094048156533486
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.2698412698412698,
"acc_stderr": 0.03970158273235173,
"acc_norm": 0.2698412698412698,
"acc_norm_stderr": 0.03970158273235173
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.18,
"acc_stderr": 0.038612291966536934,
"acc_norm": 0.18,
"acc_norm_stderr": 0.038612291966536934
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.1774193548387097,
"acc_stderr": 0.02173254068932927,
"acc_norm": 0.1774193548387097,
"acc_norm_stderr": 0.02173254068932927
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.15763546798029557,
"acc_stderr": 0.025639014131172404,
"acc_norm": 0.15763546798029557,
"acc_norm_stderr": 0.025639014131172404
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.17,
"acc_stderr": 0.037752516806863715,
"acc_norm": 0.17,
"acc_norm_stderr": 0.037752516806863715
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.21818181818181817,
"acc_stderr": 0.03225078108306289,
"acc_norm": 0.21818181818181817,
"acc_norm_stderr": 0.03225078108306289
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.17676767676767677,
"acc_stderr": 0.027178752639044915,
"acc_norm": 0.17676767676767677,
"acc_norm_stderr": 0.027178752639044915
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.19689119170984457,
"acc_stderr": 0.028697873971860664,
"acc_norm": 0.19689119170984457,
"acc_norm_stderr": 0.028697873971860664
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.20256410256410257,
"acc_stderr": 0.020377660970371372,
"acc_norm": 0.20256410256410257,
"acc_norm_stderr": 0.020377660970371372
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.2111111111111111,
"acc_stderr": 0.024882116857655075,
"acc_norm": 0.2111111111111111,
"acc_norm_stderr": 0.024882116857655075
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.24789915966386555,
"acc_stderr": 0.028047967224176892,
"acc_norm": 0.24789915966386555,
"acc_norm_stderr": 0.028047967224176892
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.1986754966887417,
"acc_stderr": 0.03257847384436776,
"acc_norm": 0.1986754966887417,
"acc_norm_stderr": 0.03257847384436776
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.1926605504587156,
"acc_stderr": 0.016909276884936094,
"acc_norm": 0.1926605504587156,
"acc_norm_stderr": 0.016909276884936094
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.1527777777777778,
"acc_stderr": 0.024536326026134224,
"acc_norm": 0.1527777777777778,
"acc_norm_stderr": 0.024536326026134224
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.25,
"acc_stderr": 0.03039153369274154,
"acc_norm": 0.25,
"acc_norm_stderr": 0.03039153369274154
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.270042194092827,
"acc_stderr": 0.028900721906293426,
"acc_norm": 0.270042194092827,
"acc_norm_stderr": 0.028900721906293426
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.31390134529147984,
"acc_stderr": 0.031146796482972465,
"acc_norm": 0.31390134529147984,
"acc_norm_stderr": 0.031146796482972465
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.2900763358778626,
"acc_stderr": 0.03980066246467765,
"acc_norm": 0.2900763358778626,
"acc_norm_stderr": 0.03980066246467765
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.2892561983471074,
"acc_stderr": 0.041391127276354626,
"acc_norm": 0.2892561983471074,
"acc_norm_stderr": 0.041391127276354626
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.2962962962962963,
"acc_stderr": 0.044143436668549335,
"acc_norm": 0.2962962962962963,
"acc_norm_stderr": 0.044143436668549335
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.22085889570552147,
"acc_stderr": 0.032591773927421776,
"acc_norm": 0.22085889570552147,
"acc_norm_stderr": 0.032591773927421776
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.3125,
"acc_stderr": 0.043994650575715215,
"acc_norm": 0.3125,
"acc_norm_stderr": 0.043994650575715215
},
"harness|hendrycksTest-management|5": {
"acc": 0.17475728155339806,
"acc_stderr": 0.037601780060266224,
"acc_norm": 0.17475728155339806,
"acc_norm_stderr": 0.037601780060266224
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.2905982905982906,
"acc_stderr": 0.02974504857267404,
"acc_norm": 0.2905982905982906,
"acc_norm_stderr": 0.02974504857267404
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.26,
"acc_stderr": 0.04408440022768079,
"acc_norm": 0.26,
"acc_norm_stderr": 0.04408440022768079
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.22349936143039592,
"acc_stderr": 0.014897235229450707,
"acc_norm": 0.22349936143039592,
"acc_norm_stderr": 0.014897235229450707
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.26878612716763006,
"acc_stderr": 0.023868003262500114,
"acc_norm": 0.26878612716763006,
"acc_norm_stderr": 0.023868003262500114
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.2435754189944134,
"acc_stderr": 0.014355911964767864,
"acc_norm": 0.2435754189944134,
"acc_norm_stderr": 0.014355911964767864
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.22549019607843138,
"acc_stderr": 0.023929155517351284,
"acc_norm": 0.22549019607843138,
"acc_norm_stderr": 0.023929155517351284
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.2508038585209003,
"acc_stderr": 0.024619771956697165,
"acc_norm": 0.2508038585209003,
"acc_norm_stderr": 0.024619771956697165
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.21604938271604937,
"acc_stderr": 0.022899162918445806,
"acc_norm": 0.21604938271604937,
"acc_norm_stderr": 0.022899162918445806
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.23404255319148937,
"acc_stderr": 0.025257861359432417,
"acc_norm": 0.23404255319148937,
"acc_norm_stderr": 0.025257861359432417
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.2457627118644068,
"acc_stderr": 0.010996156635142692,
"acc_norm": 0.2457627118644068,
"acc_norm_stderr": 0.010996156635142692
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.18382352941176472,
"acc_stderr": 0.023529242185193106,
"acc_norm": 0.18382352941176472,
"acc_norm_stderr": 0.023529242185193106
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.25,
"acc_stderr": 0.01751781884501444,
"acc_norm": 0.25,
"acc_norm_stderr": 0.01751781884501444
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.21818181818181817,
"acc_stderr": 0.03955932861795833,
"acc_norm": 0.21818181818181817,
"acc_norm_stderr": 0.03955932861795833
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.18775510204081633,
"acc_stderr": 0.02500025603954621,
"acc_norm": 0.18775510204081633,
"acc_norm_stderr": 0.02500025603954621
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.24378109452736318,
"acc_stderr": 0.03036049015401465,
"acc_norm": 0.24378109452736318,
"acc_norm_stderr": 0.03036049015401465
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.25,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.25,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-virology|5": {
"acc": 0.28313253012048195,
"acc_stderr": 0.03507295431370518,
"acc_norm": 0.28313253012048195,
"acc_norm_stderr": 0.03507295431370518
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.3216374269005848,
"acc_stderr": 0.03582529442573122,
"acc_norm": 0.3216374269005848,
"acc_norm_stderr": 0.03582529442573122
},
"harness|truthfulqa:mc|0": {
"mc1": 1.0,
"mc1_stderr": 0.0,
"mc2": NaN,
"mc2_stderr": NaN
},
"harness|winogrande|5": {
"acc": 0.4988161010260458,
"acc_stderr": 0.014052446290529019
},
"harness|gsm8k|5": {
"acc": 0.0,
"acc_stderr": 0.0
}
}
```
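To pull these aggregated numbers into a table for comparison or plotting, one option is to download the results file directly. A minimal sketch, assuming `pandas` is installed and that the per-task metrics sit under a top-level `results` key (as in other leaderboard detail repos):
```python
import json

import pandas as pd
from huggingface_hub import hf_hub_download

# Fetch the latest aggregated results file from this dataset repo
path = hf_hub_download(
    repo_id="open-llm-leaderboard/details_alnrg2arg__test_wanda_240109",
    repo_type="dataset",
    filename="results_2024-01-13T17-19-19.094893.json",
)
with open(path) as f:
    payload = json.load(f)

# One row per task; columns are the reported metrics (acc, acc_stderr, ...)
df = pd.DataFrame(payload["results"]).T
print(df[["acc", "acc_stderr"]].dropna().sort_values("acc").tail())
```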
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
aditijha/instruct_v3_10k | ---
dataset_info:
features:
- name: prompt
dtype: string
- name: response
dtype: string
- name: source
dtype: string
splits:
- name: train
num_bytes: 39309622.55416882
num_examples: 10000
download_size: 23617961
dataset_size: 39309622.55416882
---
# Dataset Card for "instruct_v3_10k"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
nlpso/m0_qualitative_analysis_ref_ptrn_cmbert_io | ---
language:
- fr
multilinguality:
- monolingual
task_categories:
- token-classification
---
# m0_qualitative_analysis_ref_ptrn_cmbert_io
## Introduction
This dataset was used to perform **qualitative analysis** of [HueyNemud/das22-10-camembert_pretrained](https://huggingface.co/HueyNemud/das22-10-camembert_pretrained) on the **flat NER task** using the flat NER approach [M0].
It contains entries from 19th-century Paris trade directories.
## Dataset parameters
* Approach : M0
* Dataset type : ground-truth
* Tokenizer : [HueyNemud/das22-10-camembert_pretrained](https://huggingface.co/HueyNemud/das22-10-camembert_pretrained)
* Tagging format : IO
* Counts :
* Train : 6084
* Dev : 676
* Test : 1685
* Associated fine-tuned model : [nlpso/m0_flat_ner_ref_ptrn_cmbert_io](https://huggingface.co/nlpso/m0_flat_ner_ref_ptrn_cmbert_io)
## Entity types
Abbreviation|Description
-|-
O |Outside of a named entity
PER |Person or company name
ACT |Person or company professional activity
TITRE |Distinction
LOC |Street name
CARDINAL |Street number
FT |Geographical feature
## How to use this dataset
```python
from datasets import load_dataset
train_dev_test = load_dataset("nlpso/m0_qualitative_analysis_ref_ptrn_cmbert_io")
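# Inspect the resulting DatasetDict. The counts listed above suggest
# train/dev/test splits; print() shows the exact split names the repo publishes.
print(train_dev_test)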
```
 |
Maeda-miyazaki/dataset_750 | ---
license: cc-by-nc-3.0
---
|
chats-bug/red-pyjama-sample-1T-max-chunk-16k | ---
dataset_info:
features:
- name: text
dtype: string
- name: meta
dtype: string
splits:
- name: train
num_bytes: 5266104478.188356
num_examples: 924172
- name: test
num_bytes: 53198269.811643824
num_examples: 9336
download_size: 3092233105
dataset_size: 5319302748.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
---
|
CyberHarem/marina_akizuki_onichichi | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of Marina Akizuki
This is the dataset of Marina Akizuki, containing 300 images and their tags.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs)([huggingface organization](https://huggingface.co/deepghs)).
| Name | Images | Download | Description |
|:----------------|---------:|:----------------------------------------|:-----------------------------------------------------------------------------------------|
| raw | 300 | [Download](dataset-raw.zip) | Raw data with meta information. |
| raw-stage3 | 588 | [Download](dataset-raw-stage3.zip) | 3-stage cropped raw data with meta information. |
| raw-stage3-eyes | 730 | [Download](dataset-raw-stage3-eyes.zip) | 3-stage cropped (with eye-focus) raw data with meta information. |
| 384x512 | 300 | [Download](dataset-384x512.zip) | 384x512 aligned dataset. |
| 512x704 | 300 | [Download](dataset-512x704.zip) | 512x704 aligned dataset. |
| 640x880 | 300 | [Download](dataset-640x880.zip) | 640x880 aligned dataset. |
| stage3-640 | 588 | [Download](dataset-stage3-640.zip) | 3-stage cropped dataset with the shorter side not exceeding 640 pixels. |
| stage3-800 | 588 | [Download](dataset-stage3-800.zip) | 3-stage cropped dataset with the shorter side not exceeding 800 pixels. |
| stage3-p512-640 | 502 | [Download](dataset-stage3-p512-640.zip) | 3-stage cropped dataset with the area not less than 512x512 pixels. |
| stage3-eyes-640 | 730 | [Download](dataset-stage3-eyes-640.zip) | 3-stage cropped (with eye-focus) dataset with the shorter side not exceeding 640 pixels. |
| stage3-eyes-800 | 730 | [Download](dataset-stage3-eyes-800.zip) | 3-stage cropped (with eye-focus) dataset with the shorter side not exceeding 800 pixels. |
|
bengisucam/tr_dataset_combined | ---
license: apache-2.0
dataset_info:
features:
- name: Text
dtype: string
splits:
- name: train
num_bytes: 167603259
num_examples: 824809
download_size: 106342453
dataset_size: 167603259
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
language:
- tr
---
# Dataset Card for Dataset Name
<!-- Provide a quick summary of the dataset. -->
This dataset card aims to be a base template for new datasets. It has been generated using [this raw template](https://github.com/huggingface/huggingface_hub/blob/main/src/huggingface_hub/templates/datasetcard_template.md?plain=1).
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
This dataset is the combination of the datasets listed below:
- BDas/Turkish-Dataset
- turkish_product_reviews
- winvoker/turkish-sentiment-analysis-dataset
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
arthurmluz/xlsum_data-xlsum_gptextsum_results | ---
dataset_info:
features:
- name: id
dtype: string
- name: url
dtype: string
- name: title
dtype: string
- name: summary
dtype: string
- name: text
dtype: string
- name: gen_summary
dtype: string
- name: rouge
struct:
- name: rouge1
dtype: float64
- name: rouge2
dtype: float64
- name: rougeL
dtype: float64
- name: rougeLsum
dtype: float64
- name: bert
struct:
- name: f1
sequence: float64
- name: hashcode
dtype: string
- name: precision
sequence: float64
- name: recall
sequence: float64
splits:
- name: validation
num_bytes: 26244213
num_examples: 7175
download_size: 15951725
dataset_size: 26244213
configs:
- config_name: default
data_files:
- split: validation
path: data/validation-*
---
# Dataset Card for "wikilingua_data-xlsum_gptextsum_results"
rouge= {'rouge1': 0.3230756314331615, 'rouge2': 0.12295752023585965, 'rougeL': 0.23099240967982115, 'rougeLsum': 0.23099240967982115}
bert= {'precision': 0.7382304361929877, 'recall': 0.7454116297765061, 'f1': 0.7414375136205958} |
lilacai/lilac-Capybara | ---
tags:
- Lilac
---
# lilac/Capybara
This dataset is a [Lilac](http://lilacml.com) processed dataset. Original dataset: [https://huggingface.co/datasets/LDJnr/Capybara](https://huggingface.co/datasets/LDJnr/Capybara)
To download the dataset to a local directory:
```bash
lilac download lilacai/lilac-Capybara
```
or from python with:
```py
import lilac as ll

ll.download("lilacai/lilac-Capybara")
```
|
one-sec-cv12/chunk_105 | ---
dataset_info:
features:
- name: audio
dtype:
audio:
sampling_rate: 16000
splits:
- name: train
num_bytes: 18310590720.0
num_examples: 190640
download_size: 16453314083
dataset_size: 18310590720.0
---
# Dataset Card for "chunk_105"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
jaja7744/dolly-15k-cn | ---
license: apache-2.0
task_categories:
- text-generation
language:
- zh
pretty_name: d
size_categories:
- 10K<n<100K
--- |
irds/lotte_recreation_test_search | ---
pretty_name: '`lotte/recreation/test/search`'
viewer: false
source_datasets: ['irds/lotte_recreation_test']
task_categories:
- text-retrieval
---
# Dataset Card for `lotte/recreation/test/search`
The `lotte/recreation/test/search` dataset, provided by the [ir-datasets](https://ir-datasets.com/) package.
For more information about the dataset, see the [documentation](https://ir-datasets.com/lotte#lotte/recreation/test/search).
# Data
This dataset provides:
- `queries` (i.e., topics); count=924
- `qrels`: (relevance assessments); count=1,991
- For `docs`, use [`irds/lotte_recreation_test`](https://huggingface.co/datasets/irds/lotte_recreation_test)
## Usage
```python
from datasets import load_dataset
queries = load_dataset('irds/lotte_recreation_test_search', 'queries')
for record in queries:
record # {'query_id': ..., 'text': ...}
qrels = load_dataset('irds/lotte_recreation_test_search', 'qrels')
for record in qrels:
record # {'query_id': ..., 'doc_id': ..., 'relevance': ..., 'iteration': ...}
```
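To pair the qrels with their document texts, a minimal sketch (assuming the companion docs repo exposes a `docs` config with `doc_id` and `text` fields, as the ir-datasets card convention suggests):
```python
from datasets import load_dataset

# Documents live in the companion repository (see above)
docs = load_dataset('irds/lotte_recreation_test', 'docs')

# Build a doc_id -> text lookup to resolve qrels against passages;
# fine as a sketch, though memory-heavy for large corpora
doc_text = {d['doc_id']: d['text'] for d in docs}

qrels = load_dataset('irds/lotte_recreation_test_search', 'qrels')
for record in qrels:
    print(record['query_id'], doc_text[record['doc_id']][:80])
    break
```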
Note that calling `load_dataset` will download the dataset (or provide access instructions when it's not public) and make a copy of the
data in 🤗 Dataset format.
## Citation Information
```
@article{Santhanam2021ColBERTv2,
title = "ColBERTv2: Effective and Efficient Retrieval via Lightweight Late Interaction",
author = "Keshav Santhanam and Omar Khattab and Jon Saad-Falcon and Christopher Potts and Matei Zaharia",
journal= "arXiv preprint arXiv:2112.01488",
year = "2021",
url = "https://arxiv.org/abs/2112.01488"
}
```
|
AliEdalat/Persian_ChatBot_dataset_Fine_Tuning_Alpaca_Model | ---
license: apache-2.0
task_categories:
- text-generation
- conversational
language:
- fa
size_categories:
- 1K<n<10K
---
# Persian_ChatBot_dataset_Fine_Tuning_Alpaca_Model
A Persian chatbot dataset for fine-tuning LLaMA on instruction data (a preprocessed Alpaca dataset). [GitHub](https://github.com/AliEdalat/ChatBot_for_persian_LLaMA_fine_tune.git)
- We use the [preprocessed alpaca dataset](https://github.com/thisserand/alpaca-lora-finetune-language.git) as our base. We translate the no_translate data to Persian with [mt5](https://huggingface.co/persiannlp/mt5-large-parsinlu-translation_en_fa). (The [train dataset](https://huggingface.co/datasets/AliEdalat/Persian_ChatBot_dataset_Fine_Tuning_Alpaca_Model/tree/main) and [test data](https://huggingface.co/datasets/AliEdalat/Persian_ChatBot_dataset_Fine_Tuning_Alpaca_Model/tree/main), with 2k examples, are ready.)
- We use LLaMA as the generative model for the chatbot. We fine-tune the model on our Persian dataset and test it.
- To improve chatbot performance, replace "برای اینکه این کار را بکنم" with an empty string (""). |
Defetya/eval_open_llama_ru | ---
license: apache-2.0
---
|
SS3830/image-search-sa | ---
license: apache-2.0
dataset_info:
features:
- name: image
dtype: image
- name: label
dtype: string
splits:
- name: train
num_bytes: 28874455.946
num_examples: 2378
download_size: 24014632
dataset_size: 28874455.946
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
zolak/twitter_dataset_80_1713214788 | ---
dataset_info:
features:
- name: id
dtype: string
- name: tweet_content
dtype: string
- name: user_name
dtype: string
- name: user_id
dtype: string
- name: created_at
dtype: string
- name: url
dtype: string
- name: favourite_count
dtype: int64
- name: scraped_at
dtype: string
- name: image_urls
dtype: string
splits:
- name: train
num_bytes: 1420630
num_examples: 3501
download_size: 719400
dataset_size: 1420630
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
ikuldeep1/vehicle-damage-fraud-image-balanced | ---
dataset_info:
features:
- name: image
dtype: image
- name: label
dtype:
class_label:
names:
'0': '0'
'1': '1'
splits:
- name: train
num_bytes: 1716555477.626
num_examples: 12729
download_size: 1433374572
dataset_size: 1716555477.626
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
autoevaluate/autoeval-eval-futin__feed-sen_vi-894567-2175669982 | ---
type: predictions
tags:
- autotrain
- evaluation
datasets:
- futin/feed
eval_info:
task: text_zero_shot_classification
model: facebook/opt-2.7b
metrics: []
dataset_name: futin/feed
dataset_config: sen_vi
dataset_split: test
col_mapping:
text: text
classes: classes
target: target
---
# Dataset Card for AutoTrain Evaluator
This repository contains model predictions generated by [AutoTrain](https://huggingface.co/autotrain) for the following task and dataset:
* Task: Zero-Shot Text Classification
* Model: facebook/opt-2.7b
* Dataset: futin/feed
* Config: sen_vi
* Split: test
To run new evaluation jobs, visit Hugging Face's [automatic model evaluator](https://huggingface.co/spaces/autoevaluate/model-evaluator).
## Contributions
Thanks to [@futin](https://huggingface.co/futin) for evaluating this model. |
chathuranga-jayanath/context-5-rhino-finmath-times4j-html-mavendoxia-wro4j-guava-supercsv-len-10000-prompt-1 | ---
dataset_info:
features:
- name: id
dtype: int64
- name: filepath
dtype: string
- name: start_bug_line
dtype: int64
- name: end_bug_line
dtype: int64
- name: bug
dtype: string
- name: fix
dtype: string
- name: ctx
dtype: string
splits:
- name: train
num_bytes: 28232907
num_examples: 45517
- name: validation
num_bytes: 3535186
num_examples: 5689
- name: test
num_bytes: 3535341
num_examples: 5689
download_size: 14548547
dataset_size: 35303434
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
- split: test
path: data/test-*
---
|
TecnicaLLM/dolly-15k | ---
license: cc-by-sa-3.0
---
|
projectbaraat/kannada-Mathematical | ---
dataset_info:
features:
- name: input
dtype: string
- name: response
dtype: string
splits:
- name: train
num_bytes: 519631610
num_examples: 335690
download_size: 169797917
dataset_size: 519631610
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
Yevhenii1234/test | ---
license: apache-2.0
---
|
presencesw/dataset_2000_complexquestion_3 | ---
dataset_info:
features:
- name: entities
sequence: 'null'
- name: triplets
sequence: 'null'
- name: answer
dtype: string
- name: complex_question
dtype: string
splits:
- name: train
num_bytes: 17911
num_examples: 200
download_size: 0
dataset_size: 17911
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "dataset_2000_complexquestion_3"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Bluepelt/Idkwhattodomate | ---
license: mit
---
|
connorhoehn/trading_card_display_classification_1_5k_v3 | ---
dataset_info:
features:
- name: image
dtype: image
- name: label
dtype:
class_label:
names:
'0': grid
'1': solo
'2': spread
'3': stack
splits:
- name: train
num_bytes: 1127230775.103
num_examples: 1249
- name: test
num_bytes: 155934991.0
num_examples: 307
download_size: 1317201819
dataset_size: 1283165766.103
---
# Dataset Card for "trading_card_display_classification_1_5k_v3"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
freshpearYoon/v3_train_free_concat_20 | ---
dataset_info:
features:
- name: input_features
sequence:
sequence: float32
- name: labels
sequence: int64
splits:
- name: train
num_bytes: 3842535816
num_examples: 2500
download_size: 1826001643
dataset_size: 3842535816
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
Falah/arabic_islamic_fashion_prompts_SDXL | ---
dataset_info:
features:
- name: prompts
dtype: string
splits:
- name: train
num_bytes: 239684574
num_examples: 1000000
download_size: 27020301
dataset_size: 239684574
---
# Dataset Card for "arabic_islamic_fashion_prompts_SDXL"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Indic-Benchmark/nepali-arc-c-2.5k | ---
dataset_info:
features:
- name: id
dtype: string
- name: question
struct:
- name: choices
list:
- name: label
dtype: string
- name: text
dtype: string
- name: stem
dtype: string
- name: answerKey
dtype: string
splits:
- name: train
num_bytes: 1841172
num_examples: 2584
download_size: 706909
dataset_size: 1841172
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
dot-ammar/AR-dotted-mediumPlus | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
dataset_info:
features:
- name: clean
dtype: string
splits:
- name: train
num_bytes: 387187864
num_examples: 1625508
download_size: 214233397
dataset_size: 387187864
---
# Dataset Card for "AR-dotted-mediumPlus-arrow"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Thermostatic/parallel_corpus_webcrawl_english_spanish_1 | ---
license: cc-by-4.0
task_categories:
- translation
language:
- en
- es
tags:
- English
- Spanish
- Parallel corpus
size_categories:
- 10K<n<100K
---
# Dataset Card for Dataset Name
<!-- Provide a quick summary of the dataset. -->
This parallel corpus dataset contains about 21k rows of parallel English and Spanish texts obtained by crawling different websites. It has been strictly filtered.
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
This is a parallel corpus of bilingual texts crawled from multilingual websites, which contains 21,005 TUs. A strict validation process has been followed, which resulted in discarding:
- TUs from crawled websites that do not comply with the PSI directive,
- TUs with more than 99% misspelled tokens,
- TUs identified during the manual validation process and all the TUs from websites whose error rate in the sample extracted for manual validation is strictly above the following thresholds: 50% of TUs with language identification errors, 50% of TUs with alignment errors, 50% of TUs with tokenization errors, 20% of TUs identified as machine translated content, 50% of TUs with translation errors.
- **Period of crawling:** 15/11/2016 - 23/01/2017 (DD/MM/YY).
- **Curated by:** Directorate-General for Communications Networks, Content and Technology.
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** English & Spanish
- **License:** cc-by-4.0
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** http://data.europa.eu/88u/dataset/elrc_339
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
This dataset is well suited to training machine translation systems.
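For example, a minimal loading sketch (the split and column names are not documented in this card, so the code only inspects what the repo publishes):
```python
from datasets import load_dataset

ds = load_dataset("Thermostatic/parallel_corpus_webcrawl_english_spanish_1")
print(ds)  # shows the actual split names and column schema

# Peek at a few translation units from the first available split
first_split = next(iter(ds.values()))
for row in first_split.select(range(3)):
    print(row)
```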
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
autoevaluate/autoeval-staging-eval-project-0b0f26eb-7664951 | ---
type: predictions
tags:
- autotrain
- evaluation
datasets:
- lener_br
eval_info:
task: entity_extraction
model: Luciano/bertimbau-large-lener_br
metrics: []
dataset_name: lener_br
dataset_config: lener_br
dataset_split: test
col_mapping:
tokens: tokens
tags: ner_tags
---
# Dataset Card for AutoTrain Evaluator
This repository contains model predictions generated by [AutoTrain](https://huggingface.co/autotrain) for the following task and dataset:
* Task: Token Classification
* Model: Luciano/bertimbau-large-lener_br
* Dataset: lener_br
To run new evaluation jobs, visit Hugging Face's [automatic model evaluator](https://huggingface.co/spaces/autoevaluate/model-evaluator).
## Contributions
Thanks to [@lewtun](https://huggingface.co/lewtun) for evaluating this model. |
tasksource/sherliic | ---
dataset_info:
features:
- name: premise
dtype: string
- name: hypothesis
dtype: string
- name: label
dtype: string
splits:
- name: train
num_bytes: 106784
num_examples: 996
- name: test
num_bytes: 322932
num_examples: 2989
download_size: 146406
dataset_size: 429716
language:
- en
---
# Dataset Card for "sherliic"
https://github.com/mnschmit/SherLIiC
```
@inproceedings{schmitt2019sherliic,
title = "{S}her{LI}i{C}: A Typed Event-Focused Lexical Inference Benchmark for Evaluating Natural Language Inference",
author = {Schmitt, Martin and
Sch{\"u}tze, Hinrich},
booktitle = "Proceedings of the 57th Conference of the Association for Computational Linguistics",
month = jul,
year = "2019",
address = "Florence, Italy",
publisher = "Association for Computational Linguistics",
url = "https://www.aclweb.org/anthology/P19-1086",
pages = "902--914"
}
``` |