| datasetId | card |
|---|---|
ZelaAI/librispeech_clean_360_2048 | ---
dataset_info:
features:
- name: text_tokens
sequence: int64
- name: audio_tokens_1
sequence: int64
- name: audio_tokens_2
sequence: int64
splits:
- name: train
num_bytes: 2463335616
num_examples: 64598
download_size: 250715012
dataset_size: 2463335616
---
# Dataset Card for "librispeech_clean_360_2048"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
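A minimal loading sketch for the card above, assuming the repository is public and exposes the single default config listed in `dataset_info`; the field names come straight from that block.

```python
from datasets import load_dataset

# Load the tokenized LibriSpeech clean-360 split described above
# (assumes the repository is public and uses the default config).
ds = load_dataset("ZelaAI/librispeech_clean_360_2048", split="train")

print(ds.features)  # text_tokens, audio_tokens_1, audio_tokens_2 (int64 sequences)
example = ds[0]
print(len(example["text_tokens"]), len(example["audio_tokens_1"]))
```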
CVCODE/big5_essays_ru | ---
dataset_info:
features:
- name: '#AUTHID'
dtype: string
- name: TEXT
dtype: string
- name: extroversion
dtype: int64
- name: neuroticism
dtype: int64
- name: agreeableness
dtype: int64
- name: conscientiousness
dtype: int64
- name: openness
dtype: int64
- name: TEXT_RU
dtype: string
splits:
- name: train
num_bytes: 21522844
num_examples: 2467
download_size: 11117157
dataset_size: 21522844
---
# Dataset Card for "big5_essays_ru"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
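A short, hedged example of inspecting the trait labels above; the 0/1 encoding of the five trait columns is an assumption suggested by the int64 dtype, not something the card states.

```python
from collections import Counter

from datasets import load_dataset

# Load the Russian-translated Big Five essays (assumes the repository is public).
ds = load_dataset("CVCODE/big5_essays_ru", split="train")

# Distribution of one trait label; the 0/1 encoding is an assumption based on
# the int64 dtype in dataset_info, not something the card documents.
print(Counter(ds["extroversion"]))
print(ds[0]["TEXT_RU"][:200])  # first 200 characters of a translated essay
```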
laion/CLIP-ViT-H-14-laion2B-s32B-b79K-all-checkpoints | ---
license: mit
---
This repository contains the intermediate checkpoints for the model https://huggingface.co/laion/CLIP-ViT-H-14-laion2B-s32B-b79K.
Each "epoch" corresponds to an additional 32B/256 samples seen.
The purpose of releasing these checkpoints and optimizer states is to enable analysis.
For the first 121 "epochs", training was done with float16 mixed precision before switching to bfloat16 after a loss blow-up. |
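A hedged sketch of pulling one intermediate checkpoint for analysis with `huggingface_hub`; the exact checkpoint filenames in the repo are not listed in this card, so they are discovered at runtime rather than hard-coded, and the ".pt" suffix is an assumption.

```python
import torch
from huggingface_hub import hf_hub_download, list_repo_files

repo_id = "laion/CLIP-ViT-H-14-laion2B-s32B-b79K-all-checkpoints"

# List the repo contents and pick something that looks like a checkpoint;
# the ".pt" naming is an assumption, so inspect `files` yourself before
# downloading multi-gigabyte blobs (some may also contain optimizer states).
files = list_repo_files(repo_id)
checkpoints = sorted(f for f in files if f.endswith(".pt"))
print(checkpoints[:5])

path = hf_hub_download(repo_id=repo_id, filename=checkpoints[0])
state = torch.load(path, map_location="cpu")
print(type(state))
```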
svenschultze/test | ---
dataset_info:
features:
- name: INSTRUCTION
dtype: string
- name: RESPONSE
dtype: string
- name: SOURCE
dtype: string
splits:
- name: train
num_bytes: 47780
num_examples: 1000
download_size: 14311
dataset_size: 47780
---
# Dataset Card for "test"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
camilaslz/josi | ---
license: openrail
---
|
ruliad/factual-expert-processed-v2_subsample_5pct | ---
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 894780156.7510012
num_examples: 320390
download_size: 534418575
dataset_size: 894780156.7510012
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
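The card above lists a single text column and a roughly 534 MB download; a hedged way to peek at it without materializing the full split is streaming mode.

```python
from itertools import islice

from datasets import load_dataset

# Stream the 5% subsample instead of downloading the full ~534 MB split up front.
ds = load_dataset(
    "ruliad/factual-expert-processed-v2_subsample_5pct",
    split="train",
    streaming=True,
)

for example in islice(ds, 3):
    print(example["text"][:200])
```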
open-llm-leaderboard/details_NickyNicky__Mistral-7B-OpenOrca-oasst_top1_2023-08-25-v3 | ---
pretty_name: Evaluation run of NickyNicky/Mistral-7B-OpenOrca-oasst_top1_2023-08-25-v3
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [NickyNicky/Mistral-7B-OpenOrca-oasst_top1_2023-08-25-v3](https://huggingface.co/NickyNicky/Mistral-7B-OpenOrca-oasst_top1_2023-08-25-v3)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_NickyNicky__Mistral-7B-OpenOrca-oasst_top1_2023-08-25-v3\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-12-04T16:32:38.493956](https://huggingface.co/datasets/open-llm-leaderboard/details_NickyNicky__Mistral-7B-OpenOrca-oasst_top1_2023-08-25-v3/blob/main/results_2023-12-04T16-32-38.493956.json) (note\
\ that there might be results for other tasks in the repo if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.613382583651225,\n\
\ \"acc_stderr\": 0.03275222858972369,\n \"acc_norm\": 0.6188052812995584,\n\
\ \"acc_norm_stderr\": 0.03341845889809012,\n \"mc1\": 0.3219094247246022,\n\
\ \"mc1_stderr\": 0.016355567611960397,\n \"mc2\": 0.48209769794385393,\n\
\ \"mc2_stderr\": 0.014975113215989893\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.5665529010238908,\n \"acc_stderr\": 0.014481376224558902,\n\
\ \"acc_norm\": 0.60580204778157,\n \"acc_norm_stderr\": 0.01428052266746732\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6404102768372834,\n\
\ \"acc_stderr\": 0.004788994060654277,\n \"acc_norm\": 0.8333997211710814,\n\
\ \"acc_norm_stderr\": 0.003718570792719566\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.27,\n \"acc_stderr\": 0.044619604333847394,\n \
\ \"acc_norm\": 0.27,\n \"acc_norm_stderr\": 0.044619604333847394\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6,\n \
\ \"acc_stderr\": 0.042320736951515885,\n \"acc_norm\": 0.6,\n \
\ \"acc_norm_stderr\": 0.042320736951515885\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.6644736842105263,\n \"acc_stderr\": 0.038424985593952694,\n\
\ \"acc_norm\": 0.6644736842105263,\n \"acc_norm_stderr\": 0.038424985593952694\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.56,\n\
\ \"acc_stderr\": 0.04988876515698589,\n \"acc_norm\": 0.56,\n \
\ \"acc_norm_stderr\": 0.04988876515698589\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.6452830188679245,\n \"acc_stderr\": 0.02944517532819959,\n\
\ \"acc_norm\": 0.6452830188679245,\n \"acc_norm_stderr\": 0.02944517532819959\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7083333333333334,\n\
\ \"acc_stderr\": 0.038009680605548594,\n \"acc_norm\": 0.7083333333333334,\n\
\ \"acc_norm_stderr\": 0.038009680605548594\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.47,\n \"acc_stderr\": 0.05016135580465919,\n \
\ \"acc_norm\": 0.47,\n \"acc_norm_stderr\": 0.05016135580465919\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.52,\n \"acc_stderr\": 0.050211673156867795,\n \"acc_norm\": 0.52,\n\
\ \"acc_norm_stderr\": 0.050211673156867795\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.35,\n \"acc_stderr\": 0.0479372485441102,\n \
\ \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.0479372485441102\n },\n\
\ \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6069364161849711,\n\
\ \"acc_stderr\": 0.03724249595817731,\n \"acc_norm\": 0.6069364161849711,\n\
\ \"acc_norm_stderr\": 0.03724249595817731\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.30392156862745096,\n \"acc_stderr\": 0.045766654032077636,\n\
\ \"acc_norm\": 0.30392156862745096,\n \"acc_norm_stderr\": 0.045766654032077636\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.77,\n \"acc_stderr\": 0.04229525846816506,\n \"acc_norm\": 0.77,\n\
\ \"acc_norm_stderr\": 0.04229525846816506\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.5234042553191489,\n \"acc_stderr\": 0.03265019475033582,\n\
\ \"acc_norm\": 0.5234042553191489,\n \"acc_norm_stderr\": 0.03265019475033582\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.43859649122807015,\n\
\ \"acc_stderr\": 0.04668000738510455,\n \"acc_norm\": 0.43859649122807015,\n\
\ \"acc_norm_stderr\": 0.04668000738510455\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5379310344827586,\n \"acc_stderr\": 0.04154659671707548,\n\
\ \"acc_norm\": 0.5379310344827586,\n \"acc_norm_stderr\": 0.04154659671707548\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.42063492063492064,\n \"acc_stderr\": 0.025424835086924,\n \"acc_norm\"\
: 0.42063492063492064,\n \"acc_norm_stderr\": 0.025424835086924\n },\n\
\ \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.3968253968253968,\n\
\ \"acc_stderr\": 0.043758884927270605,\n \"acc_norm\": 0.3968253968253968,\n\
\ \"acc_norm_stderr\": 0.043758884927270605\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \
\ \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7193548387096774,\n\
\ \"acc_stderr\": 0.025560604721022884,\n \"acc_norm\": 0.7193548387096774,\n\
\ \"acc_norm_stderr\": 0.025560604721022884\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.5123152709359606,\n \"acc_stderr\": 0.035169204442208966,\n\
\ \"acc_norm\": 0.5123152709359606,\n \"acc_norm_stderr\": 0.035169204442208966\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.71,\n \"acc_stderr\": 0.04560480215720684,\n \"acc_norm\"\
: 0.71,\n \"acc_norm_stderr\": 0.04560480215720684\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7272727272727273,\n \"acc_stderr\": 0.0347769116216366,\n\
\ \"acc_norm\": 0.7272727272727273,\n \"acc_norm_stderr\": 0.0347769116216366\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.7777777777777778,\n \"acc_stderr\": 0.02962022787479049,\n \"\
acc_norm\": 0.7777777777777778,\n \"acc_norm_stderr\": 0.02962022787479049\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.8652849740932642,\n \"acc_stderr\": 0.02463978909770944,\n\
\ \"acc_norm\": 0.8652849740932642,\n \"acc_norm_stderr\": 0.02463978909770944\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.5923076923076923,\n \"acc_stderr\": 0.02491524398598785,\n \
\ \"acc_norm\": 0.5923076923076923,\n \"acc_norm_stderr\": 0.02491524398598785\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.31851851851851853,\n \"acc_stderr\": 0.028406533090608452,\n \
\ \"acc_norm\": 0.31851851851851853,\n \"acc_norm_stderr\": 0.028406533090608452\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.6134453781512605,\n \"acc_stderr\": 0.03163145807552378,\n \
\ \"acc_norm\": 0.6134453781512605,\n \"acc_norm_stderr\": 0.03163145807552378\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.40397350993377484,\n \"acc_stderr\": 0.040064856853653415,\n \"\
acc_norm\": 0.40397350993377484,\n \"acc_norm_stderr\": 0.040064856853653415\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8146788990825689,\n \"acc_stderr\": 0.016659279700295845,\n \"\
acc_norm\": 0.8146788990825689,\n \"acc_norm_stderr\": 0.016659279700295845\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.5138888888888888,\n \"acc_stderr\": 0.03408655867977749,\n \"\
acc_norm\": 0.5138888888888888,\n \"acc_norm_stderr\": 0.03408655867977749\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.8088235294117647,\n \"acc_stderr\": 0.027599174300640763,\n \"\
acc_norm\": 0.8088235294117647,\n \"acc_norm_stderr\": 0.027599174300640763\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.7932489451476793,\n \"acc_stderr\": 0.0263616516683891,\n \
\ \"acc_norm\": 0.7932489451476793,\n \"acc_norm_stderr\": 0.0263616516683891\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6591928251121076,\n\
\ \"acc_stderr\": 0.0318114974705536,\n \"acc_norm\": 0.6591928251121076,\n\
\ \"acc_norm_stderr\": 0.0318114974705536\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.7709923664122137,\n \"acc_stderr\": 0.036853466317118506,\n\
\ \"acc_norm\": 0.7709923664122137,\n \"acc_norm_stderr\": 0.036853466317118506\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.768595041322314,\n \"acc_stderr\": 0.03849856098794088,\n \"acc_norm\"\
: 0.768595041322314,\n \"acc_norm_stderr\": 0.03849856098794088\n },\n\
\ \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7129629629629629,\n\
\ \"acc_stderr\": 0.043733130409147614,\n \"acc_norm\": 0.7129629629629629,\n\
\ \"acc_norm_stderr\": 0.043733130409147614\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.6625766871165644,\n \"acc_stderr\": 0.03714908409935573,\n\
\ \"acc_norm\": 0.6625766871165644,\n \"acc_norm_stderr\": 0.03714908409935573\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.4642857142857143,\n\
\ \"acc_stderr\": 0.04733667890053756,\n \"acc_norm\": 0.4642857142857143,\n\
\ \"acc_norm_stderr\": 0.04733667890053756\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7669902912621359,\n \"acc_stderr\": 0.04185832598928315,\n\
\ \"acc_norm\": 0.7669902912621359,\n \"acc_norm_stderr\": 0.04185832598928315\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8504273504273504,\n\
\ \"acc_stderr\": 0.023365051491753715,\n \"acc_norm\": 0.8504273504273504,\n\
\ \"acc_norm_stderr\": 0.023365051491753715\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.73,\n \"acc_stderr\": 0.044619604333847394,\n \
\ \"acc_norm\": 0.73,\n \"acc_norm_stderr\": 0.044619604333847394\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8071519795657727,\n\
\ \"acc_stderr\": 0.014108533515757431,\n \"acc_norm\": 0.8071519795657727,\n\
\ \"acc_norm_stderr\": 0.014108533515757431\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.661849710982659,\n \"acc_stderr\": 0.025469770149400175,\n\
\ \"acc_norm\": 0.661849710982659,\n \"acc_norm_stderr\": 0.025469770149400175\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.3016759776536313,\n\
\ \"acc_stderr\": 0.015350767572220286,\n \"acc_norm\": 0.3016759776536313,\n\
\ \"acc_norm_stderr\": 0.015350767572220286\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.6633986928104575,\n \"acc_stderr\": 0.027057974624494382,\n\
\ \"acc_norm\": 0.6633986928104575,\n \"acc_norm_stderr\": 0.027057974624494382\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6881028938906752,\n\
\ \"acc_stderr\": 0.02631185807185416,\n \"acc_norm\": 0.6881028938906752,\n\
\ \"acc_norm_stderr\": 0.02631185807185416\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.6975308641975309,\n \"acc_stderr\": 0.025557653981868052,\n\
\ \"acc_norm\": 0.6975308641975309,\n \"acc_norm_stderr\": 0.025557653981868052\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.4716312056737589,\n \"acc_stderr\": 0.029779450957303062,\n \
\ \"acc_norm\": 0.4716312056737589,\n \"acc_norm_stderr\": 0.029779450957303062\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.45436766623207303,\n\
\ \"acc_stderr\": 0.012716941720734802,\n \"acc_norm\": 0.45436766623207303,\n\
\ \"acc_norm_stderr\": 0.012716941720734802\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.6360294117647058,\n \"acc_stderr\": 0.029227192460032025,\n\
\ \"acc_norm\": 0.6360294117647058,\n \"acc_norm_stderr\": 0.029227192460032025\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6519607843137255,\n \"acc_stderr\": 0.019270998708223977,\n \
\ \"acc_norm\": 0.6519607843137255,\n \"acc_norm_stderr\": 0.019270998708223977\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6818181818181818,\n\
\ \"acc_stderr\": 0.044612721759105085,\n \"acc_norm\": 0.6818181818181818,\n\
\ \"acc_norm_stderr\": 0.044612721759105085\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.6979591836734694,\n \"acc_stderr\": 0.0293936093198798,\n\
\ \"acc_norm\": 0.6979591836734694,\n \"acc_norm_stderr\": 0.0293936093198798\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8109452736318408,\n\
\ \"acc_stderr\": 0.027686913588013024,\n \"acc_norm\": 0.8109452736318408,\n\
\ \"acc_norm_stderr\": 0.027686913588013024\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.85,\n \"acc_stderr\": 0.0358870281282637,\n \
\ \"acc_norm\": 0.85,\n \"acc_norm_stderr\": 0.0358870281282637\n },\n\
\ \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5180722891566265,\n\
\ \"acc_stderr\": 0.03889951252827216,\n \"acc_norm\": 0.5180722891566265,\n\
\ \"acc_norm_stderr\": 0.03889951252827216\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8245614035087719,\n \"acc_stderr\": 0.029170885500727665,\n\
\ \"acc_norm\": 0.8245614035087719,\n \"acc_norm_stderr\": 0.029170885500727665\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.3219094247246022,\n\
\ \"mc1_stderr\": 0.016355567611960397,\n \"mc2\": 0.48209769794385393,\n\
\ \"mc2_stderr\": 0.014975113215989893\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7774269928966061,\n \"acc_stderr\": 0.011690933809712669\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.3616376042456406,\n \
\ \"acc_stderr\": 0.013234658351088774\n }\n}\n```"
repo_url: https://huggingface.co/NickyNicky/Mistral-7B-OpenOrca-oasst_top1_2023-08-25-v3
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_12_04T16_32_38.493956
path:
- '**/details_harness|arc:challenge|25_2023-12-04T16-32-38.493956.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-12-04T16-32-38.493956.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_12_04T16_32_38.493956
path:
- '**/details_harness|gsm8k|5_2023-12-04T16-32-38.493956.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-12-04T16-32-38.493956.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_12_04T16_32_38.493956
path:
- '**/details_harness|hellaswag|10_2023-12-04T16-32-38.493956.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-12-04T16-32-38.493956.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_12_04T16_32_38.493956
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-04T16-32-38.493956.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-04T16-32-38.493956.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-04T16-32-38.493956.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-04T16-32-38.493956.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-04T16-32-38.493956.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-04T16-32-38.493956.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-04T16-32-38.493956.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-04T16-32-38.493956.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-04T16-32-38.493956.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-04T16-32-38.493956.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-04T16-32-38.493956.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-04T16-32-38.493956.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-04T16-32-38.493956.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-04T16-32-38.493956.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-04T16-32-38.493956.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-04T16-32-38.493956.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-04T16-32-38.493956.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-04T16-32-38.493956.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-04T16-32-38.493956.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-04T16-32-38.493956.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-04T16-32-38.493956.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-04T16-32-38.493956.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-04T16-32-38.493956.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-04T16-32-38.493956.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-04T16-32-38.493956.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-04T16-32-38.493956.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-04T16-32-38.493956.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-04T16-32-38.493956.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-04T16-32-38.493956.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-04T16-32-38.493956.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-04T16-32-38.493956.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-04T16-32-38.493956.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-04T16-32-38.493956.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-04T16-32-38.493956.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-12-04T16-32-38.493956.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-04T16-32-38.493956.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-04T16-32-38.493956.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-04T16-32-38.493956.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-12-04T16-32-38.493956.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-12-04T16-32-38.493956.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-04T16-32-38.493956.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-04T16-32-38.493956.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-04T16-32-38.493956.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-04T16-32-38.493956.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-04T16-32-38.493956.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-04T16-32-38.493956.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-04T16-32-38.493956.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-04T16-32-38.493956.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-04T16-32-38.493956.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-04T16-32-38.493956.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-04T16-32-38.493956.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-04T16-32-38.493956.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-04T16-32-38.493956.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-12-04T16-32-38.493956.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-04T16-32-38.493956.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-12-04T16-32-38.493956.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-04T16-32-38.493956.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-04T16-32-38.493956.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-04T16-32-38.493956.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-04T16-32-38.493956.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-04T16-32-38.493956.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-04T16-32-38.493956.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-04T16-32-38.493956.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-04T16-32-38.493956.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-04T16-32-38.493956.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-04T16-32-38.493956.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-04T16-32-38.493956.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-04T16-32-38.493956.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-04T16-32-38.493956.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-04T16-32-38.493956.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-04T16-32-38.493956.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-04T16-32-38.493956.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-04T16-32-38.493956.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-04T16-32-38.493956.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-04T16-32-38.493956.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-04T16-32-38.493956.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-04T16-32-38.493956.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-04T16-32-38.493956.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-04T16-32-38.493956.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-04T16-32-38.493956.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-04T16-32-38.493956.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-04T16-32-38.493956.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-04T16-32-38.493956.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-04T16-32-38.493956.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-04T16-32-38.493956.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-04T16-32-38.493956.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-04T16-32-38.493956.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-04T16-32-38.493956.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-04T16-32-38.493956.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-04T16-32-38.493956.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-04T16-32-38.493956.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-12-04T16-32-38.493956.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-04T16-32-38.493956.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-04T16-32-38.493956.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-04T16-32-38.493956.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-12-04T16-32-38.493956.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-12-04T16-32-38.493956.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-04T16-32-38.493956.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-04T16-32-38.493956.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-04T16-32-38.493956.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-04T16-32-38.493956.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-04T16-32-38.493956.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-04T16-32-38.493956.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-04T16-32-38.493956.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-04T16-32-38.493956.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-04T16-32-38.493956.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-04T16-32-38.493956.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-04T16-32-38.493956.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-04T16-32-38.493956.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-04T16-32-38.493956.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-12-04T16-32-38.493956.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-04T16-32-38.493956.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-12-04T16-32-38.493956.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-04T16-32-38.493956.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_12_04T16_32_38.493956
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-04T16-32-38.493956.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-04T16-32-38.493956.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_12_04T16_32_38.493956
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-04T16-32-38.493956.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-04T16-32-38.493956.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_12_04T16_32_38.493956
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-04T16-32-38.493956.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-04T16-32-38.493956.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_12_04T16_32_38.493956
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-04T16-32-38.493956.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-04T16-32-38.493956.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_12_04T16_32_38.493956
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-04T16-32-38.493956.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-04T16-32-38.493956.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_12_04T16_32_38.493956
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-04T16-32-38.493956.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-04T16-32-38.493956.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_12_04T16_32_38.493956
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-04T16-32-38.493956.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-04T16-32-38.493956.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_12_04T16_32_38.493956
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-04T16-32-38.493956.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-04T16-32-38.493956.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_12_04T16_32_38.493956
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-04T16-32-38.493956.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-04T16-32-38.493956.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_12_04T16_32_38.493956
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-04T16-32-38.493956.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-04T16-32-38.493956.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_12_04T16_32_38.493956
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-04T16-32-38.493956.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-04T16-32-38.493956.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_12_04T16_32_38.493956
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-04T16-32-38.493956.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-04T16-32-38.493956.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_12_04T16_32_38.493956
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-04T16-32-38.493956.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-04T16-32-38.493956.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_12_04T16_32_38.493956
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-04T16-32-38.493956.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-04T16-32-38.493956.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_12_04T16_32_38.493956
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-04T16-32-38.493956.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-04T16-32-38.493956.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_12_04T16_32_38.493956
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-04T16-32-38.493956.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-04T16-32-38.493956.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_12_04T16_32_38.493956
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-04T16-32-38.493956.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-04T16-32-38.493956.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_12_04T16_32_38.493956
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-04T16-32-38.493956.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-04T16-32-38.493956.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_12_04T16_32_38.493956
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-04T16-32-38.493956.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-04T16-32-38.493956.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_12_04T16_32_38.493956
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-04T16-32-38.493956.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-04T16-32-38.493956.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_12_04T16_32_38.493956
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-04T16-32-38.493956.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-04T16-32-38.493956.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_12_04T16_32_38.493956
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-04T16-32-38.493956.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-04T16-32-38.493956.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_12_04T16_32_38.493956
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-04T16-32-38.493956.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-04T16-32-38.493956.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_12_04T16_32_38.493956
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-04T16-32-38.493956.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-04T16-32-38.493956.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_12_04T16_32_38.493956
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-04T16-32-38.493956.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-04T16-32-38.493956.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_12_04T16_32_38.493956
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-04T16-32-38.493956.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-04T16-32-38.493956.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_12_04T16_32_38.493956
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-04T16-32-38.493956.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-04T16-32-38.493956.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_12_04T16_32_38.493956
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-04T16-32-38.493956.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-04T16-32-38.493956.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_12_04T16_32_38.493956
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-04T16-32-38.493956.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-04T16-32-38.493956.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_12_04T16_32_38.493956
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-04T16-32-38.493956.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-04T16-32-38.493956.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_12_04T16_32_38.493956
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-04T16-32-38.493956.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-04T16-32-38.493956.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_12_04T16_32_38.493956
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-04T16-32-38.493956.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-04T16-32-38.493956.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_12_04T16_32_38.493956
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-04T16-32-38.493956.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-04T16-32-38.493956.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_12_04T16_32_38.493956
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-04T16-32-38.493956.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-04T16-32-38.493956.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_12_04T16_32_38.493956
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-12-04T16-32-38.493956.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-12-04T16-32-38.493956.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_12_04T16_32_38.493956
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-04T16-32-38.493956.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-04T16-32-38.493956.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_12_04T16_32_38.493956
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-04T16-32-38.493956.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-04T16-32-38.493956.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_12_04T16_32_38.493956
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-04T16-32-38.493956.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-04T16-32-38.493956.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_12_04T16_32_38.493956
path:
- '**/details_harness|hendrycksTest-management|5_2023-12-04T16-32-38.493956.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-12-04T16-32-38.493956.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_12_04T16_32_38.493956
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-12-04T16-32-38.493956.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-12-04T16-32-38.493956.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_12_04T16_32_38.493956
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-04T16-32-38.493956.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-04T16-32-38.493956.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_12_04T16_32_38.493956
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-04T16-32-38.493956.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-04T16-32-38.493956.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_12_04T16_32_38.493956
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-04T16-32-38.493956.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-04T16-32-38.493956.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_12_04T16_32_38.493956
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-04T16-32-38.493956.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-04T16-32-38.493956.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_12_04T16_32_38.493956
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-04T16-32-38.493956.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-04T16-32-38.493956.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_12_04T16_32_38.493956
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-04T16-32-38.493956.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-04T16-32-38.493956.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_12_04T16_32_38.493956
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-04T16-32-38.493956.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-04T16-32-38.493956.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_12_04T16_32_38.493956
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-04T16-32-38.493956.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-04T16-32-38.493956.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_12_04T16_32_38.493956
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-04T16-32-38.493956.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-04T16-32-38.493956.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_12_04T16_32_38.493956
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-04T16-32-38.493956.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-04T16-32-38.493956.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_12_04T16_32_38.493956
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-04T16-32-38.493956.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-04T16-32-38.493956.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_12_04T16_32_38.493956
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-04T16-32-38.493956.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-04T16-32-38.493956.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_12_04T16_32_38.493956
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-04T16-32-38.493956.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-04T16-32-38.493956.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_12_04T16_32_38.493956
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-12-04T16-32-38.493956.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-12-04T16-32-38.493956.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_12_04T16_32_38.493956
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-04T16-32-38.493956.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-04T16-32-38.493956.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_12_04T16_32_38.493956
path:
- '**/details_harness|hendrycksTest-virology|5_2023-12-04T16-32-38.493956.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-12-04T16-32-38.493956.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_12_04T16_32_38.493956
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-04T16-32-38.493956.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-04T16-32-38.493956.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_12_04T16_32_38.493956
path:
- '**/details_harness|truthfulqa:mc|0_2023-12-04T16-32-38.493956.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-12-04T16-32-38.493956.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_12_04T16_32_38.493956
path:
- '**/details_harness|winogrande|5_2023-12-04T16-32-38.493956.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-12-04T16-32-38.493956.parquet'
- config_name: results
data_files:
- split: 2023_12_04T16_32_38.493956
path:
- results_2023-12-04T16-32-38.493956.parquet
- split: latest
path:
- results_2023-12-04T16-32-38.493956.parquet
---
# Dataset Card for Evaluation run of NickyNicky/Mistral-7B-OpenOrca-oasst_top1_2023-08-25-v3
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/NickyNicky/Mistral-7B-OpenOrca-oasst_top1_2023-08-25-v3
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [NickyNicky/Mistral-7B-OpenOrca-oasst_top1_2023-08-25-v3](https://huggingface.co/NickyNicky/Mistral-7B-OpenOrca-oasst_top1_2023-08-25-v3) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can, for instance, do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_NickyNicky__Mistral-7B-OpenOrca-oasst_top1_2023-08-25-v3",
"harness_winogrande_5",
split="train")
```
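The aggregated metrics live in the "results" config mentioned above; a hedged sketch for loading them follows (the exact column layout of that config is not documented in this card, so the snippet only inspects what comes back).

```python
from datasets import load_dataset

# Both the "results" config and the "latest" split appear in this card's configs section.
results = load_dataset(
    "open-llm-leaderboard/details_NickyNicky__Mistral-7B-OpenOrca-oasst_top1_2023-08-25-v3",
    "results",
    split="latest",
)

print(results.column_names)  # inspect the layout before relying on specific fields
print(results[0])
```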
## Latest results
These are the [latest results from run 2023-12-04T16:32:38.493956](https://huggingface.co/datasets/open-llm-leaderboard/details_NickyNicky__Mistral-7B-OpenOrca-oasst_top1_2023-08-25-v3/blob/main/results_2023-12-04T16-32-38.493956.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.613382583651225,
"acc_stderr": 0.03275222858972369,
"acc_norm": 0.6188052812995584,
"acc_norm_stderr": 0.03341845889809012,
"mc1": 0.3219094247246022,
"mc1_stderr": 0.016355567611960397,
"mc2": 0.48209769794385393,
"mc2_stderr": 0.014975113215989893
},
"harness|arc:challenge|25": {
"acc": 0.5665529010238908,
"acc_stderr": 0.014481376224558902,
"acc_norm": 0.60580204778157,
"acc_norm_stderr": 0.01428052266746732
},
"harness|hellaswag|10": {
"acc": 0.6404102768372834,
"acc_stderr": 0.004788994060654277,
"acc_norm": 0.8333997211710814,
"acc_norm_stderr": 0.003718570792719566
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.27,
"acc_stderr": 0.044619604333847394,
"acc_norm": 0.27,
"acc_norm_stderr": 0.044619604333847394
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6,
"acc_stderr": 0.042320736951515885,
"acc_norm": 0.6,
"acc_norm_stderr": 0.042320736951515885
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6644736842105263,
"acc_stderr": 0.038424985593952694,
"acc_norm": 0.6644736842105263,
"acc_norm_stderr": 0.038424985593952694
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.56,
"acc_stderr": 0.04988876515698589,
"acc_norm": 0.56,
"acc_norm_stderr": 0.04988876515698589
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6452830188679245,
"acc_stderr": 0.02944517532819959,
"acc_norm": 0.6452830188679245,
"acc_norm_stderr": 0.02944517532819959
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7083333333333334,
"acc_stderr": 0.038009680605548594,
"acc_norm": 0.7083333333333334,
"acc_norm_stderr": 0.038009680605548594
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.47,
"acc_stderr": 0.05016135580465919,
"acc_norm": 0.47,
"acc_norm_stderr": 0.05016135580465919
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.52,
"acc_stderr": 0.050211673156867795,
"acc_norm": 0.52,
"acc_norm_stderr": 0.050211673156867795
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.35,
"acc_stderr": 0.0479372485441102,
"acc_norm": 0.35,
"acc_norm_stderr": 0.0479372485441102
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6069364161849711,
"acc_stderr": 0.03724249595817731,
"acc_norm": 0.6069364161849711,
"acc_norm_stderr": 0.03724249595817731
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.30392156862745096,
"acc_stderr": 0.045766654032077636,
"acc_norm": 0.30392156862745096,
"acc_norm_stderr": 0.045766654032077636
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.77,
"acc_stderr": 0.04229525846816506,
"acc_norm": 0.77,
"acc_norm_stderr": 0.04229525846816506
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5234042553191489,
"acc_stderr": 0.03265019475033582,
"acc_norm": 0.5234042553191489,
"acc_norm_stderr": 0.03265019475033582
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.43859649122807015,
"acc_stderr": 0.04668000738510455,
"acc_norm": 0.43859649122807015,
"acc_norm_stderr": 0.04668000738510455
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5379310344827586,
"acc_stderr": 0.04154659671707548,
"acc_norm": 0.5379310344827586,
"acc_norm_stderr": 0.04154659671707548
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.42063492063492064,
"acc_stderr": 0.025424835086924,
"acc_norm": 0.42063492063492064,
"acc_norm_stderr": 0.025424835086924
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.3968253968253968,
"acc_stderr": 0.043758884927270605,
"acc_norm": 0.3968253968253968,
"acc_norm_stderr": 0.043758884927270605
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7193548387096774,
"acc_stderr": 0.025560604721022884,
"acc_norm": 0.7193548387096774,
"acc_norm_stderr": 0.025560604721022884
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5123152709359606,
"acc_stderr": 0.035169204442208966,
"acc_norm": 0.5123152709359606,
"acc_norm_stderr": 0.035169204442208966
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.71,
"acc_stderr": 0.04560480215720684,
"acc_norm": 0.71,
"acc_norm_stderr": 0.04560480215720684
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7272727272727273,
"acc_stderr": 0.0347769116216366,
"acc_norm": 0.7272727272727273,
"acc_norm_stderr": 0.0347769116216366
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7777777777777778,
"acc_stderr": 0.02962022787479049,
"acc_norm": 0.7777777777777778,
"acc_norm_stderr": 0.02962022787479049
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8652849740932642,
"acc_stderr": 0.02463978909770944,
"acc_norm": 0.8652849740932642,
"acc_norm_stderr": 0.02463978909770944
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.5923076923076923,
"acc_stderr": 0.02491524398598785,
"acc_norm": 0.5923076923076923,
"acc_norm_stderr": 0.02491524398598785
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.31851851851851853,
"acc_stderr": 0.028406533090608452,
"acc_norm": 0.31851851851851853,
"acc_norm_stderr": 0.028406533090608452
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6134453781512605,
"acc_stderr": 0.03163145807552378,
"acc_norm": 0.6134453781512605,
"acc_norm_stderr": 0.03163145807552378
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.40397350993377484,
"acc_stderr": 0.040064856853653415,
"acc_norm": 0.40397350993377484,
"acc_norm_stderr": 0.040064856853653415
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8146788990825689,
"acc_stderr": 0.016659279700295845,
"acc_norm": 0.8146788990825689,
"acc_norm_stderr": 0.016659279700295845
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5138888888888888,
"acc_stderr": 0.03408655867977749,
"acc_norm": 0.5138888888888888,
"acc_norm_stderr": 0.03408655867977749
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8088235294117647,
"acc_stderr": 0.027599174300640763,
"acc_norm": 0.8088235294117647,
"acc_norm_stderr": 0.027599174300640763
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7932489451476793,
"acc_stderr": 0.0263616516683891,
"acc_norm": 0.7932489451476793,
"acc_norm_stderr": 0.0263616516683891
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6591928251121076,
"acc_stderr": 0.0318114974705536,
"acc_norm": 0.6591928251121076,
"acc_norm_stderr": 0.0318114974705536
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7709923664122137,
"acc_stderr": 0.036853466317118506,
"acc_norm": 0.7709923664122137,
"acc_norm_stderr": 0.036853466317118506
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.768595041322314,
"acc_stderr": 0.03849856098794088,
"acc_norm": 0.768595041322314,
"acc_norm_stderr": 0.03849856098794088
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7129629629629629,
"acc_stderr": 0.043733130409147614,
"acc_norm": 0.7129629629629629,
"acc_norm_stderr": 0.043733130409147614
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.6625766871165644,
"acc_stderr": 0.03714908409935573,
"acc_norm": 0.6625766871165644,
"acc_norm_stderr": 0.03714908409935573
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.4642857142857143,
"acc_stderr": 0.04733667890053756,
"acc_norm": 0.4642857142857143,
"acc_norm_stderr": 0.04733667890053756
},
"harness|hendrycksTest-management|5": {
"acc": 0.7669902912621359,
"acc_stderr": 0.04185832598928315,
"acc_norm": 0.7669902912621359,
"acc_norm_stderr": 0.04185832598928315
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8504273504273504,
"acc_stderr": 0.023365051491753715,
"acc_norm": 0.8504273504273504,
"acc_norm_stderr": 0.023365051491753715
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.73,
"acc_stderr": 0.044619604333847394,
"acc_norm": 0.73,
"acc_norm_stderr": 0.044619604333847394
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8071519795657727,
"acc_stderr": 0.014108533515757431,
"acc_norm": 0.8071519795657727,
"acc_norm_stderr": 0.014108533515757431
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.661849710982659,
"acc_stderr": 0.025469770149400175,
"acc_norm": 0.661849710982659,
"acc_norm_stderr": 0.025469770149400175
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.3016759776536313,
"acc_stderr": 0.015350767572220286,
"acc_norm": 0.3016759776536313,
"acc_norm_stderr": 0.015350767572220286
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.6633986928104575,
"acc_stderr": 0.027057974624494382,
"acc_norm": 0.6633986928104575,
"acc_norm_stderr": 0.027057974624494382
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6881028938906752,
"acc_stderr": 0.02631185807185416,
"acc_norm": 0.6881028938906752,
"acc_norm_stderr": 0.02631185807185416
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.6975308641975309,
"acc_stderr": 0.025557653981868052,
"acc_norm": 0.6975308641975309,
"acc_norm_stderr": 0.025557653981868052
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.4716312056737589,
"acc_stderr": 0.029779450957303062,
"acc_norm": 0.4716312056737589,
"acc_norm_stderr": 0.029779450957303062
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.45436766623207303,
"acc_stderr": 0.012716941720734802,
"acc_norm": 0.45436766623207303,
"acc_norm_stderr": 0.012716941720734802
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6360294117647058,
"acc_stderr": 0.029227192460032025,
"acc_norm": 0.6360294117647058,
"acc_norm_stderr": 0.029227192460032025
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6519607843137255,
"acc_stderr": 0.019270998708223977,
"acc_norm": 0.6519607843137255,
"acc_norm_stderr": 0.019270998708223977
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6818181818181818,
"acc_stderr": 0.044612721759105085,
"acc_norm": 0.6818181818181818,
"acc_norm_stderr": 0.044612721759105085
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.6979591836734694,
"acc_stderr": 0.0293936093198798,
"acc_norm": 0.6979591836734694,
"acc_norm_stderr": 0.0293936093198798
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8109452736318408,
"acc_stderr": 0.027686913588013024,
"acc_norm": 0.8109452736318408,
"acc_norm_stderr": 0.027686913588013024
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.85,
"acc_stderr": 0.0358870281282637,
"acc_norm": 0.85,
"acc_norm_stderr": 0.0358870281282637
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5180722891566265,
"acc_stderr": 0.03889951252827216,
"acc_norm": 0.5180722891566265,
"acc_norm_stderr": 0.03889951252827216
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8245614035087719,
"acc_stderr": 0.029170885500727665,
"acc_norm": 0.8245614035087719,
"acc_norm_stderr": 0.029170885500727665
},
"harness|truthfulqa:mc|0": {
"mc1": 0.3219094247246022,
"mc1_stderr": 0.016355567611960397,
"mc2": 0.48209769794385393,
"mc2_stderr": 0.014975113215989893
},
"harness|winogrande|5": {
"acc": 0.7774269928966061,
"acc_stderr": 0.011690933809712669
},
"harness|gsm8k|5": {
"acc": 0.3616376042456406,
"acc_stderr": 0.013234658351088774
}
}
```
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
CyberHarem/tamade_chiyu_bangdream | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of tamade_chiyu/珠手ちゆ (BanG Dream!)
This is the dataset of tamade_chiyu/珠手ちゆ (BanG Dream!), containing 107 images and their tags.
The core tags of this character are `long_hair, blue_eyes, bangs, red_hair, ahoge, animal_ears, headphones, fake_animal_ears, animal_ear_headphones, cat_ear_headphones, hair_between_eyes, very_long_hair, v-shaped_eyebrows`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:------------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 107 | 140.08 MiB | [Download](https://huggingface.co/datasets/CyberHarem/tamade_chiyu_bangdream/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 107 | 80.92 MiB | [Download](https://huggingface.co/datasets/CyberHarem/tamade_chiyu_bangdream/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 250 | 171.58 MiB | [Download](https://huggingface.co/datasets/CyberHarem/tamade_chiyu_bangdream/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 107 | 125.01 MiB | [Download](https://huggingface.co/datasets/CyberHarem/tamade_chiyu_bangdream/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 250 | 247.17 MiB | [Download](https://huggingface.co/datasets/CyberHarem/tamade_chiyu_bangdream/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for loading with [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html). If you need it, just run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/tamade_chiyu_bangdream',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
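The `IMG+TXT` packages in the table above (e.g. `dataset-800.zip`) can be used without waifuc. Below is a minimal sketch of downloading one of them and pairing each image with its tags; it assumes every image is accompanied by a same-named `.txt` tag file, which is the usual convention for this package type but is not spelled out in this card.
```python
import os
import zipfile

from huggingface_hub import hf_hub_download

# download the 800px IMG+TXT archive listed in the packages table
zip_file = hf_hub_download(
    repo_id='CyberHarem/tamade_chiyu_bangdream',
    repo_type='dataset',
    filename='dataset-800.zip',
)

# extract files to a local directory
dataset_dir = 'dataset_800'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
    zf.extractall(dataset_dir)

# pair each image with its same-named .txt tag file (assumed layout)
for name in sorted(os.listdir(dataset_dir)):
    stem, ext = os.path.splitext(name)
    if ext.lower() in ('.png', '.jpg', '.jpeg', '.webp'):
        txt_path = os.path.join(dataset_dir, stem + '.txt')
        if not os.path.exists(txt_path):
            continue
        with open(txt_path, 'r', encoding='utf-8') as f:
            tags = f.read().strip()
        print(name, '->', tags)
```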
## List of Clusters
List of tag clustering results; some outfits may be mined from here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 49 |  |  |  |  |  | white_shirt, 1girl, long_sleeves, solo, looking_at_viewer, red_necktie, collared_shirt, striped_necktie, blazer, school_uniform, blue_skirt, plaid_skirt, blush, pleated_skirt, black_jacket, open_mouth, open_jacket, blue_jacket, smile, white_background, simple_background |
| 1 | 8 |  |  |  |  |  | 1girl, long_sleeves, solo, looking_at_viewer, black_choker, black_gloves, black_shorts, collarbone, fingerless_gloves, short_shorts, black_jacket, blush, grin, open_clothes, sidelocks, teeth, white_shirt, belt, holding, microphone, open_mouth, pink_hair, pointing, standing, upper_body, white_jacket |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | white_shirt | 1girl | long_sleeves | solo | looking_at_viewer | red_necktie | collared_shirt | striped_necktie | blazer | school_uniform | blue_skirt | plaid_skirt | blush | pleated_skirt | black_jacket | open_mouth | open_jacket | blue_jacket | smile | white_background | simple_background | black_choker | black_gloves | black_shorts | collarbone | fingerless_gloves | short_shorts | grin | open_clothes | sidelocks | teeth | belt | holding | microphone | pink_hair | pointing | standing | upper_body | white_jacket |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------|:--------|:---------------|:-------|:--------------------|:--------------|:-----------------|:------------------|:---------|:-----------------|:-------------|:--------------|:--------|:----------------|:---------------|:-------------|:--------------|:--------------|:--------|:-------------------|:--------------------|:---------------|:---------------|:---------------|:-------------|:--------------------|:---------------|:-------|:---------------|:------------|:--------|:-------|:----------|:-------------|:------------|:-----------|:-----------|:-------------|:---------------|
| 0 | 49 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | |
| 1 | 8 |  |  |  |  |  | X | X | X | X | X | | | | | | | | X | | X | X | | | | | | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X |
|
kpriyanshu256/MultiTabQA-multitable_pretraining-Salesforce-codet5-base_train-markdown-53000 | ---
dataset_info:
features:
- name: input_ids
sequence:
sequence: int32
- name: attention_mask
sequence:
sequence: int8
- name: labels
sequence:
sequence: int64
splits:
- name: train
num_bytes: 13336000
num_examples: 1000
download_size: 1116162
dataset_size: 13336000
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
EarthnDusk/Random_Prompts | ---
license: creativeml-openrail-m
---
|
jwl25b/final_project_dataset | ---
dataset_info:
features:
- name: image
dtype: image
- name: text
dtype: string
splits:
- name: train
num_bytes: 3623271.0
num_examples: 50
download_size: 3619422
dataset_size: 3623271.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
IlyaGusev/librusec | ---
dataset_info:
features:
- name: id
dtype: uint64
- name: text
dtype: string
splits:
- name: train
num_bytes: 125126513109
num_examples: 223256
download_size: 34905399148
dataset_size: 125126513109
task_categories:
- text-generation
language:
- ru
size_categories:
- 100K<n<1M
---
# Librusec dataset
## Table of Contents
- [Table of Contents](#table-of-contents)
- [Description](#description)
- [Usage](#usage)
## Description
**Summary:** Based on http://panchenko.me/data/russe/librusec_fb2.plain.gz. Uploaded here for convenience. Additional cleaning was performed.
**Script:** [create_librusec.py](https://github.com/IlyaGusev/rulm/blob/master/data_processing/create_librusec.py)
**Point of Contact:** [Ilya Gusev](mailto:ilya.gusev@phystech.edu)
**Languages:** Russian.
## Usage
Prerequisites:
```bash
pip install datasets zstandard jsonlines pysimdjson
```
Dataset iteration:
```python
from datasets import load_dataset
dataset = load_dataset('IlyaGusev/librusec', split="train", streaming=True)
for example in dataset:
print(example["text"])
``` |
alexandrainst/dacoref | ---
license: cc-by-sa-4.0
dataset_info:
features:
- name: sent_id
dtype: string
- name: text
dtype: string
- name: tokens
sequence: string
- name: clusters
sequence:
sequence: int64
splits:
- name: train
num_bytes: 871763
num_examples: 2686
- name: val
num_bytes: 103309
num_examples: 332
- name: test
num_bytes: 123444
num_examples: 385
download_size: 568857
dataset_size: 1098516
language:
- da
tags:
- coreference-resolution
pretty_name: DaCoref
size_categories:
- 1K<n<10K
---
# Dataset Card for DaCoref
## Dataset Description
- **Repository:** <https://gist.github.com/saattrupdan/3551300138e8668fbb8d32357e7b39f7>
- **Point of Contact:** [Dan Saattrup Nielsen](mailto:dan.nielsen@alexandra.dk)
- **Size of downloaded dataset files:** 569 KB
- **Size of the generated dataset:** 1099 KB
- **Total amount of disk used:** 1668 KB
### Dataset Summary
This dataset contains coreference annotations of part of the [Copenhagen Dependency Treebank](https://github.com/mbkromann/copenhagen-dependency-treebank/wiki/CDT).
### Supported Tasks and Leaderboards
This dataset is meant to train coreference resolution models.
### Languages
The dataset is available in Danish (`da`).
## Dataset Structure
### Data Instances
- **Size of downloaded dataset files:** 569 KB
- **Size of the generated dataset:** 1099 KB
- **Total amount of disk used:** 1668 KB
| split | samples |
|---------|--------:|
| train | 2,686 |
| val | 332 |
| test | 385 |
An example from the dataset looks as follows.
```
{
'sent_id': 'train-v2-0',
'doc_id': 'nw/0442',
'text': 'På fredag har SID inviteret til reception i SID-huset i anledning af at formanden Kjeld Christensen går ind i de glade tressere.',
'tokens': ['På',
'fredag',
'har',
'SID',
'inviteret',
'til',
'reception',
'i',
'SID-huset',
'i',
'anledning',
'af',
'at',
'formanden',
'Kjeld',
'Christensen',
'går',
'ind',
'i',
'de',
'glade',
'tressere',
'.'
],
'clusters': [[13, 14, 15]]
}
```
### Data Fields
The data fields are the same among all splits.
- `sent_id` (`string`): The sentence ID from Universal Dependencies.
- `doc_id` (`string`): The document ID from the [Copenhagen Dependency Treebank](https://github.com/mbkromann/copenhagen-dependency-treebank/wiki/CDT).
- `text` (`string`): The document.
- `tokens` (`list[str]`): The tokens appearing in the document.
- `clusters` (`list[list[int]]`): The coreference clusters in the document, where the integers refer to the indices in `tokens`.
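As a quick illustration of how the `clusters` indices relate to `tokens`, here is a minimal sketch that maps each cluster back to its surface tokens. It assumes the dataset can be loaded directly with `datasets.load_dataset`, which this card does not state explicitly.
```python
from datasets import load_dataset

# Load the train split (split names follow the table above: train, val, test).
dataset = load_dataset("alexandrainst/dacoref", split="train")

example = dataset[0]
# Each cluster is a list of integer indices into the `tokens` list.
for cluster in example["clusters"]:
    mention_tokens = [example["tokens"][i] for i in cluster]
    print(cluster, "->", " ".join(mention_tokens))

# For the instance shown above, the cluster [13, 14, 15] maps to
# "formanden Kjeld Christensen".
```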
## Dataset Creation
This coreference dataset was originally annotated as part of the Copenhagen Dependency Treebank (CDT) project but never finished. The incomplete annotation can be downloaded from the [project github](https://github.com/mbkromann/copenhagen-dependency-treebank).
The [CDT documentation](https://github.com/mbkromann/copenhagen-dependency-treebank/blob/master/manual/cdt-manual.pdf) contains a description of the coreference classes as well as inter-annotator agreement and confusion matrices.
For this resource, we used the annotation files from the annotator "Lotte" along with the UD syntax, which is an automatic conversion of the CDT syntax annotation by Johansen et al. (2015). We provide the sentence ID from the UD resource as well as the document ID from CDT. The document ID has been prepended with a two-letter domain code compatible with the domain codes of the Ontonotes corpus. This is a manual mapping of the sources listed in the CDT. Only nw (newswire), mz (magazine), and bn (broadcast news) were present:
- 299 nw documents
- 41 mz documents
- 1 bn document
For the CDT, only the core node of each span was annotated and one annotator manually propagated the label to the entire span. A few systematic errors were corrected in this process, the most important being that plural pronouns "we" and "they" can be coreferent with company names if they refer to the employee group of this company.
For this resource we have merged the following labels to form uniquely numbered clusters: coref, coref-evol, coref-iden, coref-iden.sb, coref-var, and ref. Coref-res and coref-res.prg are also included as clusters but are not merged with any other label, nor with each other.
Some notes about the annotation (but see also the CDT documentation): If conjunctions of entities are only referred to as a group, they are marked as one span (e.g. if "Lise, Lone og Birthe" are only referred to as a group, such as by the plural pronoun "de", then "Lise, Lone og Birthe" is marked as one span). The spans are generally as long as possible. Example: Det sidste gik ud over politikerne, da de i sin tid præsenterede [det første forslag til den milliard-dyre vandmiljøplan].
Singletons are not annotated. The annotation does not label attributive noun phrases that are connected through copula verbs such as "to be". Name-initial appositive constructions are part of the same mention as the name. Generic pronouns (mainly "man" and "du") are not clustered unless they are part of a cluster, e.g. with a reflexive or possessive pronoun.
## Additional Information
### Dataset Curators
The work was conducted by Maria Jung Barrett and has been uploaded to the Hugging Face Hub by [Dan Saattrup Nielsen](https://saattrupdan.github.io/) from [The Alexandra Institute](https://alexandra.dk/).
### Licensing Information
The dataset is licensed under the [CC BY 4.0
license](https://creativecommons.org/licenses/by/4.0/). |
nblinh63/twitter_dataset_1712702390 | ---
dataset_info:
features:
- name: id
dtype: string
- name: tweet_content
dtype: string
- name: user_name
dtype: string
- name: user_id
dtype: string
- name: created_at
dtype: string
- name: url
dtype: string
- name: favourite_count
dtype: int64
- name: scraped_at
dtype: string
- name: image_urls
dtype: string
splits:
- name: train
num_bytes: 142254
num_examples: 382
download_size: 65329
dataset_size: 142254
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
MohamedExperio/ICDAR2019 | ---
dataset_info:
features:
- name: image
dtype: image
- name: ground_truth
struct:
- name: gt_parses
sequence: string
splits:
- name: train
num_bytes: 170887114.0
num_examples: 300
- name: validation
num_bytes: 55500511.0
num_examples: 100
- name: test
num_bytes: 79123638.0
num_examples: 126
download_size: 0
dataset_size: 305511263.0
---
# Dataset Card for "ICDAR2019"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
yuan-sf63/chenyu_label_0.5_72 | ---
dataset_info:
features:
- name: text
dtype: string
- name: '0'
dtype: int64
- name: '1'
dtype: int64
- name: '2'
dtype: int64
- name: '3'
dtype: int64
- name: '4'
dtype: int64
- name: '5'
dtype: int64
- name: '6'
dtype: int64
- name: '7'
dtype: int64
- name: '8'
dtype: int64
- name: '9'
dtype: int64
- name: '10'
dtype: int64
- name: '11'
dtype: int64
- name: '12'
dtype: int64
- name: '13'
dtype: int64
- name: '14'
dtype: int64
- name: '15'
dtype: int64
- name: '16'
dtype: int64
- name: '17'
dtype: int64
- name: '18'
dtype: int64
- name: '19'
dtype: int64
- name: '20'
dtype: int64
- name: '21'
dtype: int64
- name: '22'
dtype: int64
- name: '23'
dtype: int64
- name: '24'
dtype: int64
- name: '25'
dtype: int64
- name: '26'
dtype: int64
- name: '27'
dtype: int64
- name: '28'
dtype: int64
- name: '29'
dtype: int64
- name: '30'
dtype: int64
- name: '31'
dtype: int64
- name: '32'
dtype: int64
- name: '33'
dtype: int64
- name: '34'
dtype: int64
- name: '35'
dtype: int64
- name: '36'
dtype: int64
- name: '37'
dtype: int64
- name: '38'
dtype: int64
- name: '39'
dtype: int64
- name: '40'
dtype: int64
- name: '41'
dtype: int64
- name: '42'
dtype: int64
- name: '43'
dtype: int64
- name: '44'
dtype: int64
- name: '45'
dtype: int64
- name: '46'
dtype: int64
- name: '47'
dtype: int64
- name: '48'
dtype: int64
- name: '49'
dtype: int64
- name: '50'
dtype: int64
- name: '51'
dtype: int64
- name: '52'
dtype: int64
- name: '53'
dtype: int64
- name: '54'
dtype: int64
- name: '55'
dtype: int64
- name: '56'
dtype: int64
- name: '57'
dtype: int64
- name: '58'
dtype: int64
- name: '59'
dtype: int64
- name: '60'
dtype: int64
- name: '61'
dtype: int64
- name: '62'
dtype: int64
- name: '63'
dtype: int64
- name: '64'
dtype: int64
- name: '65'
dtype: int64
- name: '66'
dtype: int64
- name: '67'
dtype: int64
- name: '68'
dtype: int64
- name: '69'
dtype: int64
- name: '70'
dtype: int64
- name: '71'
dtype: int64
splits:
- name: train
num_bytes: 25931250.588298276
num_examples: 37825
- name: validation
num_bytes: 2881402.4117017225
num_examples: 4203
download_size: 0
dataset_size: 28812653.0
---
# Dataset Card for "chenyu_label_0.5_72"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
freshpearYoon/v3_val_free_4 | ---
dataset_info:
features:
- name: input_features
sequence:
sequence: float32
- name: labels
sequence: int64
splits:
- name: train
num_bytes: 15366793992
num_examples: 10000
download_size: 2242262581
dataset_size: 15366793992
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
andrewverse/mini-platypus | ---
dataset_info:
features:
- name: instruction
dtype: string
- name: output
dtype: string
splits:
- name: train
num_bytes: 4186564
num_examples: 1000
download_size: 2245921
dataset_size: 4186564
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
mirfan899/movies_reviews_summaries | ---
license: mit
task_categories:
- text2text-generation
language:
- ur
pretty_name: imdb reviews
---
The dataset consists of IMDB Urdu reviews with summaries. |
Sampath1987/NER_cyber | ---
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 1041468
num_examples: 2481
download_size: 308097
dataset_size: 1041468
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
open-llm-leaderboard/details_222gate__Blur-4x7b-MOE-v0.1 | ---
pretty_name: Evaluation run of 222gate/Blur-4x7b-MOE-v0.1
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [222gate/Blur-4x7b-MOE-v0.1](https://huggingface.co/222gate/Blur-4x7b-MOE-v0.1)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configuration, each one coresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_222gate__Blur-4x7b-MOE-v0.1\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-01-21T04:45:19.432784](https://huggingface.co/datasets/open-llm-leaderboard/details_222gate__Blur-4x7b-MOE-v0.1/blob/main/results_2024-01-21T04-45-19.432784.json)(note\
\ that their might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6559352269047928,\n\
\ \"acc_stderr\": 0.03199697744913013,\n \"acc_norm\": 0.6556619411144256,\n\
\ \"acc_norm_stderr\": 0.03265823717914628,\n \"mc1\": 0.5458996328029376,\n\
\ \"mc1_stderr\": 0.017429593091323515,\n \"mc2\": 0.688217063142182,\n\
\ \"mc2_stderr\": 0.01518842298057346\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.7013651877133106,\n \"acc_stderr\": 0.013374078615068749,\n\
\ \"acc_norm\": 0.7226962457337884,\n \"acc_norm_stderr\": 0.013082095839059376\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.7159928301135232,\n\
\ \"acc_stderr\": 0.004500186424443795,\n \"acc_norm\": 0.8813981278629756,\n\
\ \"acc_norm_stderr\": 0.003226586783421294\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.32,\n \"acc_stderr\": 0.046882617226215034,\n \
\ \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.046882617226215034\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6296296296296297,\n\
\ \"acc_stderr\": 0.041716541613545426,\n \"acc_norm\": 0.6296296296296297,\n\
\ \"acc_norm_stderr\": 0.041716541613545426\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.7039473684210527,\n \"acc_stderr\": 0.03715062154998904,\n\
\ \"acc_norm\": 0.7039473684210527,\n \"acc_norm_stderr\": 0.03715062154998904\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.65,\n\
\ \"acc_stderr\": 0.0479372485441102,\n \"acc_norm\": 0.65,\n \
\ \"acc_norm_stderr\": 0.0479372485441102\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.720754716981132,\n \"acc_stderr\": 0.027611163402399715,\n\
\ \"acc_norm\": 0.720754716981132,\n \"acc_norm_stderr\": 0.027611163402399715\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7638888888888888,\n\
\ \"acc_stderr\": 0.03551446610810826,\n \"acc_norm\": 0.7638888888888888,\n\
\ \"acc_norm_stderr\": 0.03551446610810826\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.48,\n \"acc_stderr\": 0.050211673156867795,\n \
\ \"acc_norm\": 0.48,\n \"acc_norm_stderr\": 0.050211673156867795\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"\
acc\": 0.52,\n \"acc_stderr\": 0.050211673156867795,\n \"acc_norm\"\
: 0.52,\n \"acc_norm_stderr\": 0.050211673156867795\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.32,\n \"acc_stderr\": 0.046882617226215034,\n \
\ \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.046882617226215034\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6705202312138728,\n\
\ \"acc_stderr\": 0.03583901754736412,\n \"acc_norm\": 0.6705202312138728,\n\
\ \"acc_norm_stderr\": 0.03583901754736412\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.4019607843137255,\n \"acc_stderr\": 0.04878608714466996,\n\
\ \"acc_norm\": 0.4019607843137255,\n \"acc_norm_stderr\": 0.04878608714466996\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.78,\n \"acc_stderr\": 0.04163331998932263,\n \"acc_norm\": 0.78,\n\
\ \"acc_norm_stderr\": 0.04163331998932263\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.5872340425531914,\n \"acc_stderr\": 0.03218471141400351,\n\
\ \"acc_norm\": 0.5872340425531914,\n \"acc_norm_stderr\": 0.03218471141400351\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.47368421052631576,\n\
\ \"acc_stderr\": 0.046970851366478626,\n \"acc_norm\": 0.47368421052631576,\n\
\ \"acc_norm_stderr\": 0.046970851366478626\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5655172413793104,\n \"acc_stderr\": 0.04130740879555498,\n\
\ \"acc_norm\": 0.5655172413793104,\n \"acc_norm_stderr\": 0.04130740879555498\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.42592592592592593,\n \"acc_stderr\": 0.02546714904546955,\n \"\
acc_norm\": 0.42592592592592593,\n \"acc_norm_stderr\": 0.02546714904546955\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.4523809523809524,\n\
\ \"acc_stderr\": 0.044518079590553275,\n \"acc_norm\": 0.4523809523809524,\n\
\ \"acc_norm_stderr\": 0.044518079590553275\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.39,\n \"acc_stderr\": 0.04902071300001975,\n \
\ \"acc_norm\": 0.39,\n \"acc_norm_stderr\": 0.04902071300001975\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7806451612903226,\n\
\ \"acc_stderr\": 0.023540799358723295,\n \"acc_norm\": 0.7806451612903226,\n\
\ \"acc_norm_stderr\": 0.023540799358723295\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.5221674876847291,\n \"acc_stderr\": 0.03514528562175007,\n\
\ \"acc_norm\": 0.5221674876847291,\n \"acc_norm_stderr\": 0.03514528562175007\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.71,\n \"acc_stderr\": 0.045604802157206845,\n \"acc_norm\"\
: 0.71,\n \"acc_norm_stderr\": 0.045604802157206845\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7818181818181819,\n \"acc_stderr\": 0.03225078108306289,\n\
\ \"acc_norm\": 0.7818181818181819,\n \"acc_norm_stderr\": 0.03225078108306289\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.7878787878787878,\n \"acc_stderr\": 0.029126522834586815,\n \"\
acc_norm\": 0.7878787878787878,\n \"acc_norm_stderr\": 0.029126522834586815\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.917098445595855,\n \"acc_stderr\": 0.01989934131572178,\n\
\ \"acc_norm\": 0.917098445595855,\n \"acc_norm_stderr\": 0.01989934131572178\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.6666666666666666,\n \"acc_stderr\": 0.023901157979402538,\n\
\ \"acc_norm\": 0.6666666666666666,\n \"acc_norm_stderr\": 0.023901157979402538\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.34074074074074073,\n \"acc_stderr\": 0.028897748741131154,\n \
\ \"acc_norm\": 0.34074074074074073,\n \"acc_norm_stderr\": 0.028897748741131154\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.6596638655462185,\n \"acc_stderr\": 0.03077805742293167,\n \
\ \"acc_norm\": 0.6596638655462185,\n \"acc_norm_stderr\": 0.03077805742293167\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.3708609271523179,\n \"acc_stderr\": 0.03943966699183629,\n \"\
acc_norm\": 0.3708609271523179,\n \"acc_norm_stderr\": 0.03943966699183629\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8403669724770643,\n \"acc_stderr\": 0.015703498348461766,\n \"\
acc_norm\": 0.8403669724770643,\n \"acc_norm_stderr\": 0.015703498348461766\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.5370370370370371,\n \"acc_stderr\": 0.03400603625538272,\n \"\
acc_norm\": 0.5370370370370371,\n \"acc_norm_stderr\": 0.03400603625538272\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.8529411764705882,\n \"acc_stderr\": 0.024857478080250454,\n \"\
acc_norm\": 0.8529411764705882,\n \"acc_norm_stderr\": 0.024857478080250454\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.7932489451476793,\n \"acc_stderr\": 0.026361651668389094,\n \
\ \"acc_norm\": 0.7932489451476793,\n \"acc_norm_stderr\": 0.026361651668389094\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.695067264573991,\n\
\ \"acc_stderr\": 0.030898610882477515,\n \"acc_norm\": 0.695067264573991,\n\
\ \"acc_norm_stderr\": 0.030898610882477515\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.8015267175572519,\n \"acc_stderr\": 0.03498149385462472,\n\
\ \"acc_norm\": 0.8015267175572519,\n \"acc_norm_stderr\": 0.03498149385462472\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.7851239669421488,\n \"acc_stderr\": 0.037494924487096966,\n \"\
acc_norm\": 0.7851239669421488,\n \"acc_norm_stderr\": 0.037494924487096966\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7870370370370371,\n\
\ \"acc_stderr\": 0.0395783547198098,\n \"acc_norm\": 0.7870370370370371,\n\
\ \"acc_norm_stderr\": 0.0395783547198098\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7730061349693251,\n \"acc_stderr\": 0.03291099578615769,\n\
\ \"acc_norm\": 0.7730061349693251,\n \"acc_norm_stderr\": 0.03291099578615769\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.4375,\n\
\ \"acc_stderr\": 0.04708567521880525,\n \"acc_norm\": 0.4375,\n \
\ \"acc_norm_stderr\": 0.04708567521880525\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7572815533980582,\n \"acc_stderr\": 0.04245022486384495,\n\
\ \"acc_norm\": 0.7572815533980582,\n \"acc_norm_stderr\": 0.04245022486384495\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8803418803418803,\n\
\ \"acc_stderr\": 0.021262719400406957,\n \"acc_norm\": 0.8803418803418803,\n\
\ \"acc_norm_stderr\": 0.021262719400406957\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.71,\n \"acc_stderr\": 0.045604802157206845,\n \
\ \"acc_norm\": 0.71,\n \"acc_norm_stderr\": 0.045604802157206845\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8339719029374202,\n\
\ \"acc_stderr\": 0.013306478243066302,\n \"acc_norm\": 0.8339719029374202,\n\
\ \"acc_norm_stderr\": 0.013306478243066302\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.7485549132947977,\n \"acc_stderr\": 0.02335736578587403,\n\
\ \"acc_norm\": 0.7485549132947977,\n \"acc_norm_stderr\": 0.02335736578587403\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.4301675977653631,\n\
\ \"acc_stderr\": 0.01655860163604104,\n \"acc_norm\": 0.4301675977653631,\n\
\ \"acc_norm_stderr\": 0.01655860163604104\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.7254901960784313,\n \"acc_stderr\": 0.025553169991826524,\n\
\ \"acc_norm\": 0.7254901960784313,\n \"acc_norm_stderr\": 0.025553169991826524\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7138263665594855,\n\
\ \"acc_stderr\": 0.02567025924218893,\n \"acc_norm\": 0.7138263665594855,\n\
\ \"acc_norm_stderr\": 0.02567025924218893\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.7561728395061729,\n \"acc_stderr\": 0.023891879541959607,\n\
\ \"acc_norm\": 0.7561728395061729,\n \"acc_norm_stderr\": 0.023891879541959607\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.5035460992907801,\n \"acc_stderr\": 0.02982674915328092,\n \
\ \"acc_norm\": 0.5035460992907801,\n \"acc_norm_stderr\": 0.02982674915328092\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.47131681877444587,\n\
\ \"acc_stderr\": 0.012749206007657473,\n \"acc_norm\": 0.47131681877444587,\n\
\ \"acc_norm_stderr\": 0.012749206007657473\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.6691176470588235,\n \"acc_stderr\": 0.02858270975389845,\n\
\ \"acc_norm\": 0.6691176470588235,\n \"acc_norm_stderr\": 0.02858270975389845\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6633986928104575,\n \"acc_stderr\": 0.019117213911495148,\n \
\ \"acc_norm\": 0.6633986928104575,\n \"acc_norm_stderr\": 0.019117213911495148\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6909090909090909,\n\
\ \"acc_stderr\": 0.044262946482000985,\n \"acc_norm\": 0.6909090909090909,\n\
\ \"acc_norm_stderr\": 0.044262946482000985\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.7428571428571429,\n \"acc_stderr\": 0.02797982353874455,\n\
\ \"acc_norm\": 0.7428571428571429,\n \"acc_norm_stderr\": 0.02797982353874455\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8308457711442786,\n\
\ \"acc_stderr\": 0.026508590656233264,\n \"acc_norm\": 0.8308457711442786,\n\
\ \"acc_norm_stderr\": 0.026508590656233264\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.86,\n \"acc_stderr\": 0.0348735088019777,\n \
\ \"acc_norm\": 0.86,\n \"acc_norm_stderr\": 0.0348735088019777\n },\n\
\ \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5542168674698795,\n\
\ \"acc_stderr\": 0.03869543323472101,\n \"acc_norm\": 0.5542168674698795,\n\
\ \"acc_norm_stderr\": 0.03869543323472101\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8421052631578947,\n \"acc_stderr\": 0.027966785859160893,\n\
\ \"acc_norm\": 0.8421052631578947,\n \"acc_norm_stderr\": 0.027966785859160893\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.5458996328029376,\n\
\ \"mc1_stderr\": 0.017429593091323515,\n \"mc2\": 0.688217063142182,\n\
\ \"mc2_stderr\": 0.01518842298057346\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.8255722178374112,\n \"acc_stderr\": 0.010665187902498435\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.689158453373768,\n \
\ \"acc_stderr\": 0.012748860507777725\n }\n}\n```"
repo_url: https://huggingface.co/222gate/Blur-4x7b-MOE-v0.1
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_01_21T04_45_19.432784
path:
- '**/details_harness|arc:challenge|25_2024-01-21T04-45-19.432784.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-01-21T04-45-19.432784.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_01_21T04_45_19.432784
path:
- '**/details_harness|gsm8k|5_2024-01-21T04-45-19.432784.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-01-21T04-45-19.432784.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_01_21T04_45_19.432784
path:
- '**/details_harness|hellaswag|10_2024-01-21T04-45-19.432784.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-01-21T04-45-19.432784.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_01_21T04_45_19.432784
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-21T04-45-19.432784.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-21T04-45-19.432784.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-21T04-45-19.432784.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-21T04-45-19.432784.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-21T04-45-19.432784.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-21T04-45-19.432784.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-21T04-45-19.432784.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-21T04-45-19.432784.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-21T04-45-19.432784.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-21T04-45-19.432784.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-21T04-45-19.432784.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-21T04-45-19.432784.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-21T04-45-19.432784.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-21T04-45-19.432784.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-21T04-45-19.432784.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-21T04-45-19.432784.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-21T04-45-19.432784.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-21T04-45-19.432784.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-21T04-45-19.432784.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-21T04-45-19.432784.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-21T04-45-19.432784.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-21T04-45-19.432784.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-21T04-45-19.432784.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-21T04-45-19.432784.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-21T04-45-19.432784.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-21T04-45-19.432784.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-21T04-45-19.432784.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-21T04-45-19.432784.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-21T04-45-19.432784.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-21T04-45-19.432784.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-21T04-45-19.432784.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-21T04-45-19.432784.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-21T04-45-19.432784.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-21T04-45-19.432784.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-01-21T04-45-19.432784.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-21T04-45-19.432784.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-21T04-45-19.432784.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-21T04-45-19.432784.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-01-21T04-45-19.432784.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-01-21T04-45-19.432784.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-21T04-45-19.432784.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-21T04-45-19.432784.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-21T04-45-19.432784.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-21T04-45-19.432784.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-21T04-45-19.432784.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-21T04-45-19.432784.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-21T04-45-19.432784.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-21T04-45-19.432784.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-21T04-45-19.432784.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-21T04-45-19.432784.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-21T04-45-19.432784.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-21T04-45-19.432784.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-21T04-45-19.432784.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-01-21T04-45-19.432784.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-21T04-45-19.432784.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-01-21T04-45-19.432784.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-21T04-45-19.432784.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-21T04-45-19.432784.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-21T04-45-19.432784.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-21T04-45-19.432784.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-21T04-45-19.432784.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-21T04-45-19.432784.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-21T04-45-19.432784.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-21T04-45-19.432784.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-21T04-45-19.432784.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-21T04-45-19.432784.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-21T04-45-19.432784.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-21T04-45-19.432784.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-21T04-45-19.432784.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-21T04-45-19.432784.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-21T04-45-19.432784.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-21T04-45-19.432784.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-21T04-45-19.432784.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-21T04-45-19.432784.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-21T04-45-19.432784.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-21T04-45-19.432784.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-21T04-45-19.432784.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-21T04-45-19.432784.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-21T04-45-19.432784.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-21T04-45-19.432784.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-21T04-45-19.432784.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-21T04-45-19.432784.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-21T04-45-19.432784.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-21T04-45-19.432784.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-21T04-45-19.432784.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-21T04-45-19.432784.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-21T04-45-19.432784.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-21T04-45-19.432784.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-21T04-45-19.432784.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-21T04-45-19.432784.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-21T04-45-19.432784.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-01-21T04-45-19.432784.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-21T04-45-19.432784.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-21T04-45-19.432784.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-21T04-45-19.432784.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-01-21T04-45-19.432784.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-01-21T04-45-19.432784.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-21T04-45-19.432784.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-21T04-45-19.432784.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-21T04-45-19.432784.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-21T04-45-19.432784.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-21T04-45-19.432784.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-21T04-45-19.432784.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-21T04-45-19.432784.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-21T04-45-19.432784.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-21T04-45-19.432784.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-21T04-45-19.432784.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-21T04-45-19.432784.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-21T04-45-19.432784.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-21T04-45-19.432784.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-01-21T04-45-19.432784.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-21T04-45-19.432784.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-01-21T04-45-19.432784.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-21T04-45-19.432784.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_01_21T04_45_19.432784
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-21T04-45-19.432784.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-21T04-45-19.432784.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_01_21T04_45_19.432784
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-21T04-45-19.432784.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-21T04-45-19.432784.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_01_21T04_45_19.432784
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-21T04-45-19.432784.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-21T04-45-19.432784.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_01_21T04_45_19.432784
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-21T04-45-19.432784.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-21T04-45-19.432784.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_01_21T04_45_19.432784
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-21T04-45-19.432784.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-21T04-45-19.432784.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_01_21T04_45_19.432784
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-21T04-45-19.432784.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-21T04-45-19.432784.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_01_21T04_45_19.432784
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-21T04-45-19.432784.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-21T04-45-19.432784.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_01_21T04_45_19.432784
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-21T04-45-19.432784.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-21T04-45-19.432784.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_01_21T04_45_19.432784
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-21T04-45-19.432784.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-21T04-45-19.432784.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_01_21T04_45_19.432784
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-21T04-45-19.432784.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-21T04-45-19.432784.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_01_21T04_45_19.432784
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-21T04-45-19.432784.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-21T04-45-19.432784.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_01_21T04_45_19.432784
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-21T04-45-19.432784.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-21T04-45-19.432784.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_01_21T04_45_19.432784
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-21T04-45-19.432784.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-21T04-45-19.432784.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_01_21T04_45_19.432784
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-21T04-45-19.432784.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-21T04-45-19.432784.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_01_21T04_45_19.432784
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-21T04-45-19.432784.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-21T04-45-19.432784.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_01_21T04_45_19.432784
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-21T04-45-19.432784.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-21T04-45-19.432784.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_01_21T04_45_19.432784
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-21T04-45-19.432784.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-21T04-45-19.432784.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_01_21T04_45_19.432784
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-21T04-45-19.432784.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-21T04-45-19.432784.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_01_21T04_45_19.432784
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-21T04-45-19.432784.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-21T04-45-19.432784.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_01_21T04_45_19.432784
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-21T04-45-19.432784.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-21T04-45-19.432784.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_01_21T04_45_19.432784
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-21T04-45-19.432784.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-21T04-45-19.432784.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_01_21T04_45_19.432784
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-21T04-45-19.432784.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-21T04-45-19.432784.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_01_21T04_45_19.432784
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-21T04-45-19.432784.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-21T04-45-19.432784.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_01_21T04_45_19.432784
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-21T04-45-19.432784.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-21T04-45-19.432784.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_01_21T04_45_19.432784
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-21T04-45-19.432784.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-21T04-45-19.432784.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_01_21T04_45_19.432784
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-21T04-45-19.432784.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-21T04-45-19.432784.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_01_21T04_45_19.432784
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-21T04-45-19.432784.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-21T04-45-19.432784.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_01_21T04_45_19.432784
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-21T04-45-19.432784.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-21T04-45-19.432784.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_01_21T04_45_19.432784
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-21T04-45-19.432784.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-21T04-45-19.432784.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_01_21T04_45_19.432784
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-21T04-45-19.432784.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-21T04-45-19.432784.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_01_21T04_45_19.432784
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-21T04-45-19.432784.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-21T04-45-19.432784.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_01_21T04_45_19.432784
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-21T04-45-19.432784.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-21T04-45-19.432784.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_01_21T04_45_19.432784
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-21T04-45-19.432784.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-21T04-45-19.432784.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_01_21T04_45_19.432784
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-21T04-45-19.432784.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-21T04-45-19.432784.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_01_21T04_45_19.432784
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-01-21T04-45-19.432784.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-01-21T04-45-19.432784.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_01_21T04_45_19.432784
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-21T04-45-19.432784.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-21T04-45-19.432784.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_01_21T04_45_19.432784
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-21T04-45-19.432784.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-21T04-45-19.432784.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_01_21T04_45_19.432784
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-21T04-45-19.432784.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-21T04-45-19.432784.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_01_21T04_45_19.432784
path:
- '**/details_harness|hendrycksTest-management|5_2024-01-21T04-45-19.432784.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-01-21T04-45-19.432784.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_01_21T04_45_19.432784
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-01-21T04-45-19.432784.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-01-21T04-45-19.432784.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_01_21T04_45_19.432784
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-21T04-45-19.432784.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-21T04-45-19.432784.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_01_21T04_45_19.432784
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-21T04-45-19.432784.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-21T04-45-19.432784.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_01_21T04_45_19.432784
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-21T04-45-19.432784.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-21T04-45-19.432784.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_01_21T04_45_19.432784
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-21T04-45-19.432784.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-21T04-45-19.432784.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_01_21T04_45_19.432784
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-21T04-45-19.432784.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-21T04-45-19.432784.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_01_21T04_45_19.432784
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-21T04-45-19.432784.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-21T04-45-19.432784.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_01_21T04_45_19.432784
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-21T04-45-19.432784.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-21T04-45-19.432784.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_01_21T04_45_19.432784
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-21T04-45-19.432784.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-21T04-45-19.432784.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_01_21T04_45_19.432784
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-21T04-45-19.432784.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-21T04-45-19.432784.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_01_21T04_45_19.432784
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-21T04-45-19.432784.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-21T04-45-19.432784.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_01_21T04_45_19.432784
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-21T04-45-19.432784.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-21T04-45-19.432784.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_01_21T04_45_19.432784
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-21T04-45-19.432784.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-21T04-45-19.432784.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_01_21T04_45_19.432784
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-21T04-45-19.432784.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-21T04-45-19.432784.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_01_21T04_45_19.432784
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-01-21T04-45-19.432784.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-01-21T04-45-19.432784.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_01_21T04_45_19.432784
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-21T04-45-19.432784.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-21T04-45-19.432784.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_01_21T04_45_19.432784
path:
- '**/details_harness|hendrycksTest-virology|5_2024-01-21T04-45-19.432784.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-01-21T04-45-19.432784.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_01_21T04_45_19.432784
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-21T04-45-19.432784.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-21T04-45-19.432784.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_01_21T04_45_19.432784
path:
- '**/details_harness|truthfulqa:mc|0_2024-01-21T04-45-19.432784.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-01-21T04-45-19.432784.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_01_21T04_45_19.432784
path:
- '**/details_harness|winogrande|5_2024-01-21T04-45-19.432784.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-01-21T04-45-19.432784.parquet'
- config_name: results
data_files:
- split: 2024_01_21T04_45_19.432784
path:
- results_2024-01-21T04-45-19.432784.parquet
- split: latest
path:
- results_2024-01-21T04-45-19.432784.parquet
---
# Dataset Card for Evaluation run of 222gate/Blur-4x7b-MOE-v0.1
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [222gate/Blur-4x7b-MOE-v0.1](https://huggingface.co/222gate/Blur-4x7b-MOE-v0.1) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_222gate__Blur-4x7b-MOE-v0.1",
"harness_winogrande_5",
split="train")
```
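Each of the 63 task configurations listed in the YAML header above can be loaded the same way. As a small sketch (assuming the standard `datasets` helpers work for this repository), the available configuration names can also be discovered programmatically instead of being copied from the header:
```python
from datasets import get_dataset_config_names

# List every configuration (one per evaluated task, plus the aggregated "results")
configs = get_dataset_config_names("open-llm-leaderboard/details_222gate__Blur-4x7b-MOE-v0.1")
print(configs)
```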
## Latest results
These are the [latest results from run 2024-01-21T04:45:19.432784](https://huggingface.co/datasets/open-llm-leaderboard/details_222gate__Blur-4x7b-MOE-v0.1/blob/main/results_2024-01-21T04-45-19.432784.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6559352269047928,
"acc_stderr": 0.03199697744913013,
"acc_norm": 0.6556619411144256,
"acc_norm_stderr": 0.03265823717914628,
"mc1": 0.5458996328029376,
"mc1_stderr": 0.017429593091323515,
"mc2": 0.688217063142182,
"mc2_stderr": 0.01518842298057346
},
"harness|arc:challenge|25": {
"acc": 0.7013651877133106,
"acc_stderr": 0.013374078615068749,
"acc_norm": 0.7226962457337884,
"acc_norm_stderr": 0.013082095839059376
},
"harness|hellaswag|10": {
"acc": 0.7159928301135232,
"acc_stderr": 0.004500186424443795,
"acc_norm": 0.8813981278629756,
"acc_norm_stderr": 0.003226586783421294
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.32,
"acc_stderr": 0.046882617226215034,
"acc_norm": 0.32,
"acc_norm_stderr": 0.046882617226215034
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6296296296296297,
"acc_stderr": 0.041716541613545426,
"acc_norm": 0.6296296296296297,
"acc_norm_stderr": 0.041716541613545426
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.7039473684210527,
"acc_stderr": 0.03715062154998904,
"acc_norm": 0.7039473684210527,
"acc_norm_stderr": 0.03715062154998904
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.65,
"acc_stderr": 0.0479372485441102,
"acc_norm": 0.65,
"acc_norm_stderr": 0.0479372485441102
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.720754716981132,
"acc_stderr": 0.027611163402399715,
"acc_norm": 0.720754716981132,
"acc_norm_stderr": 0.027611163402399715
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7638888888888888,
"acc_stderr": 0.03551446610810826,
"acc_norm": 0.7638888888888888,
"acc_norm_stderr": 0.03551446610810826
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.48,
"acc_stderr": 0.050211673156867795,
"acc_norm": 0.48,
"acc_norm_stderr": 0.050211673156867795
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.52,
"acc_stderr": 0.050211673156867795,
"acc_norm": 0.52,
"acc_norm_stderr": 0.050211673156867795
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.32,
"acc_stderr": 0.046882617226215034,
"acc_norm": 0.32,
"acc_norm_stderr": 0.046882617226215034
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6705202312138728,
"acc_stderr": 0.03583901754736412,
"acc_norm": 0.6705202312138728,
"acc_norm_stderr": 0.03583901754736412
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.4019607843137255,
"acc_stderr": 0.04878608714466996,
"acc_norm": 0.4019607843137255,
"acc_norm_stderr": 0.04878608714466996
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.78,
"acc_stderr": 0.04163331998932263,
"acc_norm": 0.78,
"acc_norm_stderr": 0.04163331998932263
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5872340425531914,
"acc_stderr": 0.03218471141400351,
"acc_norm": 0.5872340425531914,
"acc_norm_stderr": 0.03218471141400351
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.47368421052631576,
"acc_stderr": 0.046970851366478626,
"acc_norm": 0.47368421052631576,
"acc_norm_stderr": 0.046970851366478626
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5655172413793104,
"acc_stderr": 0.04130740879555498,
"acc_norm": 0.5655172413793104,
"acc_norm_stderr": 0.04130740879555498
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.42592592592592593,
"acc_stderr": 0.02546714904546955,
"acc_norm": 0.42592592592592593,
"acc_norm_stderr": 0.02546714904546955
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.4523809523809524,
"acc_stderr": 0.044518079590553275,
"acc_norm": 0.4523809523809524,
"acc_norm_stderr": 0.044518079590553275
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.39,
"acc_stderr": 0.04902071300001975,
"acc_norm": 0.39,
"acc_norm_stderr": 0.04902071300001975
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7806451612903226,
"acc_stderr": 0.023540799358723295,
"acc_norm": 0.7806451612903226,
"acc_norm_stderr": 0.023540799358723295
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5221674876847291,
"acc_stderr": 0.03514528562175007,
"acc_norm": 0.5221674876847291,
"acc_norm_stderr": 0.03514528562175007
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.71,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.71,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7818181818181819,
"acc_stderr": 0.03225078108306289,
"acc_norm": 0.7818181818181819,
"acc_norm_stderr": 0.03225078108306289
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7878787878787878,
"acc_stderr": 0.029126522834586815,
"acc_norm": 0.7878787878787878,
"acc_norm_stderr": 0.029126522834586815
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.917098445595855,
"acc_stderr": 0.01989934131572178,
"acc_norm": 0.917098445595855,
"acc_norm_stderr": 0.01989934131572178
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6666666666666666,
"acc_stderr": 0.023901157979402538,
"acc_norm": 0.6666666666666666,
"acc_norm_stderr": 0.023901157979402538
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.34074074074074073,
"acc_stderr": 0.028897748741131154,
"acc_norm": 0.34074074074074073,
"acc_norm_stderr": 0.028897748741131154
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6596638655462185,
"acc_stderr": 0.03077805742293167,
"acc_norm": 0.6596638655462185,
"acc_norm_stderr": 0.03077805742293167
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.3708609271523179,
"acc_stderr": 0.03943966699183629,
"acc_norm": 0.3708609271523179,
"acc_norm_stderr": 0.03943966699183629
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8403669724770643,
"acc_stderr": 0.015703498348461766,
"acc_norm": 0.8403669724770643,
"acc_norm_stderr": 0.015703498348461766
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5370370370370371,
"acc_stderr": 0.03400603625538272,
"acc_norm": 0.5370370370370371,
"acc_norm_stderr": 0.03400603625538272
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8529411764705882,
"acc_stderr": 0.024857478080250454,
"acc_norm": 0.8529411764705882,
"acc_norm_stderr": 0.024857478080250454
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7932489451476793,
"acc_stderr": 0.026361651668389094,
"acc_norm": 0.7932489451476793,
"acc_norm_stderr": 0.026361651668389094
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.695067264573991,
"acc_stderr": 0.030898610882477515,
"acc_norm": 0.695067264573991,
"acc_norm_stderr": 0.030898610882477515
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.8015267175572519,
"acc_stderr": 0.03498149385462472,
"acc_norm": 0.8015267175572519,
"acc_norm_stderr": 0.03498149385462472
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7851239669421488,
"acc_stderr": 0.037494924487096966,
"acc_norm": 0.7851239669421488,
"acc_norm_stderr": 0.037494924487096966
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7870370370370371,
"acc_stderr": 0.0395783547198098,
"acc_norm": 0.7870370370370371,
"acc_norm_stderr": 0.0395783547198098
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7730061349693251,
"acc_stderr": 0.03291099578615769,
"acc_norm": 0.7730061349693251,
"acc_norm_stderr": 0.03291099578615769
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.4375,
"acc_stderr": 0.04708567521880525,
"acc_norm": 0.4375,
"acc_norm_stderr": 0.04708567521880525
},
"harness|hendrycksTest-management|5": {
"acc": 0.7572815533980582,
"acc_stderr": 0.04245022486384495,
"acc_norm": 0.7572815533980582,
"acc_norm_stderr": 0.04245022486384495
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8803418803418803,
"acc_stderr": 0.021262719400406957,
"acc_norm": 0.8803418803418803,
"acc_norm_stderr": 0.021262719400406957
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.71,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.71,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8339719029374202,
"acc_stderr": 0.013306478243066302,
"acc_norm": 0.8339719029374202,
"acc_norm_stderr": 0.013306478243066302
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7485549132947977,
"acc_stderr": 0.02335736578587403,
"acc_norm": 0.7485549132947977,
"acc_norm_stderr": 0.02335736578587403
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.4301675977653631,
"acc_stderr": 0.01655860163604104,
"acc_norm": 0.4301675977653631,
"acc_norm_stderr": 0.01655860163604104
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7254901960784313,
"acc_stderr": 0.025553169991826524,
"acc_norm": 0.7254901960784313,
"acc_norm_stderr": 0.025553169991826524
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7138263665594855,
"acc_stderr": 0.02567025924218893,
"acc_norm": 0.7138263665594855,
"acc_norm_stderr": 0.02567025924218893
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7561728395061729,
"acc_stderr": 0.023891879541959607,
"acc_norm": 0.7561728395061729,
"acc_norm_stderr": 0.023891879541959607
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.5035460992907801,
"acc_stderr": 0.02982674915328092,
"acc_norm": 0.5035460992907801,
"acc_norm_stderr": 0.02982674915328092
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.47131681877444587,
"acc_stderr": 0.012749206007657473,
"acc_norm": 0.47131681877444587,
"acc_norm_stderr": 0.012749206007657473
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6691176470588235,
"acc_stderr": 0.02858270975389845,
"acc_norm": 0.6691176470588235,
"acc_norm_stderr": 0.02858270975389845
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6633986928104575,
"acc_stderr": 0.019117213911495148,
"acc_norm": 0.6633986928104575,
"acc_norm_stderr": 0.019117213911495148
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6909090909090909,
"acc_stderr": 0.044262946482000985,
"acc_norm": 0.6909090909090909,
"acc_norm_stderr": 0.044262946482000985
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7428571428571429,
"acc_stderr": 0.02797982353874455,
"acc_norm": 0.7428571428571429,
"acc_norm_stderr": 0.02797982353874455
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8308457711442786,
"acc_stderr": 0.026508590656233264,
"acc_norm": 0.8308457711442786,
"acc_norm_stderr": 0.026508590656233264
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.86,
"acc_stderr": 0.0348735088019777,
"acc_norm": 0.86,
"acc_norm_stderr": 0.0348735088019777
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5542168674698795,
"acc_stderr": 0.03869543323472101,
"acc_norm": 0.5542168674698795,
"acc_norm_stderr": 0.03869543323472101
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8421052631578947,
"acc_stderr": 0.027966785859160893,
"acc_norm": 0.8421052631578947,
"acc_norm_stderr": 0.027966785859160893
},
"harness|truthfulqa:mc|0": {
"mc1": 0.5458996328029376,
"mc1_stderr": 0.017429593091323515,
"mc2": 0.688217063142182,
"mc2_stderr": 0.01518842298057346
},
"harness|winogrande|5": {
"acc": 0.8255722178374112,
"acc_stderr": 0.010665187902498435
},
"harness|gsm8k|5": {
"acc": 0.689158453373768,
"acc_stderr": 0.012748860507777725
}
}
```
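The aggregated metrics shown above are also stored in the `results` configuration declared in this card's YAML header. As a minimal sketch (assuming the standard `datasets` loading path works for this repository), they can be pulled programmatically rather than copied from the JSON dump:
```python
from datasets import load_dataset

# Load the aggregated results (the "results" config declared in the YAML header);
# the "latest" split always points to the most recent evaluation run.
results = load_dataset(
    "open-llm-leaderboard/details_222gate__Blur-4x7b-MOE-v0.1",
    "results",
    split="latest",
)
print(results[0])  # one row holding the aggregated metrics for the run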
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
CyberHarem/istel_karin_renaiflops | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of Istel Karin
This is the dataset of Istel Karin, containing 193 images and their tags.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan, ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
| Name | Images | Download | Description |
|:----------------|---------:|:----------------------------------------|:-----------------------------------------------------------------------------------------|
| raw | 193 | [Download](dataset-raw.zip) | Raw data with meta information. |
| raw-stage3 | 437 | [Download](dataset-raw-stage3.zip) | 3-stage cropped raw data with meta information. |
| raw-stage3-eyes | 495 | [Download](dataset-raw-stage3-eyes.zip) | 3-stage cropped (with eye-focus) raw data with meta information. |
| 384x512 | 193 | [Download](dataset-384x512.zip) | 384x512 aligned dataset. |
| 512x704 | 193 | [Download](dataset-512x704.zip) | 512x704 aligned dataset. |
| 640x880 | 193 | [Download](dataset-640x880.zip) | 640x880 aligned dataset. |
| stage3-640 | 437 | [Download](dataset-stage3-640.zip) | 3-stage cropped dataset with the shorter side not exceeding 640 pixels. |
| stage3-800 | 437 | [Download](dataset-stage3-800.zip) | 3-stage cropped dataset with the shorter side not exceeding 800 pixels. |
| stage3-p512-640 | 358 | [Download](dataset-stage3-p512-640.zip) | 3-stage cropped dataset with the area not less than 512x512 pixels. |
| stage3-eyes-640 | 495 | [Download](dataset-stage3-eyes-640.zip) | 3-stage cropped (with eye-focus) dataset with the shorter side not exceeding 640 pixels. |
| stage3-eyes-800 | 495 | [Download](dataset-stage3-eyes-800.zip) | 3-stage cropped (with eye-focus) dataset with the shorter side not exceeding 800 pixels. |
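Below is a minimal sketch (not part of the original card) of fetching one of the packages listed above with `huggingface_hub`; the chosen file name and extraction directory are illustrative:
```python
import zipfile

from huggingface_hub import hf_hub_download

# Download one package from the table above (raw data with meta information)
zip_path = hf_hub_download(
    repo_id="CyberHarem/istel_karin_renaiflops",
    filename="dataset-raw.zip",
    repo_type="dataset",
)

# Unpack it into a local folder of your choice
with zipfile.ZipFile(zip_path) as archive:
    archive.extractall("istel_karin_renaiflops_raw")
```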
|
open-llm-leaderboard/details_MBZUAI__lamini-cerebras-256m | ---
pretty_name: Evaluation run of MBZUAI/lamini-cerebras-256m
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [MBZUAI/lamini-cerebras-256m](https://huggingface.co/MBZUAI/lamini-cerebras-256m)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 64 configuration, each one coresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the agregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_MBZUAI__lamini-cerebras-256m\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-10-18T21:23:58.159302](https://huggingface.co/datasets/open-llm-leaderboard/details_MBZUAI__lamini-cerebras-256m/blob/main/results_2023-10-18T21-23-58.159302.json)(note\
\ that their might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.004614093959731544,\n\
\ \"em_stderr\": 0.0006940305886353382,\n \"f1\": 0.0485601929530202,\n\
\ \"f1_stderr\": 0.001416776057030896,\n \"acc\": 0.2600631412786109,\n\
\ \"acc_stderr\": 0.007020548332172165\n },\n \"harness|drop|3\": {\n\
\ \"em\": 0.004614093959731544,\n \"em_stderr\": 0.0006940305886353382,\n\
\ \"f1\": 0.0485601929530202,\n \"f1_stderr\": 0.001416776057030896\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.0,\n \"acc_stderr\"\
: 0.0\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.5201262825572218,\n\
\ \"acc_stderr\": 0.01404109666434433\n }\n}\n```"
repo_url: https://huggingface.co/MBZUAI/lamini-cerebras-256m
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_07_19T14_03_54.782051
path:
- '**/details_harness|arc:challenge|25_2023-07-19T14:03:54.782051.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-07-19T14:03:54.782051.parquet'
- config_name: harness_drop_3
data_files:
- split: 2023_10_18T21_23_58.159302
path:
- '**/details_harness|drop|3_2023-10-18T21-23-58.159302.parquet'
- split: latest
path:
- '**/details_harness|drop|3_2023-10-18T21-23-58.159302.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_10_18T21_23_58.159302
path:
- '**/details_harness|gsm8k|5_2023-10-18T21-23-58.159302.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-10-18T21-23-58.159302.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_07_19T14_03_54.782051
path:
- '**/details_harness|hellaswag|10_2023-07-19T14:03:54.782051.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-07-19T14:03:54.782051.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_07_19T14_03_54.782051
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T14:03:54.782051.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-19T14:03:54.782051.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-19T14:03:54.782051.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T14:03:54.782051.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T14:03:54.782051.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-19T14:03:54.782051.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T14:03:54.782051.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T14:03:54.782051.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T14:03:54.782051.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T14:03:54.782051.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-19T14:03:54.782051.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-19T14:03:54.782051.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T14:03:54.782051.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-19T14:03:54.782051.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T14:03:54.782051.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T14:03:54.782051.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T14:03:54.782051.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-19T14:03:54.782051.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T14:03:54.782051.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T14:03:54.782051.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T14:03:54.782051.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T14:03:54.782051.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T14:03:54.782051.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T14:03:54.782051.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T14:03:54.782051.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T14:03:54.782051.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T14:03:54.782051.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T14:03:54.782051.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T14:03:54.782051.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T14:03:54.782051.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T14:03:54.782051.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T14:03:54.782051.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-19T14:03:54.782051.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T14:03:54.782051.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-07-19T14:03:54.782051.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T14:03:54.782051.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T14:03:54.782051.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T14:03:54.782051.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-07-19T14:03:54.782051.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-07-19T14:03:54.782051.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T14:03:54.782051.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T14:03:54.782051.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T14:03:54.782051.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T14:03:54.782051.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-19T14:03:54.782051.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-19T14:03:54.782051.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-19T14:03:54.782051.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T14:03:54.782051.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-19T14:03:54.782051.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T14:03:54.782051.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T14:03:54.782051.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-19T14:03:54.782051.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-19T14:03:54.782051.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-07-19T14:03:54.782051.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T14:03:54.782051.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-07-19T14:03:54.782051.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-19T14:03:54.782051.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T14:03:54.782051.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-19T14:03:54.782051.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-19T14:03:54.782051.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T14:03:54.782051.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T14:03:54.782051.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-19T14:03:54.782051.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T14:03:54.782051.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T14:03:54.782051.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T14:03:54.782051.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T14:03:54.782051.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-19T14:03:54.782051.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-19T14:03:54.782051.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T14:03:54.782051.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-19T14:03:54.782051.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T14:03:54.782051.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T14:03:54.782051.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T14:03:54.782051.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-19T14:03:54.782051.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T14:03:54.782051.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T14:03:54.782051.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T14:03:54.782051.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T14:03:54.782051.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T14:03:54.782051.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T14:03:54.782051.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T14:03:54.782051.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T14:03:54.782051.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T14:03:54.782051.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T14:03:54.782051.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T14:03:54.782051.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T14:03:54.782051.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T14:03:54.782051.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T14:03:54.782051.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-19T14:03:54.782051.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T14:03:54.782051.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-07-19T14:03:54.782051.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T14:03:54.782051.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T14:03:54.782051.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T14:03:54.782051.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-07-19T14:03:54.782051.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-07-19T14:03:54.782051.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T14:03:54.782051.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T14:03:54.782051.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T14:03:54.782051.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T14:03:54.782051.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-19T14:03:54.782051.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-19T14:03:54.782051.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-19T14:03:54.782051.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T14:03:54.782051.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-19T14:03:54.782051.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T14:03:54.782051.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T14:03:54.782051.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-19T14:03:54.782051.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-19T14:03:54.782051.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-07-19T14:03:54.782051.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T14:03:54.782051.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-07-19T14:03:54.782051.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-19T14:03:54.782051.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_07_19T14_03_54.782051
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T14:03:54.782051.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T14:03:54.782051.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_07_19T14_03_54.782051
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-19T14:03:54.782051.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-19T14:03:54.782051.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_07_19T14_03_54.782051
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-19T14:03:54.782051.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-19T14:03:54.782051.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_07_19T14_03_54.782051
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T14:03:54.782051.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T14:03:54.782051.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_07_19T14_03_54.782051
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T14:03:54.782051.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T14:03:54.782051.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_07_19T14_03_54.782051
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-19T14:03:54.782051.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-19T14:03:54.782051.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_07_19T14_03_54.782051
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T14:03:54.782051.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T14:03:54.782051.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_07_19T14_03_54.782051
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T14:03:54.782051.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T14:03:54.782051.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_07_19T14_03_54.782051
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T14:03:54.782051.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T14:03:54.782051.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_07_19T14_03_54.782051
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T14:03:54.782051.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T14:03:54.782051.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_07_19T14_03_54.782051
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-19T14:03:54.782051.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-19T14:03:54.782051.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_07_19T14_03_54.782051
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-19T14:03:54.782051.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-19T14:03:54.782051.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_07_19T14_03_54.782051
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T14:03:54.782051.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T14:03:54.782051.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_07_19T14_03_54.782051
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-19T14:03:54.782051.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-19T14:03:54.782051.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_07_19T14_03_54.782051
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T14:03:54.782051.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T14:03:54.782051.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_07_19T14_03_54.782051
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T14:03:54.782051.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T14:03:54.782051.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_07_19T14_03_54.782051
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T14:03:54.782051.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T14:03:54.782051.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_07_19T14_03_54.782051
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-19T14:03:54.782051.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-19T14:03:54.782051.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_07_19T14_03_54.782051
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T14:03:54.782051.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T14:03:54.782051.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_07_19T14_03_54.782051
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T14:03:54.782051.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T14:03:54.782051.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_07_19T14_03_54.782051
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T14:03:54.782051.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T14:03:54.782051.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_07_19T14_03_54.782051
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T14:03:54.782051.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T14:03:54.782051.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_07_19T14_03_54.782051
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T14:03:54.782051.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T14:03:54.782051.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_07_19T14_03_54.782051
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T14:03:54.782051.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T14:03:54.782051.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_07_19T14_03_54.782051
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T14:03:54.782051.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T14:03:54.782051.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_07_19T14_03_54.782051
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T14:03:54.782051.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T14:03:54.782051.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_07_19T14_03_54.782051
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T14:03:54.782051.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T14:03:54.782051.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_07_19T14_03_54.782051
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T14:03:54.782051.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T14:03:54.782051.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_07_19T14_03_54.782051
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T14:03:54.782051.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T14:03:54.782051.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_07_19T14_03_54.782051
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T14:03:54.782051.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T14:03:54.782051.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_07_19T14_03_54.782051
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T14:03:54.782051.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T14:03:54.782051.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_07_19T14_03_54.782051
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T14:03:54.782051.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T14:03:54.782051.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_07_19T14_03_54.782051
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-19T14:03:54.782051.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-19T14:03:54.782051.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_07_19T14_03_54.782051
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T14:03:54.782051.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T14:03:54.782051.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_07_19T14_03_54.782051
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-07-19T14:03:54.782051.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-07-19T14:03:54.782051.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_07_19T14_03_54.782051
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T14:03:54.782051.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T14:03:54.782051.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_07_19T14_03_54.782051
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T14:03:54.782051.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T14:03:54.782051.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_07_19T14_03_54.782051
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T14:03:54.782051.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T14:03:54.782051.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_07_19T14_03_54.782051
path:
- '**/details_harness|hendrycksTest-management|5_2023-07-19T14:03:54.782051.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-07-19T14:03:54.782051.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_07_19T14_03_54.782051
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-07-19T14:03:54.782051.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-07-19T14:03:54.782051.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_07_19T14_03_54.782051
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T14:03:54.782051.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T14:03:54.782051.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_07_19T14_03_54.782051
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T14:03:54.782051.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T14:03:54.782051.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_07_19T14_03_54.782051
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T14:03:54.782051.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T14:03:54.782051.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_07_19T14_03_54.782051
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T14:03:54.782051.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T14:03:54.782051.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_07_19T14_03_54.782051
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-19T14:03:54.782051.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-19T14:03:54.782051.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_07_19T14_03_54.782051
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-19T14:03:54.782051.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-19T14:03:54.782051.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_07_19T14_03_54.782051
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-19T14:03:54.782051.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-19T14:03:54.782051.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_07_19T14_03_54.782051
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T14:03:54.782051.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T14:03:54.782051.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_07_19T14_03_54.782051
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-19T14:03:54.782051.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-19T14:03:54.782051.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_07_19T14_03_54.782051
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T14:03:54.782051.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T14:03:54.782051.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_07_19T14_03_54.782051
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T14:03:54.782051.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T14:03:54.782051.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_07_19T14_03_54.782051
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-19T14:03:54.782051.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-19T14:03:54.782051.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_07_19T14_03_54.782051
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-19T14:03:54.782051.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-19T14:03:54.782051.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_07_19T14_03_54.782051
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-07-19T14:03:54.782051.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-07-19T14:03:54.782051.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_07_19T14_03_54.782051
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T14:03:54.782051.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T14:03:54.782051.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_07_19T14_03_54.782051
path:
- '**/details_harness|hendrycksTest-virology|5_2023-07-19T14:03:54.782051.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-07-19T14:03:54.782051.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_07_19T14_03_54.782051
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-19T14:03:54.782051.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-19T14:03:54.782051.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_07_19T14_03_54.782051
path:
- '**/details_harness|truthfulqa:mc|0_2023-07-19T14:03:54.782051.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-07-19T14:03:54.782051.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_10_18T21_23_58.159302
path:
- '**/details_harness|winogrande|5_2023-10-18T21-23-58.159302.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-10-18T21-23-58.159302.parquet'
- config_name: results
data_files:
- split: 2023_07_19T14_03_54.782051
path:
- results_2023-07-19T14:03:54.782051.parquet
- split: 2023_10_18T21_23_58.159302
path:
- results_2023-10-18T21-23-58.159302.parquet
- split: latest
path:
- results_2023-10-18T21-23-58.159302.parquet
---
# Dataset Card for Evaluation run of MBZUAI/lamini-cerebras-256m
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/MBZUAI/lamini-cerebras-256m
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [MBZUAI/lamini-cerebras-256m](https://huggingface.co/MBZUAI/lamini-cerebras-256m) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_MBZUAI__lamini-cerebras-256m",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2023-10-18T21:23:58.159302](https://huggingface.co/datasets/open-llm-leaderboard/details_MBZUAI__lamini-cerebras-256m/blob/main/results_2023-10-18T21-23-58.159302.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"em": 0.004614093959731544,
"em_stderr": 0.0006940305886353382,
"f1": 0.0485601929530202,
"f1_stderr": 0.001416776057030896,
"acc": 0.2600631412786109,
"acc_stderr": 0.007020548332172165
},
"harness|drop|3": {
"em": 0.004614093959731544,
"em_stderr": 0.0006940305886353382,
"f1": 0.0485601929530202,
"f1_stderr": 0.001416776057030896
},
"harness|gsm8k|5": {
"acc": 0.0,
"acc_stderr": 0.0
},
"harness|winogrande|5": {
"acc": 0.5201262825572218,
"acc_stderr": 0.01404109666434433
}
}
```
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
Syoy/preprocessed_birdclef_2023_train | ---
dataset_info:
features:
- name: primary_label
dtype:
class_label:
names:
'0': yetgre1
'1': moccha1
'2': rostur1
'3': walsta1
'4': ratcis1
'5': norfis1
'6': macshr1
'7': brrwhe3
'8': crefra2
'9': pabspa1
'10': sltnig1
'11': cabgre1
'12': equaka1
'13': sobfly1
'14': rindov
'15': wlwwar
'16': brwwar1
'17': gnbcam2
'18': carcha1
'19': abethr1
'20': yertin1
'21': spewea1
'22': varsun2
'23': yebduc1
'24': eubeat1
'25': hadibi1
'26': brcale1
'27': litwea1
'28': sincis1
'29': whbcro2
'30': thrnig1
'31': bubwar2
'32': kvbsun1
'33': blbpuf2
'34': blakit1
'35': colsun2
'36': bltapa1
'37': gycwar3
'38': joygre1
'39': greegr
'40': vibsta2
'41': wtbeat1
'42': afrgos1
'43': rebfir2
'44': yebgre1
'45': comsan
'46': pygbat1
'47': meypar1
'48': yelbis1
'49': norbro1
'50': ndcsun2
'51': gybfis1
'52': reftin1
'53': brobab1
'54': refwar2
'55': norcro1
'56': yebapa1
'57': yewgre1
'58': palfly2
'59': gargan
'60': darter3
'61': rerswa1
'62': augbuz1
'63': gyhbus1
'64': refcro1
'65': witswa1
'66': gryapa1
'67': pitwhy
'68': eaywag1
'69': blhgon1
'70': yebsto1
'71': hipbab1
'72': whcpri2
'73': spemou2
'74': gobsta5
'75': blksaw1
'76': afecuc1
'77': spepig1
'78': mabeat1
'79': rewsta1
'80': rebhor1
'81': brtcha1
'82': blacuc1
'83': brican1
'84': rehblu1
'85': gobbun1
'86': supsta1
'87': bkfruw1
'88': litswi1
'89': spmthr1
'90': spwlap1
'91': quailf1
'92': golher1
'93': didcuc1
'94': gytbar1
'95': klacuc1
'96': afbfly1
'97': brcsta1
'98': bawhor2
'99': whihel1
'100': yespet1
'101': dotbar1
'102': luebus1
'103': yeccan1
'104': tafpri1
'105': chespa1
'106': blacra1
'107': scthon1
'108': whbcou1
'109': ccbeat1
'110': libeat1
'111': whctur2
'112': butapa1
'113': norpuf1
'114': blwlap1
'115': afmdov1
'116': hartur1
'117': beasun2
'118': vimwea1
'119': squher1
'120': yebbar1
'121': bltori1
'122': sccsun2
'123': piecro1
'124': chibat1
'125': marsto1
'126': afpfly1
'127': bcbeat1
'128': wbswea1
'129': yebere1
'130': rbsrob1
'131': brcwea1
'132': bswdov1
'133': kerspa2
'134': slcbou1
'135': fislov1
'136': cohmar1
'137': lesmaw1
'138': cibwar1
'139': woosan
'140': shesta1
'141': reccor
'142': gnhsun1
'143': chucis1
'144': fatrav1
'145': slbgre1
'146': afghor1
'147': afrjac1
'148': abhori1
'149': wbgbir1
'150': subbus1
'151': bawman1
'152': whrshr1
'153': hoopoe
'154': lessts1
'155': rocmar2
'156': lotlap1
'157': tamdov1
'158': rufcha2
'159': palpri1
'160': reboxp1
'161': chewea1
'162': malkin1
'163': vilwea1
'164': reccuc1
'165': bltbar1
'166': trobou1
'167': abythr1
'168': broman1
'169': easmog1
'170': spfbar1
'171': afpwag1
'172': refbar2
'173': strher
'174': whhsaw1
'175': grbcam1
'176': sichor1
'177': crheag1
'178': wookin1
'179': helgui
'180': strsee1
'181': chtapa3
'182': grccra1
'183': brubru1
'184': wbrcha2
'185': bkctch1
'186': yesbar1
'187': scrcha1
'188': affeag1
'189': grwpyt1
'190': whbtit5
'191': spfwea1
'192': brosun1
'193': combuz1
'194': tacsun1
'195': darbar1
'196': grewoo2
'197': purgre2
'198': grecor
'199': whbcan1
'200': afrgrp1
'201': mouwag1
'202': bagwea1
'203': eswdov1
'204': blfbus1
'205': soucit1
'206': blnmou1
'207': gbesta1
'208': whbwhe3
'209': somgre1
'210': afrthr1
'211': carwoo1
'212': yenspu1
'213': gobwea1
'214': wfbeat1
'215': blnwea1
'216': soufis1
'217': hunsun2
'218': nobfly1
'219': gyhkin1
'220': nubwoo1
'221': afpkin1
'222': marsun2
'223': gabgos2
'224': yefcan
'225': btweye2
'226': huncis1
'227': raybar1
'228': dutdov1
'229': gyhneg1
'230': stusta1
'231': wheslf1
'232': somtit4
'233': mcptit1
'234': whbwea1
'235': lawgol
'236': combul2
'237': gyhspa1
'238': ruegls1
'239': fotdro5
'240': afdfly1
'241': sacibi2
'242': hamerk1
'243': piekin1
'244': afgfly1
'245': reisee2
'246': amesun2
'247': laudov1
'248': grywrw1
'249': blhher1
'250': loceag1
'251': crohor1
'252': lotcor1
'253': brctch1
'254': barswa
'255': categr
'256': reedov1
'257': blaplo1
'258': litegr
'259': egygoo
'260': rehwea1
'261': fatwid1
'262': blcapa2
'263': edcsun3
- name: secondary_labels
dtype: string
- name: input_values
sequence:
sequence: float32
splits:
- name: train
num_bytes: 8951693048
num_examples: 16941
download_size: 8380125082
dataset_size: 8951693048
---
# Dataset Card for "preprocessed_birdclef_2023_train"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Deojoandco/ah_openai_dialog_annotation_test | ---
dataset_info:
features:
- name: url
dtype: string
- name: id
dtype: string
- name: num_comments
dtype: int64
- name: name
dtype: string
- name: title
dtype: string
- name: body
dtype: string
- name: score
dtype: int64
- name: upvote_ratio
dtype: float64
- name: distinguished
dtype: string
- name: over_18
dtype: bool
- name: created_utc
dtype: int64
- name: comments
list:
- name: body
dtype: string
- name: created_utc
dtype: float64
- name: distinguished
dtype: string
- name: id
dtype: string
- name: permalink
dtype: string
- name: score
dtype: int64
- name: best_num_comments
dtype: int64
- name: query
dtype: string
- name: dialog
dtype: string
- name: annotation_error
dtype: bool
- name: annotation
struct:
- name: Error
dtype: string
- name: success
dtype: bool
- name: text
dtype: string
- name: __index_level_0__
dtype: int64
splits:
- name: train
num_bytes: 3380584
num_examples: 292
download_size: 2027925
dataset_size: 3380584
---
# Dataset Card for "ah_openai_dialog_annotation_test"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
sourcegraph/code-completion-finetune-rb-rs-m-go-800K-v1 | ---
dataset_info:
features:
- name: prompt
dtype: string
- name: full-code-of-function
dtype: string
- name: __index_level_0__
dtype: int64
splits:
- name: train
num_bytes: 874986467
num_examples: 569987
- name: validation
num_bytes: 187722652
num_examples: 122140
- name: test
num_bytes: 187578982
num_examples: 122141
download_size: 518749768
dataset_size: 1250288101
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
- split: test
path: data/test-*
---
|
michel79/Perfex | ---
license: mit
---
|
princeton-nlp/SWE-bench_bm25_50k_llama | ---
dataset_info:
features:
- name: base_commit
dtype: string
- name: hints_text
dtype: string
- name: created_at
dtype: string
- name: test_patch
dtype: string
- name: repo
dtype: string
- name: problem_statement
dtype: string
- name: version
dtype: string
- name: instance_id
dtype: string
- name: FAIL_TO_PASS
dtype: string
- name: PASS_TO_PASS
dtype: string
- name: environment_setup_commit
dtype: string
- name: text
dtype: string
- name: input_ids
sequence: int32
- name: labels
sequence: int64
- name: patch
dtype: string
splits:
- name: test
num_bytes: 829959640
num_examples: 2294
download_size: 319570091
dataset_size: 829959640
configs:
- config_name: default
data_files:
- split: test
path: data/test-*
---
### Dataset Summary
SWE-bench is a dataset that tests systems’ ability to solve GitHub issues automatically. The dataset collects 2,294 Issue-Pull Request pairs from 12 popular Python repositories. Evaluation is performed by unit test verification using post-PR behavior as the reference solution.
### Supported Tasks and Leaderboards
SWE-bench proposes a new task: issue resolution, given a full repository and a GitHub issue. The leaderboard can be found at www.swebench.com.
### Languages
The text of the dataset is primarily English, but we make no effort to filter or otherwise clean based on language type.
## Dataset Structure
### Data Instances
An example of a SWE-bench datum is as follows:
```
instance_id: (str) - A formatted instance identifier, usually as repo_owner__repo_name-PR-number.
patch: (str) - The gold patch, the patch generated by the PR (minus test-related code), that resolved the issue.
repo: (str) - The repository owner/name identifier from GitHub.
base_commit: (str) - The commit hash of the repository representing the HEAD of the repository before the solution PR is applied.
hints_text: (str) - Comments made on the issue prior to the creation date of the solution PR’s first commit.
created_at: (str) - The creation date of the pull request.
test_patch: (str) - A test-file patch that was contributed by the solution PR.
problem_statement: (str) - The issue title and body.
version: (str) - Installation version to use for running evaluation.
environment_setup_commit: (str) - commit hash to use for environment setup and installation.
FAIL_TO_PASS: (str) - A json list of strings that represent the set of tests resolved by the PR and tied to the issue resolution.
PASS_TO_PASS: (str) - A json list of strings that represent tests that should pass before and after the PR application.
text: (str) - The generated text according to the retrieval criterion and the style-2 prompt found in [github:SWE-bench](https://github.com/princeton-nlp/SWE-bench).
input_ids: (List[int]) - The llama tokens for each text.
```
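As a quick reference, here is a minimal sketch for loading this retrieval-augmented variant with the `datasets` library (assuming the default config and the `test` split declared in the YAML above):
```python
from datasets import load_dataset

# Load the BM25-retrieved (50k-token context) SWE-bench variant; only a "test" split is published.
data = load_dataset("princeton-nlp/SWE-bench_bm25_50k_llama", split="test")

# Each instance carries the issue, the retrieved file context ("text"),
# and pre-tokenized llama "input_ids" ready for inference.
example = data[0]
print(example["instance_id"], len(example["input_ids"]))
```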
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
yzhuang/autotree_automl_default-of-credit-card-clients_gosdt_l512_d3_sd1 | ---
dataset_info:
features:
- name: id
dtype: int64
- name: input_x
sequence:
sequence: float64
- name: input_y
sequence:
sequence: float32
- name: rtg
sequence: float64
- name: status
sequence:
sequence: float32
- name: split_threshold
sequence:
sequence: float64
- name: split_dimension
sequence: int64
splits:
- name: train
num_bytes: 10863200000
num_examples: 100000
- name: validation
num_bytes: 1086320000
num_examples: 10000
download_size: 2039303021
dataset_size: 11949520000
---
# Dataset Card for "autotree_automl_default-of-credit-card-clients_gosdt_l512_d3_sd1"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
BangumiBase/bocchitherock | ---
license: mit
tags:
- art
size_categories:
- 1K<n<10K
---
# Bangumi Image Base of Bocchi The Rock!
This is the image base of the bangumi Bocchi the Rock!; we detected 23 characters and 2223 images in total. The full dataset is [here](all.zip).
**Please note that these image bases are not guaranteed to be 100% cleaned; they may contain noise.** If you intend to train models manually using this dataset, we recommend performing the necessary preprocessing on the downloaded dataset to eliminate potentially noisy samples (approximately 1% probability).
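For reference, a minimal sketch of fetching and unpacking one character's archive with `huggingface_hub` (the per-character zip filenames, e.g. `0/dataset.zip`, are taken from the preview table below):
```python
import zipfile
from huggingface_hub import hf_hub_download

# Download the archive for character cluster 0 from this dataset repository.
archive_path = hf_hub_download(
    repo_id="BangumiBase/bocchitherock",
    filename="0/dataset.zip",
    repo_type="dataset",
)

# Extract the images locally for inspection or further preprocessing.
with zipfile.ZipFile(archive_path) as zf:
    zf.extractall("bocchitherock_character_0")
```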
Here is the characters' preview:
| # | Images | Download | Preview 1 | Preview 2 | Preview 3 | Preview 4 | Preview 5 | Preview 6 | Preview 7 | Preview 8 |
|:------|---------:|:---------------------------|:-------------------------------|:-------------------------------|:-------------------------------|:-------------------------------|:-------------------------------|:-------------------------------|:-------------------------------|:-------------------------------|
| 0 | 538 | [Download](0/dataset.zip) |  |  |  |  |  |  |  |  |
| 1 | 54 | [Download](1/dataset.zip) |  |  |  |  |  |  |  |  |
| 2 | 35 | [Download](2/dataset.zip) |  |  |  |  |  |  |  |  |
| 3 | 13 | [Download](3/dataset.zip) |  |  |  |  |  |  |  |  |
| 4 | 286 | [Download](4/dataset.zip) |  |  |  |  |  |  |  |  |
| 5 | 108 | [Download](5/dataset.zip) |  |  |  |  |  |  |  |  |
| 6 | 8 | [Download](6/dataset.zip) |  |  |  |  |  |  |  |  |
| 7 | 88 | [Download](7/dataset.zip) |  |  |  |  |  |  |  |  |
| 8 | 14 | [Download](8/dataset.zip) |  |  |  |  |  |  |  |  |
| 9 | 439 | [Download](9/dataset.zip) |  |  |  |  |  |  |  |  |
| 10 | 66 | [Download](10/dataset.zip) |  |  |  |  |  |  |  |  |
| 11 | 8 | [Download](11/dataset.zip) |  |  |  |  |  |  |  |  |
| 12 | 257 | [Download](12/dataset.zip) |  |  |  |  |  |  |  |  |
| 13 | 29 | [Download](13/dataset.zip) |  |  |  |  |  |  |  |  |
| 14 | 6 | [Download](14/dataset.zip) |  |  |  |  |  |  | N/A | N/A |
| 15 | 9 | [Download](15/dataset.zip) |  |  |  |  |  |  |  |  |
| 16 | 14 | [Download](16/dataset.zip) |  |  |  |  |  |  |  |  |
| 17 | 6 | [Download](17/dataset.zip) |  |  |  |  |  |  | N/A | N/A |
| 18 | 14 | [Download](18/dataset.zip) |  |  |  |  |  |  |  |  |
| 19 | 12 | [Download](19/dataset.zip) |  |  |  |  |  |  |  |  |
| 20 | 13 | [Download](20/dataset.zip) |  |  |  |  |  |  |  |  |
| 21 | 9 | [Download](21/dataset.zip) |  |  |  |  |  |  |  |  |
| noise | 197 | [Download](-1/dataset.zip) |  |  |  |  |  |  |  |  |
|
open-llm-leaderboard/details_YKM11__Mistral-7B-adaptv0.9 | ---
pretty_name: Evaluation run of YKM11/Mistral-7B-adaptv0.9
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [YKM11/Mistral-7B-adaptv0.9](https://huggingface.co/YKM11/Mistral-7B-adaptv0.9)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_YKM11__Mistral-7B-adaptv0.9\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-02-02T17:00:46.395617](https://huggingface.co/datasets/open-llm-leaderboard/details_YKM11__Mistral-7B-adaptv0.9/blob/main/results_2024-02-02T17-00-46.395617.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6534693218977564,\n\
\ \"acc_stderr\": 0.03204118105946018,\n \"acc_norm\": 0.652862155312837,\n\
\ \"acc_norm_stderr\": 0.03271329249335617,\n \"mc1\": 0.5887392900856793,\n\
\ \"mc1_stderr\": 0.017225627083660874,\n \"mc2\": 0.7311817247902683,\n\
\ \"mc2_stderr\": 0.014597852035553836\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.7098976109215017,\n \"acc_stderr\": 0.013261573677520767,\n\
\ \"acc_norm\": 0.735494880546075,\n \"acc_norm_stderr\": 0.012889272949313368\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.722266480780721,\n\
\ \"acc_stderr\": 0.004469659042824775,\n \"acc_norm\": 0.8895638319059949,\n\
\ \"acc_norm_stderr\": 0.0031279207383941086\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.33,\n \"acc_stderr\": 0.047258156262526045,\n \
\ \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.047258156262526045\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6666666666666666,\n\
\ \"acc_stderr\": 0.04072314811876837,\n \"acc_norm\": 0.6666666666666666,\n\
\ \"acc_norm_stderr\": 0.04072314811876837\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.7039473684210527,\n \"acc_stderr\": 0.03715062154998904,\n\
\ \"acc_norm\": 0.7039473684210527,\n \"acc_norm_stderr\": 0.03715062154998904\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.63,\n\
\ \"acc_stderr\": 0.04852365870939099,\n \"acc_norm\": 0.63,\n \
\ \"acc_norm_stderr\": 0.04852365870939099\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.7094339622641509,\n \"acc_stderr\": 0.027943219989337142,\n\
\ \"acc_norm\": 0.7094339622641509,\n \"acc_norm_stderr\": 0.027943219989337142\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7638888888888888,\n\
\ \"acc_stderr\": 0.03551446610810826,\n \"acc_norm\": 0.7638888888888888,\n\
\ \"acc_norm_stderr\": 0.03551446610810826\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.5,\n \"acc_stderr\": 0.050251890762960605,\n \
\ \"acc_norm\": 0.5,\n \"acc_norm_stderr\": 0.050251890762960605\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.54,\n \"acc_stderr\": 0.05009082659620333,\n \"acc_norm\": 0.54,\n\
\ \"acc_norm_stderr\": 0.05009082659620333\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.32,\n \"acc_stderr\": 0.046882617226215034,\n \
\ \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.046882617226215034\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6589595375722543,\n\
\ \"acc_stderr\": 0.03614665424180826,\n \"acc_norm\": 0.6589595375722543,\n\
\ \"acc_norm_stderr\": 0.03614665424180826\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.39215686274509803,\n \"acc_stderr\": 0.048580835742663454,\n\
\ \"acc_norm\": 0.39215686274509803,\n \"acc_norm_stderr\": 0.048580835742663454\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.76,\n \"acc_stderr\": 0.04292346959909284,\n \"acc_norm\": 0.76,\n\
\ \"acc_norm_stderr\": 0.04292346959909284\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.5617021276595745,\n \"acc_stderr\": 0.03243618636108102,\n\
\ \"acc_norm\": 0.5617021276595745,\n \"acc_norm_stderr\": 0.03243618636108102\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.4824561403508772,\n\
\ \"acc_stderr\": 0.04700708033551038,\n \"acc_norm\": 0.4824561403508772,\n\
\ \"acc_norm_stderr\": 0.04700708033551038\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5517241379310345,\n \"acc_stderr\": 0.04144311810878152,\n\
\ \"acc_norm\": 0.5517241379310345,\n \"acc_norm_stderr\": 0.04144311810878152\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.41534391534391535,\n \"acc_stderr\": 0.025379524910778398,\n \"\
acc_norm\": 0.41534391534391535,\n \"acc_norm_stderr\": 0.025379524910778398\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.5,\n\
\ \"acc_stderr\": 0.04472135954999579,\n \"acc_norm\": 0.5,\n \
\ \"acc_norm_stderr\": 0.04472135954999579\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.32,\n \"acc_stderr\": 0.04688261722621504,\n \
\ \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.04688261722621504\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7870967741935484,\n\
\ \"acc_stderr\": 0.023287665127268545,\n \"acc_norm\": 0.7870967741935484,\n\
\ \"acc_norm_stderr\": 0.023287665127268545\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.5073891625615764,\n \"acc_stderr\": 0.035176035403610105,\n\
\ \"acc_norm\": 0.5073891625615764,\n \"acc_norm_stderr\": 0.035176035403610105\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.7,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\"\
: 0.7,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7696969696969697,\n \"acc_stderr\": 0.03287666758603491,\n\
\ \"acc_norm\": 0.7696969696969697,\n \"acc_norm_stderr\": 0.03287666758603491\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.8131313131313131,\n \"acc_stderr\": 0.027772533334218967,\n \"\
acc_norm\": 0.8131313131313131,\n \"acc_norm_stderr\": 0.027772533334218967\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.9119170984455959,\n \"acc_stderr\": 0.02045374660160103,\n\
\ \"acc_norm\": 0.9119170984455959,\n \"acc_norm_stderr\": 0.02045374660160103\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.6666666666666666,\n \"acc_stderr\": 0.023901157979402534,\n\
\ \"acc_norm\": 0.6666666666666666,\n \"acc_norm_stderr\": 0.023901157979402534\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.32592592592592595,\n \"acc_stderr\": 0.028578348365473086,\n \
\ \"acc_norm\": 0.32592592592592595,\n \"acc_norm_stderr\": 0.028578348365473086\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.680672268907563,\n \"acc_stderr\": 0.0302839955258844,\n \
\ \"acc_norm\": 0.680672268907563,\n \"acc_norm_stderr\": 0.0302839955258844\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.3576158940397351,\n \"acc_stderr\": 0.03913453431177258,\n \"\
acc_norm\": 0.3576158940397351,\n \"acc_norm_stderr\": 0.03913453431177258\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8403669724770643,\n \"acc_stderr\": 0.015703498348461766,\n \"\
acc_norm\": 0.8403669724770643,\n \"acc_norm_stderr\": 0.015703498348461766\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.5138888888888888,\n \"acc_stderr\": 0.03408655867977749,\n \"\
acc_norm\": 0.5138888888888888,\n \"acc_norm_stderr\": 0.03408655867977749\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.8431372549019608,\n \"acc_stderr\": 0.02552472232455334,\n \"\
acc_norm\": 0.8431372549019608,\n \"acc_norm_stderr\": 0.02552472232455334\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.7932489451476793,\n \"acc_stderr\": 0.0263616516683891,\n \
\ \"acc_norm\": 0.7932489451476793,\n \"acc_norm_stderr\": 0.0263616516683891\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6816143497757847,\n\
\ \"acc_stderr\": 0.03126580522513713,\n \"acc_norm\": 0.6816143497757847,\n\
\ \"acc_norm_stderr\": 0.03126580522513713\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.7938931297709924,\n \"acc_stderr\": 0.03547771004159463,\n\
\ \"acc_norm\": 0.7938931297709924,\n \"acc_norm_stderr\": 0.03547771004159463\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.7768595041322314,\n \"acc_stderr\": 0.03800754475228733,\n \"\
acc_norm\": 0.7768595041322314,\n \"acc_norm_stderr\": 0.03800754475228733\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7870370370370371,\n\
\ \"acc_stderr\": 0.0395783547198098,\n \"acc_norm\": 0.7870370370370371,\n\
\ \"acc_norm_stderr\": 0.0395783547198098\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7730061349693251,\n \"acc_stderr\": 0.03291099578615769,\n\
\ \"acc_norm\": 0.7730061349693251,\n \"acc_norm_stderr\": 0.03291099578615769\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.42857142857142855,\n\
\ \"acc_stderr\": 0.04697113923010212,\n \"acc_norm\": 0.42857142857142855,\n\
\ \"acc_norm_stderr\": 0.04697113923010212\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7766990291262136,\n \"acc_stderr\": 0.04123553189891431,\n\
\ \"acc_norm\": 0.7766990291262136,\n \"acc_norm_stderr\": 0.04123553189891431\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8846153846153846,\n\
\ \"acc_stderr\": 0.02093019318517933,\n \"acc_norm\": 0.8846153846153846,\n\
\ \"acc_norm_stderr\": 0.02093019318517933\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.72,\n \"acc_stderr\": 0.045126085985421276,\n \
\ \"acc_norm\": 0.72,\n \"acc_norm_stderr\": 0.045126085985421276\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8199233716475096,\n\
\ \"acc_stderr\": 0.013740797258579825,\n \"acc_norm\": 0.8199233716475096,\n\
\ \"acc_norm_stderr\": 0.013740797258579825\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.7283236994219653,\n \"acc_stderr\": 0.023948512905468358,\n\
\ \"acc_norm\": 0.7283236994219653,\n \"acc_norm_stderr\": 0.023948512905468358\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.4346368715083799,\n\
\ \"acc_stderr\": 0.016578997435496713,\n \"acc_norm\": 0.4346368715083799,\n\
\ \"acc_norm_stderr\": 0.016578997435496713\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.7254901960784313,\n \"acc_stderr\": 0.025553169991826524,\n\
\ \"acc_norm\": 0.7254901960784313,\n \"acc_norm_stderr\": 0.025553169991826524\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7138263665594855,\n\
\ \"acc_stderr\": 0.025670259242188933,\n \"acc_norm\": 0.7138263665594855,\n\
\ \"acc_norm_stderr\": 0.025670259242188933\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.75,\n \"acc_stderr\": 0.02409347123262133,\n \
\ \"acc_norm\": 0.75,\n \"acc_norm_stderr\": 0.02409347123262133\n \
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\"\
: 0.5,\n \"acc_stderr\": 0.029827499313594685,\n \"acc_norm\": 0.5,\n\
\ \"acc_norm_stderr\": 0.029827499313594685\n },\n \"harness|hendrycksTest-professional_law|5\"\
: {\n \"acc\": 0.470013037809648,\n \"acc_stderr\": 0.01274724896707907,\n\
\ \"acc_norm\": 0.470013037809648,\n \"acc_norm_stderr\": 0.01274724896707907\n\
\ },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\"\
: 0.6727941176470589,\n \"acc_stderr\": 0.028501452860396553,\n \"\
acc_norm\": 0.6727941176470589,\n \"acc_norm_stderr\": 0.028501452860396553\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6715686274509803,\n \"acc_stderr\": 0.018999707383162673,\n \
\ \"acc_norm\": 0.6715686274509803,\n \"acc_norm_stderr\": 0.018999707383162673\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6818181818181818,\n\
\ \"acc_stderr\": 0.044612721759105085,\n \"acc_norm\": 0.6818181818181818,\n\
\ \"acc_norm_stderr\": 0.044612721759105085\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.7306122448979592,\n \"acc_stderr\": 0.02840125202902294,\n\
\ \"acc_norm\": 0.7306122448979592,\n \"acc_norm_stderr\": 0.02840125202902294\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8308457711442786,\n\
\ \"acc_stderr\": 0.026508590656233268,\n \"acc_norm\": 0.8308457711442786,\n\
\ \"acc_norm_stderr\": 0.026508590656233268\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.84,\n \"acc_stderr\": 0.03684529491774709,\n \
\ \"acc_norm\": 0.84,\n \"acc_norm_stderr\": 0.03684529491774709\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5542168674698795,\n\
\ \"acc_stderr\": 0.03869543323472101,\n \"acc_norm\": 0.5542168674698795,\n\
\ \"acc_norm_stderr\": 0.03869543323472101\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8304093567251462,\n \"acc_stderr\": 0.02878210810540171,\n\
\ \"acc_norm\": 0.8304093567251462,\n \"acc_norm_stderr\": 0.02878210810540171\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.5887392900856793,\n\
\ \"mc1_stderr\": 0.017225627083660874,\n \"mc2\": 0.7311817247902683,\n\
\ \"mc2_stderr\": 0.014597852035553836\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.856353591160221,\n \"acc_stderr\": 0.009857280052696737\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.6793025018953753,\n \
\ \"acc_stderr\": 0.012856468433722283\n }\n}\n```"
repo_url: https://huggingface.co/YKM11/Mistral-7B-adaptv0.9
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_02_02T17_00_46.395617
path:
- '**/details_harness|arc:challenge|25_2024-02-02T17-00-46.395617.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-02-02T17-00-46.395617.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_02_02T17_00_46.395617
path:
- '**/details_harness|gsm8k|5_2024-02-02T17-00-46.395617.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-02-02T17-00-46.395617.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_02_02T17_00_46.395617
path:
- '**/details_harness|hellaswag|10_2024-02-02T17-00-46.395617.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-02-02T17-00-46.395617.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_02_02T17_00_46.395617
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-02T17-00-46.395617.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-02T17-00-46.395617.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-02T17-00-46.395617.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-02T17-00-46.395617.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-02T17-00-46.395617.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-02T17-00-46.395617.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-02T17-00-46.395617.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-02T17-00-46.395617.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-02T17-00-46.395617.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-02T17-00-46.395617.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-02T17-00-46.395617.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-02T17-00-46.395617.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-02T17-00-46.395617.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-02T17-00-46.395617.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-02T17-00-46.395617.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-02T17-00-46.395617.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-02T17-00-46.395617.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-02T17-00-46.395617.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-02T17-00-46.395617.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-02T17-00-46.395617.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-02T17-00-46.395617.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-02T17-00-46.395617.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-02T17-00-46.395617.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-02T17-00-46.395617.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-02T17-00-46.395617.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-02T17-00-46.395617.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-02T17-00-46.395617.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-02T17-00-46.395617.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-02T17-00-46.395617.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-02T17-00-46.395617.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-02T17-00-46.395617.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-02T17-00-46.395617.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-02T17-00-46.395617.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-02T17-00-46.395617.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-02-02T17-00-46.395617.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-02T17-00-46.395617.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-02T17-00-46.395617.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-02T17-00-46.395617.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-02-02T17-00-46.395617.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-02-02T17-00-46.395617.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-02T17-00-46.395617.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-02T17-00-46.395617.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-02T17-00-46.395617.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-02T17-00-46.395617.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-02T17-00-46.395617.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-02T17-00-46.395617.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-02T17-00-46.395617.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-02T17-00-46.395617.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-02T17-00-46.395617.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-02T17-00-46.395617.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-02T17-00-46.395617.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-02T17-00-46.395617.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-02T17-00-46.395617.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-02-02T17-00-46.395617.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-02T17-00-46.395617.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-02-02T17-00-46.395617.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-02T17-00-46.395617.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-02T17-00-46.395617.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-02T17-00-46.395617.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-02T17-00-46.395617.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-02T17-00-46.395617.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-02T17-00-46.395617.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-02T17-00-46.395617.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-02T17-00-46.395617.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-02T17-00-46.395617.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-02T17-00-46.395617.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-02T17-00-46.395617.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-02T17-00-46.395617.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-02T17-00-46.395617.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-02T17-00-46.395617.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-02T17-00-46.395617.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-02T17-00-46.395617.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-02T17-00-46.395617.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-02T17-00-46.395617.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-02T17-00-46.395617.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-02T17-00-46.395617.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-02T17-00-46.395617.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-02T17-00-46.395617.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-02T17-00-46.395617.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-02T17-00-46.395617.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-02T17-00-46.395617.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-02T17-00-46.395617.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-02T17-00-46.395617.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-02T17-00-46.395617.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-02T17-00-46.395617.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-02T17-00-46.395617.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-02T17-00-46.395617.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-02T17-00-46.395617.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-02T17-00-46.395617.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-02T17-00-46.395617.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-02T17-00-46.395617.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-02-02T17-00-46.395617.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-02T17-00-46.395617.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-02T17-00-46.395617.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-02T17-00-46.395617.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-02-02T17-00-46.395617.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-02-02T17-00-46.395617.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-02T17-00-46.395617.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-02T17-00-46.395617.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-02T17-00-46.395617.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-02T17-00-46.395617.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-02T17-00-46.395617.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-02T17-00-46.395617.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-02T17-00-46.395617.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-02T17-00-46.395617.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-02T17-00-46.395617.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-02T17-00-46.395617.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-02T17-00-46.395617.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-02T17-00-46.395617.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-02T17-00-46.395617.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-02-02T17-00-46.395617.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-02T17-00-46.395617.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-02-02T17-00-46.395617.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-02T17-00-46.395617.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_02_02T17_00_46.395617
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-02T17-00-46.395617.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-02T17-00-46.395617.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_02_02T17_00_46.395617
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-02T17-00-46.395617.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-02T17-00-46.395617.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_02_02T17_00_46.395617
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-02T17-00-46.395617.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-02T17-00-46.395617.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_02_02T17_00_46.395617
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-02T17-00-46.395617.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-02T17-00-46.395617.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_02_02T17_00_46.395617
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-02T17-00-46.395617.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-02T17-00-46.395617.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_02_02T17_00_46.395617
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-02T17-00-46.395617.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-02T17-00-46.395617.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_02_02T17_00_46.395617
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-02T17-00-46.395617.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-02T17-00-46.395617.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_02_02T17_00_46.395617
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-02T17-00-46.395617.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-02T17-00-46.395617.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_02_02T17_00_46.395617
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-02T17-00-46.395617.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-02T17-00-46.395617.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_02_02T17_00_46.395617
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-02T17-00-46.395617.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-02T17-00-46.395617.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_02_02T17_00_46.395617
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-02T17-00-46.395617.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-02T17-00-46.395617.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_02_02T17_00_46.395617
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-02T17-00-46.395617.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-02T17-00-46.395617.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_02_02T17_00_46.395617
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-02T17-00-46.395617.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-02T17-00-46.395617.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_02_02T17_00_46.395617
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-02T17-00-46.395617.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-02T17-00-46.395617.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_02_02T17_00_46.395617
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-02T17-00-46.395617.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-02T17-00-46.395617.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_02_02T17_00_46.395617
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-02T17-00-46.395617.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-02T17-00-46.395617.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_02_02T17_00_46.395617
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-02T17-00-46.395617.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-02T17-00-46.395617.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_02_02T17_00_46.395617
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-02T17-00-46.395617.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-02T17-00-46.395617.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_02_02T17_00_46.395617
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-02T17-00-46.395617.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-02T17-00-46.395617.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_02_02T17_00_46.395617
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-02T17-00-46.395617.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-02T17-00-46.395617.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_02_02T17_00_46.395617
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-02T17-00-46.395617.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-02T17-00-46.395617.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_02_02T17_00_46.395617
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-02T17-00-46.395617.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-02T17-00-46.395617.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_02_02T17_00_46.395617
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-02T17-00-46.395617.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-02T17-00-46.395617.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_02_02T17_00_46.395617
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-02T17-00-46.395617.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-02T17-00-46.395617.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_02_02T17_00_46.395617
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-02T17-00-46.395617.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-02T17-00-46.395617.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_02_02T17_00_46.395617
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-02T17-00-46.395617.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-02T17-00-46.395617.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_02_02T17_00_46.395617
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-02T17-00-46.395617.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-02T17-00-46.395617.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_02_02T17_00_46.395617
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-02T17-00-46.395617.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-02T17-00-46.395617.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_02_02T17_00_46.395617
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-02T17-00-46.395617.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-02T17-00-46.395617.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_02_02T17_00_46.395617
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-02T17-00-46.395617.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-02T17-00-46.395617.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_02_02T17_00_46.395617
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-02T17-00-46.395617.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-02T17-00-46.395617.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_02_02T17_00_46.395617
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-02T17-00-46.395617.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-02T17-00-46.395617.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_02_02T17_00_46.395617
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-02T17-00-46.395617.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-02T17-00-46.395617.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_02_02T17_00_46.395617
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-02T17-00-46.395617.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-02T17-00-46.395617.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_02_02T17_00_46.395617
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-02-02T17-00-46.395617.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-02-02T17-00-46.395617.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_02_02T17_00_46.395617
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-02T17-00-46.395617.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-02T17-00-46.395617.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_02_02T17_00_46.395617
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-02T17-00-46.395617.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-02T17-00-46.395617.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_02_02T17_00_46.395617
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-02T17-00-46.395617.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-02T17-00-46.395617.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_02_02T17_00_46.395617
path:
- '**/details_harness|hendrycksTest-management|5_2024-02-02T17-00-46.395617.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-02-02T17-00-46.395617.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_02_02T17_00_46.395617
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-02-02T17-00-46.395617.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-02-02T17-00-46.395617.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_02_02T17_00_46.395617
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-02T17-00-46.395617.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-02T17-00-46.395617.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_02_02T17_00_46.395617
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-02T17-00-46.395617.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-02T17-00-46.395617.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_02_02T17_00_46.395617
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-02T17-00-46.395617.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-02T17-00-46.395617.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_02_02T17_00_46.395617
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-02T17-00-46.395617.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-02T17-00-46.395617.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_02_02T17_00_46.395617
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-02T17-00-46.395617.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-02T17-00-46.395617.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_02_02T17_00_46.395617
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-02T17-00-46.395617.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-02T17-00-46.395617.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_02_02T17_00_46.395617
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-02T17-00-46.395617.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-02T17-00-46.395617.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_02_02T17_00_46.395617
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-02T17-00-46.395617.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-02T17-00-46.395617.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_02_02T17_00_46.395617
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-02T17-00-46.395617.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-02T17-00-46.395617.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_02_02T17_00_46.395617
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-02T17-00-46.395617.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-02T17-00-46.395617.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_02_02T17_00_46.395617
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-02T17-00-46.395617.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-02T17-00-46.395617.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_02_02T17_00_46.395617
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-02T17-00-46.395617.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-02T17-00-46.395617.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_02_02T17_00_46.395617
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-02T17-00-46.395617.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-02T17-00-46.395617.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_02_02T17_00_46.395617
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-02-02T17-00-46.395617.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-02-02T17-00-46.395617.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_02_02T17_00_46.395617
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-02T17-00-46.395617.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-02T17-00-46.395617.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_02_02T17_00_46.395617
path:
- '**/details_harness|hendrycksTest-virology|5_2024-02-02T17-00-46.395617.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-02-02T17-00-46.395617.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_02_02T17_00_46.395617
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-02T17-00-46.395617.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-02T17-00-46.395617.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_02_02T17_00_46.395617
path:
- '**/details_harness|truthfulqa:mc|0_2024-02-02T17-00-46.395617.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-02-02T17-00-46.395617.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_02_02T17_00_46.395617
path:
- '**/details_harness|winogrande|5_2024-02-02T17-00-46.395617.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-02-02T17-00-46.395617.parquet'
- config_name: results
data_files:
- split: 2024_02_02T17_00_46.395617
path:
- results_2024-02-02T17-00-46.395617.parquet
- split: latest
path:
- results_2024-02-02T17-00-46.395617.parquet
---
# Dataset Card for Evaluation run of YKM11/Mistral-7B-adaptv0.9
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [YKM11/Mistral-7B-adaptv0.9](https://huggingface.co/YKM11/Mistral-7B-adaptv0.9) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_YKM11__Mistral-7B-adaptv0.9",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2024-02-02T17:00:46.395617](https://huggingface.co/datasets/open-llm-leaderboard/details_YKM11__Mistral-7B-adaptv0.9/blob/main/results_2024-02-02T17-00-46.395617.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6534693218977564,
"acc_stderr": 0.03204118105946018,
"acc_norm": 0.652862155312837,
"acc_norm_stderr": 0.03271329249335617,
"mc1": 0.5887392900856793,
"mc1_stderr": 0.017225627083660874,
"mc2": 0.7311817247902683,
"mc2_stderr": 0.014597852035553836
},
"harness|arc:challenge|25": {
"acc": 0.7098976109215017,
"acc_stderr": 0.013261573677520767,
"acc_norm": 0.735494880546075,
"acc_norm_stderr": 0.012889272949313368
},
"harness|hellaswag|10": {
"acc": 0.722266480780721,
"acc_stderr": 0.004469659042824775,
"acc_norm": 0.8895638319059949,
"acc_norm_stderr": 0.0031279207383941086
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.33,
"acc_stderr": 0.047258156262526045,
"acc_norm": 0.33,
"acc_norm_stderr": 0.047258156262526045
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6666666666666666,
"acc_stderr": 0.04072314811876837,
"acc_norm": 0.6666666666666666,
"acc_norm_stderr": 0.04072314811876837
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.7039473684210527,
"acc_stderr": 0.03715062154998904,
"acc_norm": 0.7039473684210527,
"acc_norm_stderr": 0.03715062154998904
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.63,
"acc_stderr": 0.04852365870939099,
"acc_norm": 0.63,
"acc_norm_stderr": 0.04852365870939099
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.7094339622641509,
"acc_stderr": 0.027943219989337142,
"acc_norm": 0.7094339622641509,
"acc_norm_stderr": 0.027943219989337142
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7638888888888888,
"acc_stderr": 0.03551446610810826,
"acc_norm": 0.7638888888888888,
"acc_norm_stderr": 0.03551446610810826
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.5,
"acc_stderr": 0.050251890762960605,
"acc_norm": 0.5,
"acc_norm_stderr": 0.050251890762960605
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.54,
"acc_stderr": 0.05009082659620333,
"acc_norm": 0.54,
"acc_norm_stderr": 0.05009082659620333
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.32,
"acc_stderr": 0.046882617226215034,
"acc_norm": 0.32,
"acc_norm_stderr": 0.046882617226215034
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6589595375722543,
"acc_stderr": 0.03614665424180826,
"acc_norm": 0.6589595375722543,
"acc_norm_stderr": 0.03614665424180826
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.39215686274509803,
"acc_stderr": 0.048580835742663454,
"acc_norm": 0.39215686274509803,
"acc_norm_stderr": 0.048580835742663454
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.76,
"acc_stderr": 0.04292346959909284,
"acc_norm": 0.76,
"acc_norm_stderr": 0.04292346959909284
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5617021276595745,
"acc_stderr": 0.03243618636108102,
"acc_norm": 0.5617021276595745,
"acc_norm_stderr": 0.03243618636108102
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.4824561403508772,
"acc_stderr": 0.04700708033551038,
"acc_norm": 0.4824561403508772,
"acc_norm_stderr": 0.04700708033551038
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5517241379310345,
"acc_stderr": 0.04144311810878152,
"acc_norm": 0.5517241379310345,
"acc_norm_stderr": 0.04144311810878152
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.41534391534391535,
"acc_stderr": 0.025379524910778398,
"acc_norm": 0.41534391534391535,
"acc_norm_stderr": 0.025379524910778398
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.5,
"acc_stderr": 0.04472135954999579,
"acc_norm": 0.5,
"acc_norm_stderr": 0.04472135954999579
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.32,
"acc_stderr": 0.04688261722621504,
"acc_norm": 0.32,
"acc_norm_stderr": 0.04688261722621504
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7870967741935484,
"acc_stderr": 0.023287665127268545,
"acc_norm": 0.7870967741935484,
"acc_norm_stderr": 0.023287665127268545
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5073891625615764,
"acc_stderr": 0.035176035403610105,
"acc_norm": 0.5073891625615764,
"acc_norm_stderr": 0.035176035403610105
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.7,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.7,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7696969696969697,
"acc_stderr": 0.03287666758603491,
"acc_norm": 0.7696969696969697,
"acc_norm_stderr": 0.03287666758603491
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.8131313131313131,
"acc_stderr": 0.027772533334218967,
"acc_norm": 0.8131313131313131,
"acc_norm_stderr": 0.027772533334218967
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.9119170984455959,
"acc_stderr": 0.02045374660160103,
"acc_norm": 0.9119170984455959,
"acc_norm_stderr": 0.02045374660160103
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6666666666666666,
"acc_stderr": 0.023901157979402534,
"acc_norm": 0.6666666666666666,
"acc_norm_stderr": 0.023901157979402534
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.32592592592592595,
"acc_stderr": 0.028578348365473086,
"acc_norm": 0.32592592592592595,
"acc_norm_stderr": 0.028578348365473086
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.680672268907563,
"acc_stderr": 0.0302839955258844,
"acc_norm": 0.680672268907563,
"acc_norm_stderr": 0.0302839955258844
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.3576158940397351,
"acc_stderr": 0.03913453431177258,
"acc_norm": 0.3576158940397351,
"acc_norm_stderr": 0.03913453431177258
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8403669724770643,
"acc_stderr": 0.015703498348461766,
"acc_norm": 0.8403669724770643,
"acc_norm_stderr": 0.015703498348461766
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5138888888888888,
"acc_stderr": 0.03408655867977749,
"acc_norm": 0.5138888888888888,
"acc_norm_stderr": 0.03408655867977749
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8431372549019608,
"acc_stderr": 0.02552472232455334,
"acc_norm": 0.8431372549019608,
"acc_norm_stderr": 0.02552472232455334
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7932489451476793,
"acc_stderr": 0.0263616516683891,
"acc_norm": 0.7932489451476793,
"acc_norm_stderr": 0.0263616516683891
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6816143497757847,
"acc_stderr": 0.03126580522513713,
"acc_norm": 0.6816143497757847,
"acc_norm_stderr": 0.03126580522513713
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7938931297709924,
"acc_stderr": 0.03547771004159463,
"acc_norm": 0.7938931297709924,
"acc_norm_stderr": 0.03547771004159463
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7768595041322314,
"acc_stderr": 0.03800754475228733,
"acc_norm": 0.7768595041322314,
"acc_norm_stderr": 0.03800754475228733
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7870370370370371,
"acc_stderr": 0.0395783547198098,
"acc_norm": 0.7870370370370371,
"acc_norm_stderr": 0.0395783547198098
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7730061349693251,
"acc_stderr": 0.03291099578615769,
"acc_norm": 0.7730061349693251,
"acc_norm_stderr": 0.03291099578615769
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.42857142857142855,
"acc_stderr": 0.04697113923010212,
"acc_norm": 0.42857142857142855,
"acc_norm_stderr": 0.04697113923010212
},
"harness|hendrycksTest-management|5": {
"acc": 0.7766990291262136,
"acc_stderr": 0.04123553189891431,
"acc_norm": 0.7766990291262136,
"acc_norm_stderr": 0.04123553189891431
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8846153846153846,
"acc_stderr": 0.02093019318517933,
"acc_norm": 0.8846153846153846,
"acc_norm_stderr": 0.02093019318517933
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.72,
"acc_stderr": 0.045126085985421276,
"acc_norm": 0.72,
"acc_norm_stderr": 0.045126085985421276
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8199233716475096,
"acc_stderr": 0.013740797258579825,
"acc_norm": 0.8199233716475096,
"acc_norm_stderr": 0.013740797258579825
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7283236994219653,
"acc_stderr": 0.023948512905468358,
"acc_norm": 0.7283236994219653,
"acc_norm_stderr": 0.023948512905468358
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.4346368715083799,
"acc_stderr": 0.016578997435496713,
"acc_norm": 0.4346368715083799,
"acc_norm_stderr": 0.016578997435496713
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7254901960784313,
"acc_stderr": 0.025553169991826524,
"acc_norm": 0.7254901960784313,
"acc_norm_stderr": 0.025553169991826524
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7138263665594855,
"acc_stderr": 0.025670259242188933,
"acc_norm": 0.7138263665594855,
"acc_norm_stderr": 0.025670259242188933
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.75,
"acc_stderr": 0.02409347123262133,
"acc_norm": 0.75,
"acc_norm_stderr": 0.02409347123262133
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.5,
"acc_stderr": 0.029827499313594685,
"acc_norm": 0.5,
"acc_norm_stderr": 0.029827499313594685
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.470013037809648,
"acc_stderr": 0.01274724896707907,
"acc_norm": 0.470013037809648,
"acc_norm_stderr": 0.01274724896707907
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6727941176470589,
"acc_stderr": 0.028501452860396553,
"acc_norm": 0.6727941176470589,
"acc_norm_stderr": 0.028501452860396553
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6715686274509803,
"acc_stderr": 0.018999707383162673,
"acc_norm": 0.6715686274509803,
"acc_norm_stderr": 0.018999707383162673
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6818181818181818,
"acc_stderr": 0.044612721759105085,
"acc_norm": 0.6818181818181818,
"acc_norm_stderr": 0.044612721759105085
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7306122448979592,
"acc_stderr": 0.02840125202902294,
"acc_norm": 0.7306122448979592,
"acc_norm_stderr": 0.02840125202902294
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8308457711442786,
"acc_stderr": 0.026508590656233268,
"acc_norm": 0.8308457711442786,
"acc_norm_stderr": 0.026508590656233268
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.84,
"acc_stderr": 0.03684529491774709,
"acc_norm": 0.84,
"acc_norm_stderr": 0.03684529491774709
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5542168674698795,
"acc_stderr": 0.03869543323472101,
"acc_norm": 0.5542168674698795,
"acc_norm_stderr": 0.03869543323472101
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8304093567251462,
"acc_stderr": 0.02878210810540171,
"acc_norm": 0.8304093567251462,
"acc_norm_stderr": 0.02878210810540171
},
"harness|truthfulqa:mc|0": {
"mc1": 0.5887392900856793,
"mc1_stderr": 0.017225627083660874,
"mc2": 0.7311817247902683,
"mc2_stderr": 0.014597852035553836
},
"harness|winogrande|5": {
"acc": 0.856353591160221,
"acc_stderr": 0.009857280052696737
},
"harness|gsm8k|5": {
"acc": 0.6793025018953753,
"acc_stderr": 0.012856468433722283
}
}
```
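If you prefer to work with the raw results file rather than the excerpt above, here is a minimal sketch (assuming only that the file linked above is plain JSON; its exact top-level layout may differ from the excerpt):
```python
import json
from huggingface_hub import hf_hub_download

# Sketch: fetch the raw results file referenced above and inspect it locally.
# The filename comes from this card; the top-level key layout of the file is
# an assumption and may differ from the excerpt shown above.
path = hf_hub_download(
    repo_id="open-llm-leaderboard/details_YKM11__Mistral-7B-adaptv0.9",
    repo_type="dataset",
    filename="results_2024-02-02T17-00-46.395617.json",
)
with open(path) as f:
    raw = json.load(f)
print(list(raw.keys()))
```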
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
bigbio/geokhoj_v1 |
---
language:
- en
bigbio_language:
- English
license: cc-by-nc-4.0
multilinguality: monolingual
bigbio_license_shortname: CC_BY_NC_4p0
pretty_name: GEOKhoj v1
homepage: https://github.com/ElucidataInc/GEOKhoj-datasets/tree/main/geokhoj_v1
bigbio_pubmed: False
bigbio_public: True
bigbio_tasks:
- TEXT_CLASSIFICATION
---
# Dataset Card for GEOKhoj v1
## Dataset Description
- **Homepage:** https://github.com/ElucidataInc/GEOKhoj-datasets/tree/main/geokhoj_v1
- **Pubmed:** False
- **Public:** True
- **Tasks:** TXTCLASS
GEOKhoj v1 is an annotated corpus of control/perturbation labels for 30,000 samples
from Microarray, Transcriptomics and Single-cell experiments, which are available on
the GEO (Gene Expression Omnibus) database.
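As a minimal, hypothetical loading sketch (the exact config names are not listed on this card, and recent `datasets` versions may require `trust_remote_code=True` for script-based repositories):
```python
from datasets import load_dataset

# Hypothetical call: the default config is assumed; config names and the need
# for trust_remote_code are not stated on this card.
geokhoj = load_dataset("bigbio/geokhoj_v1")
print(geokhoj)
```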
## Citation Information
```
@misc{geokhoj_v1,
author = {Elucidata, Inc.},
title = {GEOKhoj v1},
howpublished = {\url{https://github.com/ElucidataInc/GEOKhoj-datasets/tree/main/geokhoj_v1}},
}
```
|
nithin1995/dfc_sroie_caption1 | ---
dataset_info:
features:
- name: image
dtype: image
- name: text
dtype: string
splits:
- name: train
num_bytes: 560598966.0
num_examples: 973
download_size: 499271738
dataset_size: 560598966.0
---
# Dataset Card for "dfc_sroie_caption1"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Sajjad-Sh33/train_ds | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
dataset_info:
features:
- name: image
dtype: image
- name: label
dtype: int64
splits:
- name: train
num_bytes: 7890021820.998
num_examples: 49438
download_size: 483953994
dataset_size: 7890021820.998
---
# Dataset Card for "train_ds"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
results-sd-v1-5-sd-v2-1-if-v1-0-karlo/95de681c | ---
dataset_info:
features:
- name: result
dtype: string
- name: id
dtype: int64
splits:
- name: train
num_bytes: 182
num_examples: 10
download_size: 1338
dataset_size: 182
---
# Dataset Card for "95de681c"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
suchirsalhan/SLING | ---
license: mit
language:
- zh
tags:
- Syntax
pretty_name: SLING
size_categories:
- n<1K
---
# SLING: Sino-Linguistic Evaluation of Large Language Models
[](http://arxiv.org/abs/2210.11689)
This is the official SLING dataset, accompanying the EMNLP 2022 paper "SLING: Sino-Linguistic Evaluation of Large Language Models" by Yixiao Song♢, Kalpesh Krishna♠, Rajesh Bhatt♢, and Mohit Iyyer♠.
You can find the paper on [arxiv](https://arxiv.org/abs/2210.11689).
We use this dataset for evaluation of a small-scale Chinese Language Model for the [BabyLM Challenge](https://babylm.github.io/).
## SLING Dataset
See [`SLING_Data`](SLING_Data) and the readme file in it.
A complete list of all phenomena and paradigms can be found in [`PhenomenonParadigmList.txt`](PhenomenonParadigmList.txt).
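As a minimal sketch for getting the files locally (assuming only that the paradigms live as plain files under `SLING_Data`, as described in its readme):
```python
from huggingface_hub import snapshot_download

# Download the whole SLING repository (including SLING_Data) to a local folder.
local_dir = snapshot_download(
    repo_id="suchirsalhan/SLING",
    repo_type="dataset",
)
print(local_dir)
```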
## Citation Information
If you use SLING, please cite the **original paper** as follows:
```
@inproceedings{sling22,
author={Yixiao Song and Kalpesh Krishna and Rajesh Bhatt and Mohit Iyyer},
booktitle = {Empirical Methods in Natural Language Processing},
Year = "2022",
Title={SLING: Sino Linguistic Evaluation of Large Language Models},
}
``` |
CyberHarem/chevreuse_genshin | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of chevreuse/シュヴルーズ/夏沃蕾 (Genshin Impact)
This is the dataset of chevreuse/シュヴルーズ/夏沃蕾 (Genshin Impact), containing 172 images and their tags.
The core tags of this character are `purple_hair, long_hair, very_long_hair, streaked_hair, multicolored_hair, purple_eyes, two-tone_hair, white_hair, hat, eyepatch, pointy_hair, mole, shako_cap, mole_under_mouth, hair_between_eyes, bright_pupils, white_pupils, black_headwear, crossed_bangs`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs)([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:-------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 172 | 351.53 MiB | [Download](https://huggingface.co/datasets/CyberHarem/chevreuse_genshin/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 1200 | 172 | 293.19 MiB | [Download](https://huggingface.co/datasets/CyberHarem/chevreuse_genshin/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 437 | 588.42 MiB | [Download](https://huggingface.co/datasets/CyberHarem/chevreuse_genshin/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need it, just run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/chevreuse_genshin',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering results; maybe some outfits can be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 8 |  |  |  |  |  | 1girl, antique_firearm, bare_shoulders, black_necktie, detached_collar, earmuffs, gold_trim, holding_gun, looking_at_viewer, puffy_detached_sleeves, red_dress, rifle, solo, white_gloves, black_dress, simple_background, two-tone_dress, white_background, strapless_dress, no_mole |
| 1 | 19 |  |  |  |  |  | 1girl, antique_firearm, bare_shoulders, black_dress, black_necktie, earmuffs, gold_trim, holding_gun, puffy_detached_sleeves, red_dress, solo, two-tone_dress, white_gloves, rifle, strapless_dress, thigh_boots, detached_collar, pantyhose, white_footwear, looking_at_viewer, standing, thighhighs, cowboy_shot, no_mole, white_background |
| 2 | 7 |  |  |  |  |  | 1girl, bare_shoulders, black_necktie, detached_collar, earmuffs, gold_trim, puffy_detached_sleeves, red_dress, solo, strapless_dress, upper_body, looking_at_viewer, simple_background, white_background, white_gloves, hand_up |
| 3 | 26 |  |  |  |  |  | 1girl, bare_shoulders, earmuffs, holding_food, puffy_detached_sleeves, solo, white_gloves, looking_at_viewer, french_fries, gold_trim, detached_collar, red_dress, black_necktie, upper_body, black_dress, two-tone_dress, white_background, :t, eating, simple_background, strapless_dress |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | antique_firearm | bare_shoulders | black_necktie | detached_collar | earmuffs | gold_trim | holding_gun | looking_at_viewer | puffy_detached_sleeves | red_dress | rifle | solo | white_gloves | black_dress | simple_background | two-tone_dress | white_background | strapless_dress | no_mole | thigh_boots | pantyhose | white_footwear | standing | thighhighs | cowboy_shot | upper_body | hand_up | holding_food | french_fries | :t | eating |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:------------------|:-----------------|:----------------|:------------------|:-----------|:------------|:--------------|:--------------------|:-------------------------|:------------|:--------|:-------|:---------------|:--------------|:--------------------|:-----------------|:-------------------|:------------------|:----------|:--------------|:------------|:-----------------|:-----------|:-------------|:--------------|:-------------|:----------|:---------------|:---------------|:-----|:---------|
| 0 | 8 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | |
| 1 | 19 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | X | X | X | X | X | X | X | X | X | X | | | | | | |
| 2 | 7 |  |  |  |  |  | X | | X | X | X | X | X | | X | X | X | | X | X | | X | | X | X | | | | | | | | X | X | | | | |
| 3 | 26 |  |  |  |  |  | X | | X | X | X | X | X | | X | X | X | | X | X | X | X | X | X | X | | | | | | | | X | | X | X | X | X |
|
monkeydance/banana_tree | ---
license: cc0-1.0
---
|
yenstdi/gpt_chat_customer | ---
dataset_info:
features:
- name: context
dtype: string
- name: response
dtype: string
splits:
- name: train
num_bytes: 1490
num_examples: 10
download_size: 2959
dataset_size: 1490
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "gpt_chat_customer"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
open-llm-leaderboard/details_mlabonne__NeuralBeagle14-7B | ---
pretty_name: Evaluation run of mlabonne/NeuralBeagle14-7B
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [mlabonne/NeuralBeagle14-7B](https://huggingface.co/mlabonne/NeuralBeagle14-7B)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_mlabonne__NeuralBeagle14-7B\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-01-16T01:08:26.815622](https://huggingface.co/datasets/open-llm-leaderboard/details_mlabonne__NeuralBeagle14-7B/blob/main/results_2024-01-16T01-08-26.815622.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6515140207831388,\n\
\ \"acc_stderr\": 0.03220550517246716,\n \"acc_norm\": 0.6509542922384997,\n\
\ \"acc_norm_stderr\": 0.03287465661696305,\n \"mc1\": 0.572827417380661,\n\
\ \"mc1_stderr\": 0.017316834410963915,\n \"mc2\": 0.6992620055732494,\n\
\ \"mc2_stderr\": 0.015067252053266866\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.7047781569965871,\n \"acc_stderr\": 0.013329750293382316,\n\
\ \"acc_norm\": 0.7295221843003413,\n \"acc_norm_stderr\": 0.012980954547659556\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.7173869747062338,\n\
\ \"acc_stderr\": 0.00449349587200011,\n \"acc_norm\": 0.8833897629954193,\n\
\ \"acc_norm_stderr\": 0.0032029933469910595\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695235,\n \
\ \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695235\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6222222222222222,\n\
\ \"acc_stderr\": 0.04188307537595853,\n \"acc_norm\": 0.6222222222222222,\n\
\ \"acc_norm_stderr\": 0.04188307537595853\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.6973684210526315,\n \"acc_stderr\": 0.03738520676119669,\n\
\ \"acc_norm\": 0.6973684210526315,\n \"acc_norm_stderr\": 0.03738520676119669\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.66,\n\
\ \"acc_stderr\": 0.04760952285695238,\n \"acc_norm\": 0.66,\n \
\ \"acc_norm_stderr\": 0.04760952285695238\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.7094339622641509,\n \"acc_stderr\": 0.027943219989337142,\n\
\ \"acc_norm\": 0.7094339622641509,\n \"acc_norm_stderr\": 0.027943219989337142\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7638888888888888,\n\
\ \"acc_stderr\": 0.03551446610810826,\n \"acc_norm\": 0.7638888888888888,\n\
\ \"acc_norm_stderr\": 0.03551446610810826\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.49,\n \"acc_stderr\": 0.05024183937956912,\n \
\ \"acc_norm\": 0.49,\n \"acc_norm_stderr\": 0.05024183937956912\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.53,\n \"acc_stderr\": 0.050161355804659205,\n \"acc_norm\": 0.53,\n\
\ \"acc_norm_stderr\": 0.050161355804659205\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.33,\n \"acc_stderr\": 0.04725815626252604,\n \
\ \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.04725815626252604\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6647398843930635,\n\
\ \"acc_stderr\": 0.03599586301247078,\n \"acc_norm\": 0.6647398843930635,\n\
\ \"acc_norm_stderr\": 0.03599586301247078\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.4117647058823529,\n \"acc_stderr\": 0.048971049527263666,\n\
\ \"acc_norm\": 0.4117647058823529,\n \"acc_norm_stderr\": 0.048971049527263666\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.76,\n \"acc_stderr\": 0.04292346959909283,\n \"acc_norm\": 0.76,\n\
\ \"acc_norm_stderr\": 0.04292346959909283\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.5531914893617021,\n \"acc_stderr\": 0.0325005368436584,\n\
\ \"acc_norm\": 0.5531914893617021,\n \"acc_norm_stderr\": 0.0325005368436584\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.47368421052631576,\n\
\ \"acc_stderr\": 0.04697085136647863,\n \"acc_norm\": 0.47368421052631576,\n\
\ \"acc_norm_stderr\": 0.04697085136647863\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5586206896551724,\n \"acc_stderr\": 0.04137931034482757,\n\
\ \"acc_norm\": 0.5586206896551724,\n \"acc_norm_stderr\": 0.04137931034482757\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.4074074074074074,\n \"acc_stderr\": 0.02530590624159063,\n \"\
acc_norm\": 0.4074074074074074,\n \"acc_norm_stderr\": 0.02530590624159063\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.48412698412698413,\n\
\ \"acc_stderr\": 0.04469881854072606,\n \"acc_norm\": 0.48412698412698413,\n\
\ \"acc_norm_stderr\": 0.04469881854072606\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.35,\n \"acc_stderr\": 0.047937248544110196,\n \
\ \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.047937248544110196\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\"\
: 0.7903225806451613,\n \"acc_stderr\": 0.023157879349083525,\n \"\
acc_norm\": 0.7903225806451613,\n \"acc_norm_stderr\": 0.023157879349083525\n\
\ },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\"\
: 0.5123152709359606,\n \"acc_stderr\": 0.035169204442208966,\n \"\
acc_norm\": 0.5123152709359606,\n \"acc_norm_stderr\": 0.035169204442208966\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.69,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\"\
: 0.69,\n \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7757575757575758,\n \"acc_stderr\": 0.03256866661681102,\n\
\ \"acc_norm\": 0.7757575757575758,\n \"acc_norm_stderr\": 0.03256866661681102\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.7828282828282829,\n \"acc_stderr\": 0.02937661648494563,\n \"\
acc_norm\": 0.7828282828282829,\n \"acc_norm_stderr\": 0.02937661648494563\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.9067357512953368,\n \"acc_stderr\": 0.02098685459328973,\n\
\ \"acc_norm\": 0.9067357512953368,\n \"acc_norm_stderr\": 0.02098685459328973\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.676923076923077,\n \"acc_stderr\": 0.02371088850197057,\n \
\ \"acc_norm\": 0.676923076923077,\n \"acc_norm_stderr\": 0.02371088850197057\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.34074074074074073,\n \"acc_stderr\": 0.02889774874113115,\n \
\ \"acc_norm\": 0.34074074074074073,\n \"acc_norm_stderr\": 0.02889774874113115\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.6638655462184874,\n \"acc_stderr\": 0.03068473711513536,\n \
\ \"acc_norm\": 0.6638655462184874,\n \"acc_norm_stderr\": 0.03068473711513536\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.3576158940397351,\n \"acc_stderr\": 0.03913453431177258,\n \"\
acc_norm\": 0.3576158940397351,\n \"acc_norm_stderr\": 0.03913453431177258\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8385321100917431,\n \"acc_stderr\": 0.015776239256163248,\n \"\
acc_norm\": 0.8385321100917431,\n \"acc_norm_stderr\": 0.015776239256163248\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.5324074074074074,\n \"acc_stderr\": 0.03402801581358966,\n \"\
acc_norm\": 0.5324074074074074,\n \"acc_norm_stderr\": 0.03402801581358966\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.8431372549019608,\n \"acc_stderr\": 0.02552472232455334,\n \"\
acc_norm\": 0.8431372549019608,\n \"acc_norm_stderr\": 0.02552472232455334\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.7805907172995781,\n \"acc_stderr\": 0.026939106581553945,\n \
\ \"acc_norm\": 0.7805907172995781,\n \"acc_norm_stderr\": 0.026939106581553945\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6995515695067265,\n\
\ \"acc_stderr\": 0.03076935200822914,\n \"acc_norm\": 0.6995515695067265,\n\
\ \"acc_norm_stderr\": 0.03076935200822914\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.7709923664122137,\n \"acc_stderr\": 0.036853466317118506,\n\
\ \"acc_norm\": 0.7709923664122137,\n \"acc_norm_stderr\": 0.036853466317118506\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.768595041322314,\n \"acc_stderr\": 0.03849856098794088,\n \"acc_norm\"\
: 0.768595041322314,\n \"acc_norm_stderr\": 0.03849856098794088\n },\n\
\ \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7685185185185185,\n\
\ \"acc_stderr\": 0.04077494709252627,\n \"acc_norm\": 0.7685185185185185,\n\
\ \"acc_norm_stderr\": 0.04077494709252627\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.754601226993865,\n \"acc_stderr\": 0.03380939813943354,\n\
\ \"acc_norm\": 0.754601226993865,\n \"acc_norm_stderr\": 0.03380939813943354\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.44642857142857145,\n\
\ \"acc_stderr\": 0.04718471485219588,\n \"acc_norm\": 0.44642857142857145,\n\
\ \"acc_norm_stderr\": 0.04718471485219588\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7766990291262136,\n \"acc_stderr\": 0.04123553189891431,\n\
\ \"acc_norm\": 0.7766990291262136,\n \"acc_norm_stderr\": 0.04123553189891431\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8760683760683761,\n\
\ \"acc_stderr\": 0.02158649400128137,\n \"acc_norm\": 0.8760683760683761,\n\
\ \"acc_norm_stderr\": 0.02158649400128137\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.67,\n \"acc_stderr\": 0.047258156262526094,\n \
\ \"acc_norm\": 0.67,\n \"acc_norm_stderr\": 0.047258156262526094\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8301404853128991,\n\
\ \"acc_stderr\": 0.013428186370608311,\n \"acc_norm\": 0.8301404853128991,\n\
\ \"acc_norm_stderr\": 0.013428186370608311\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.7283236994219653,\n \"acc_stderr\": 0.02394851290546835,\n\
\ \"acc_norm\": 0.7283236994219653,\n \"acc_norm_stderr\": 0.02394851290546835\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.43798882681564244,\n\
\ \"acc_stderr\": 0.016593394227564843,\n \"acc_norm\": 0.43798882681564244,\n\
\ \"acc_norm_stderr\": 0.016593394227564843\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.7058823529411765,\n \"acc_stderr\": 0.026090162504279056,\n\
\ \"acc_norm\": 0.7058823529411765,\n \"acc_norm_stderr\": 0.026090162504279056\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7138263665594855,\n\
\ \"acc_stderr\": 0.025670259242188936,\n \"acc_norm\": 0.7138263665594855,\n\
\ \"acc_norm_stderr\": 0.025670259242188936\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.7469135802469136,\n \"acc_stderr\": 0.024191808600712995,\n\
\ \"acc_norm\": 0.7469135802469136,\n \"acc_norm_stderr\": 0.024191808600712995\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.4858156028368794,\n \"acc_stderr\": 0.02981549448368206,\n \
\ \"acc_norm\": 0.4858156028368794,\n \"acc_norm_stderr\": 0.02981549448368206\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.47196870925684486,\n\
\ \"acc_stderr\": 0.012750151802922436,\n \"acc_norm\": 0.47196870925684486,\n\
\ \"acc_norm_stderr\": 0.012750151802922436\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.6801470588235294,\n \"acc_stderr\": 0.028332959514031208,\n\
\ \"acc_norm\": 0.6801470588235294,\n \"acc_norm_stderr\": 0.028332959514031208\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6666666666666666,\n \"acc_stderr\": 0.019070985589687495,\n \
\ \"acc_norm\": 0.6666666666666666,\n \"acc_norm_stderr\": 0.019070985589687495\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6545454545454545,\n\
\ \"acc_stderr\": 0.04554619617541054,\n \"acc_norm\": 0.6545454545454545,\n\
\ \"acc_norm_stderr\": 0.04554619617541054\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.7387755102040816,\n \"acc_stderr\": 0.02812342933514278,\n\
\ \"acc_norm\": 0.7387755102040816,\n \"acc_norm_stderr\": 0.02812342933514278\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8258706467661692,\n\
\ \"acc_stderr\": 0.026814951200421603,\n \"acc_norm\": 0.8258706467661692,\n\
\ \"acc_norm_stderr\": 0.026814951200421603\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.88,\n \"acc_stderr\": 0.03265986323710906,\n \
\ \"acc_norm\": 0.88,\n \"acc_norm_stderr\": 0.03265986323710906\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.572289156626506,\n\
\ \"acc_stderr\": 0.038515976837185335,\n \"acc_norm\": 0.572289156626506,\n\
\ \"acc_norm_stderr\": 0.038515976837185335\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8245614035087719,\n \"acc_stderr\": 0.02917088550072767,\n\
\ \"acc_norm\": 0.8245614035087719,\n \"acc_norm_stderr\": 0.02917088550072767\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.572827417380661,\n\
\ \"mc1_stderr\": 0.017316834410963915,\n \"mc2\": 0.6992620055732494,\n\
\ \"mc2_stderr\": 0.015067252053266866\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.823993685872139,\n \"acc_stderr\": 0.010703090882320705\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.7028051554207733,\n \
\ \"acc_stderr\": 0.012588685966624184\n }\n}\n```"
repo_url: https://huggingface.co/mlabonne/NeuralBeagle14-7B
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_01_16T01_08_26.815622
path:
- '**/details_harness|arc:challenge|25_2024-01-16T01-08-26.815622.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-01-16T01-08-26.815622.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_01_16T01_08_26.815622
path:
- '**/details_harness|gsm8k|5_2024-01-16T01-08-26.815622.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-01-16T01-08-26.815622.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_01_16T01_08_26.815622
path:
- '**/details_harness|hellaswag|10_2024-01-16T01-08-26.815622.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-01-16T01-08-26.815622.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_01_16T01_08_26.815622
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-16T01-08-26.815622.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-16T01-08-26.815622.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-16T01-08-26.815622.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-16T01-08-26.815622.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-16T01-08-26.815622.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-16T01-08-26.815622.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-16T01-08-26.815622.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-16T01-08-26.815622.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-16T01-08-26.815622.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-16T01-08-26.815622.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-16T01-08-26.815622.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-16T01-08-26.815622.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-16T01-08-26.815622.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-16T01-08-26.815622.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-16T01-08-26.815622.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-16T01-08-26.815622.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-16T01-08-26.815622.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-16T01-08-26.815622.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-16T01-08-26.815622.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-16T01-08-26.815622.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-16T01-08-26.815622.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-16T01-08-26.815622.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-16T01-08-26.815622.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-16T01-08-26.815622.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-16T01-08-26.815622.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-16T01-08-26.815622.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-16T01-08-26.815622.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-16T01-08-26.815622.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-16T01-08-26.815622.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-16T01-08-26.815622.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-16T01-08-26.815622.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-16T01-08-26.815622.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-16T01-08-26.815622.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-16T01-08-26.815622.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-01-16T01-08-26.815622.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-16T01-08-26.815622.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-16T01-08-26.815622.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-16T01-08-26.815622.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-01-16T01-08-26.815622.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-01-16T01-08-26.815622.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-16T01-08-26.815622.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-16T01-08-26.815622.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-16T01-08-26.815622.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-16T01-08-26.815622.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-16T01-08-26.815622.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-16T01-08-26.815622.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-16T01-08-26.815622.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-16T01-08-26.815622.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-16T01-08-26.815622.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-16T01-08-26.815622.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-16T01-08-26.815622.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-16T01-08-26.815622.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-16T01-08-26.815622.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-01-16T01-08-26.815622.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-16T01-08-26.815622.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-01-16T01-08-26.815622.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-16T01-08-26.815622.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-16T01-08-26.815622.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-16T01-08-26.815622.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-16T01-08-26.815622.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-16T01-08-26.815622.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-16T01-08-26.815622.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-16T01-08-26.815622.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-16T01-08-26.815622.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-16T01-08-26.815622.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-16T01-08-26.815622.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-16T01-08-26.815622.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-16T01-08-26.815622.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-16T01-08-26.815622.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-16T01-08-26.815622.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-16T01-08-26.815622.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-16T01-08-26.815622.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-16T01-08-26.815622.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-16T01-08-26.815622.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-16T01-08-26.815622.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-16T01-08-26.815622.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-16T01-08-26.815622.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-16T01-08-26.815622.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-16T01-08-26.815622.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-16T01-08-26.815622.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-16T01-08-26.815622.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-16T01-08-26.815622.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-16T01-08-26.815622.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-16T01-08-26.815622.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-16T01-08-26.815622.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-16T01-08-26.815622.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-16T01-08-26.815622.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-16T01-08-26.815622.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-16T01-08-26.815622.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-16T01-08-26.815622.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-16T01-08-26.815622.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-01-16T01-08-26.815622.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-16T01-08-26.815622.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-16T01-08-26.815622.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-16T01-08-26.815622.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-01-16T01-08-26.815622.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-01-16T01-08-26.815622.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-16T01-08-26.815622.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-16T01-08-26.815622.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-16T01-08-26.815622.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-16T01-08-26.815622.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-16T01-08-26.815622.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-16T01-08-26.815622.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-16T01-08-26.815622.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-16T01-08-26.815622.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-16T01-08-26.815622.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-16T01-08-26.815622.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-16T01-08-26.815622.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-16T01-08-26.815622.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-16T01-08-26.815622.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-01-16T01-08-26.815622.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-16T01-08-26.815622.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-01-16T01-08-26.815622.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-16T01-08-26.815622.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_01_16T01_08_26.815622
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-16T01-08-26.815622.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-16T01-08-26.815622.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_01_16T01_08_26.815622
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-16T01-08-26.815622.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-16T01-08-26.815622.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_01_16T01_08_26.815622
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-16T01-08-26.815622.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-16T01-08-26.815622.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_01_16T01_08_26.815622
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-16T01-08-26.815622.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-16T01-08-26.815622.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_01_16T01_08_26.815622
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-16T01-08-26.815622.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-16T01-08-26.815622.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_01_16T01_08_26.815622
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-16T01-08-26.815622.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-16T01-08-26.815622.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_01_16T01_08_26.815622
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-16T01-08-26.815622.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-16T01-08-26.815622.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_01_16T01_08_26.815622
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-16T01-08-26.815622.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-16T01-08-26.815622.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_01_16T01_08_26.815622
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-16T01-08-26.815622.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-16T01-08-26.815622.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_01_16T01_08_26.815622
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-16T01-08-26.815622.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-16T01-08-26.815622.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_01_16T01_08_26.815622
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-16T01-08-26.815622.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-16T01-08-26.815622.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_01_16T01_08_26.815622
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-16T01-08-26.815622.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-16T01-08-26.815622.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_01_16T01_08_26.815622
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-16T01-08-26.815622.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-16T01-08-26.815622.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_01_16T01_08_26.815622
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-16T01-08-26.815622.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-16T01-08-26.815622.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_01_16T01_08_26.815622
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-16T01-08-26.815622.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-16T01-08-26.815622.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_01_16T01_08_26.815622
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-16T01-08-26.815622.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-16T01-08-26.815622.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_01_16T01_08_26.815622
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-16T01-08-26.815622.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-16T01-08-26.815622.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_01_16T01_08_26.815622
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-16T01-08-26.815622.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-16T01-08-26.815622.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_01_16T01_08_26.815622
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-16T01-08-26.815622.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-16T01-08-26.815622.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_01_16T01_08_26.815622
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-16T01-08-26.815622.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-16T01-08-26.815622.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_01_16T01_08_26.815622
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-16T01-08-26.815622.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-16T01-08-26.815622.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_01_16T01_08_26.815622
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-16T01-08-26.815622.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-16T01-08-26.815622.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_01_16T01_08_26.815622
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-16T01-08-26.815622.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-16T01-08-26.815622.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_01_16T01_08_26.815622
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-16T01-08-26.815622.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-16T01-08-26.815622.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_01_16T01_08_26.815622
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-16T01-08-26.815622.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-16T01-08-26.815622.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_01_16T01_08_26.815622
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-16T01-08-26.815622.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-16T01-08-26.815622.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_01_16T01_08_26.815622
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-16T01-08-26.815622.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-16T01-08-26.815622.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_01_16T01_08_26.815622
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-16T01-08-26.815622.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-16T01-08-26.815622.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_01_16T01_08_26.815622
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-16T01-08-26.815622.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-16T01-08-26.815622.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_01_16T01_08_26.815622
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-16T01-08-26.815622.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-16T01-08-26.815622.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_01_16T01_08_26.815622
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-16T01-08-26.815622.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-16T01-08-26.815622.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_01_16T01_08_26.815622
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-16T01-08-26.815622.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-16T01-08-26.815622.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_01_16T01_08_26.815622
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-16T01-08-26.815622.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-16T01-08-26.815622.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_01_16T01_08_26.815622
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-16T01-08-26.815622.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-16T01-08-26.815622.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_01_16T01_08_26.815622
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-01-16T01-08-26.815622.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-01-16T01-08-26.815622.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_01_16T01_08_26.815622
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-16T01-08-26.815622.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-16T01-08-26.815622.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_01_16T01_08_26.815622
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-16T01-08-26.815622.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-16T01-08-26.815622.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_01_16T01_08_26.815622
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-16T01-08-26.815622.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-16T01-08-26.815622.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_01_16T01_08_26.815622
path:
- '**/details_harness|hendrycksTest-management|5_2024-01-16T01-08-26.815622.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-01-16T01-08-26.815622.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_01_16T01_08_26.815622
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-01-16T01-08-26.815622.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-01-16T01-08-26.815622.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_01_16T01_08_26.815622
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-16T01-08-26.815622.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-16T01-08-26.815622.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_01_16T01_08_26.815622
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-16T01-08-26.815622.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-16T01-08-26.815622.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_01_16T01_08_26.815622
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-16T01-08-26.815622.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-16T01-08-26.815622.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_01_16T01_08_26.815622
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-16T01-08-26.815622.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-16T01-08-26.815622.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_01_16T01_08_26.815622
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-16T01-08-26.815622.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-16T01-08-26.815622.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_01_16T01_08_26.815622
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-16T01-08-26.815622.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-16T01-08-26.815622.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_01_16T01_08_26.815622
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-16T01-08-26.815622.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-16T01-08-26.815622.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_01_16T01_08_26.815622
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-16T01-08-26.815622.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-16T01-08-26.815622.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_01_16T01_08_26.815622
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-16T01-08-26.815622.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-16T01-08-26.815622.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_01_16T01_08_26.815622
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-16T01-08-26.815622.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-16T01-08-26.815622.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_01_16T01_08_26.815622
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-16T01-08-26.815622.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-16T01-08-26.815622.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_01_16T01_08_26.815622
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-16T01-08-26.815622.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-16T01-08-26.815622.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_01_16T01_08_26.815622
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-16T01-08-26.815622.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-16T01-08-26.815622.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_01_16T01_08_26.815622
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-01-16T01-08-26.815622.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-01-16T01-08-26.815622.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_01_16T01_08_26.815622
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-16T01-08-26.815622.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-16T01-08-26.815622.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_01_16T01_08_26.815622
path:
- '**/details_harness|hendrycksTest-virology|5_2024-01-16T01-08-26.815622.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-01-16T01-08-26.815622.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_01_16T01_08_26.815622
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-16T01-08-26.815622.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-16T01-08-26.815622.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_01_16T01_08_26.815622
path:
- '**/details_harness|truthfulqa:mc|0_2024-01-16T01-08-26.815622.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-01-16T01-08-26.815622.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_01_16T01_08_26.815622
path:
- '**/details_harness|winogrande|5_2024-01-16T01-08-26.815622.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-01-16T01-08-26.815622.parquet'
- config_name: results
data_files:
- split: 2024_01_16T01_08_26.815622
path:
- results_2024-01-16T01-08-26.815622.parquet
- split: latest
path:
- results_2024-01-16T01-08-26.815622.parquet
---
# Dataset Card for Evaluation run of mlabonne/NeuralBeagle14-7B
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [mlabonne/NeuralBeagle14-7B](https://huggingface.co/mlabonne/NeuralBeagle14-7B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_mlabonne__NeuralBeagle14-7B",
"harness_winogrande_5",
split="train")
```
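The same pattern works for any configuration listed in the YAML header above; for example, the aggregated metrics can be loaded through the "results" configuration and its "latest" split. A minimal sketch, reusing the repository and configuration names already defined above:
```python
from datasets import load_dataset

# Load the aggregated metrics for the most recent run via the "results"
# configuration and its "latest" split (both defined in the YAML header above).
results = load_dataset("open-llm-leaderboard/details_mlabonne__NeuralBeagle14-7B",
                       "results",
                       split="latest")
print(results[0])  # first (and typically only) row of aggregated results
```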
## Latest results
These are the [latest results from run 2024-01-16T01:08:26.815622](https://huggingface.co/datasets/open-llm-leaderboard/details_mlabonne__NeuralBeagle14-7B/blob/main/results_2024-01-16T01-08-26.815622.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6515140207831388,
"acc_stderr": 0.03220550517246716,
"acc_norm": 0.6509542922384997,
"acc_norm_stderr": 0.03287465661696305,
"mc1": 0.572827417380661,
"mc1_stderr": 0.017316834410963915,
"mc2": 0.6992620055732494,
"mc2_stderr": 0.015067252053266866
},
"harness|arc:challenge|25": {
"acc": 0.7047781569965871,
"acc_stderr": 0.013329750293382316,
"acc_norm": 0.7295221843003413,
"acc_norm_stderr": 0.012980954547659556
},
"harness|hellaswag|10": {
"acc": 0.7173869747062338,
"acc_stderr": 0.00449349587200011,
"acc_norm": 0.8833897629954193,
"acc_norm_stderr": 0.0032029933469910595
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695235,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695235
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6222222222222222,
"acc_stderr": 0.04188307537595853,
"acc_norm": 0.6222222222222222,
"acc_norm_stderr": 0.04188307537595853
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6973684210526315,
"acc_stderr": 0.03738520676119669,
"acc_norm": 0.6973684210526315,
"acc_norm_stderr": 0.03738520676119669
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.66,
"acc_stderr": 0.04760952285695238,
"acc_norm": 0.66,
"acc_norm_stderr": 0.04760952285695238
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.7094339622641509,
"acc_stderr": 0.027943219989337142,
"acc_norm": 0.7094339622641509,
"acc_norm_stderr": 0.027943219989337142
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7638888888888888,
"acc_stderr": 0.03551446610810826,
"acc_norm": 0.7638888888888888,
"acc_norm_stderr": 0.03551446610810826
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.49,
"acc_stderr": 0.05024183937956912,
"acc_norm": 0.49,
"acc_norm_stderr": 0.05024183937956912
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.53,
"acc_stderr": 0.050161355804659205,
"acc_norm": 0.53,
"acc_norm_stderr": 0.050161355804659205
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.33,
"acc_stderr": 0.04725815626252604,
"acc_norm": 0.33,
"acc_norm_stderr": 0.04725815626252604
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6647398843930635,
"acc_stderr": 0.03599586301247078,
"acc_norm": 0.6647398843930635,
"acc_norm_stderr": 0.03599586301247078
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.4117647058823529,
"acc_stderr": 0.048971049527263666,
"acc_norm": 0.4117647058823529,
"acc_norm_stderr": 0.048971049527263666
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.76,
"acc_stderr": 0.04292346959909283,
"acc_norm": 0.76,
"acc_norm_stderr": 0.04292346959909283
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5531914893617021,
"acc_stderr": 0.0325005368436584,
"acc_norm": 0.5531914893617021,
"acc_norm_stderr": 0.0325005368436584
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.47368421052631576,
"acc_stderr": 0.04697085136647863,
"acc_norm": 0.47368421052631576,
"acc_norm_stderr": 0.04697085136647863
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5586206896551724,
"acc_stderr": 0.04137931034482757,
"acc_norm": 0.5586206896551724,
"acc_norm_stderr": 0.04137931034482757
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.4074074074074074,
"acc_stderr": 0.02530590624159063,
"acc_norm": 0.4074074074074074,
"acc_norm_stderr": 0.02530590624159063
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.48412698412698413,
"acc_stderr": 0.04469881854072606,
"acc_norm": 0.48412698412698413,
"acc_norm_stderr": 0.04469881854072606
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.35,
"acc_stderr": 0.047937248544110196,
"acc_norm": 0.35,
"acc_norm_stderr": 0.047937248544110196
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7903225806451613,
"acc_stderr": 0.023157879349083525,
"acc_norm": 0.7903225806451613,
"acc_norm_stderr": 0.023157879349083525
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5123152709359606,
"acc_stderr": 0.035169204442208966,
"acc_norm": 0.5123152709359606,
"acc_norm_stderr": 0.035169204442208966
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.69,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.69,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7757575757575758,
"acc_stderr": 0.03256866661681102,
"acc_norm": 0.7757575757575758,
"acc_norm_stderr": 0.03256866661681102
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7828282828282829,
"acc_stderr": 0.02937661648494563,
"acc_norm": 0.7828282828282829,
"acc_norm_stderr": 0.02937661648494563
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.9067357512953368,
"acc_stderr": 0.02098685459328973,
"acc_norm": 0.9067357512953368,
"acc_norm_stderr": 0.02098685459328973
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.676923076923077,
"acc_stderr": 0.02371088850197057,
"acc_norm": 0.676923076923077,
"acc_norm_stderr": 0.02371088850197057
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.34074074074074073,
"acc_stderr": 0.02889774874113115,
"acc_norm": 0.34074074074074073,
"acc_norm_stderr": 0.02889774874113115
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6638655462184874,
"acc_stderr": 0.03068473711513536,
"acc_norm": 0.6638655462184874,
"acc_norm_stderr": 0.03068473711513536
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.3576158940397351,
"acc_stderr": 0.03913453431177258,
"acc_norm": 0.3576158940397351,
"acc_norm_stderr": 0.03913453431177258
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8385321100917431,
"acc_stderr": 0.015776239256163248,
"acc_norm": 0.8385321100917431,
"acc_norm_stderr": 0.015776239256163248
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5324074074074074,
"acc_stderr": 0.03402801581358966,
"acc_norm": 0.5324074074074074,
"acc_norm_stderr": 0.03402801581358966
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8431372549019608,
"acc_stderr": 0.02552472232455334,
"acc_norm": 0.8431372549019608,
"acc_norm_stderr": 0.02552472232455334
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7805907172995781,
"acc_stderr": 0.026939106581553945,
"acc_norm": 0.7805907172995781,
"acc_norm_stderr": 0.026939106581553945
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6995515695067265,
"acc_stderr": 0.03076935200822914,
"acc_norm": 0.6995515695067265,
"acc_norm_stderr": 0.03076935200822914
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7709923664122137,
"acc_stderr": 0.036853466317118506,
"acc_norm": 0.7709923664122137,
"acc_norm_stderr": 0.036853466317118506
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.768595041322314,
"acc_stderr": 0.03849856098794088,
"acc_norm": 0.768595041322314,
"acc_norm_stderr": 0.03849856098794088
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7685185185185185,
"acc_stderr": 0.04077494709252627,
"acc_norm": 0.7685185185185185,
"acc_norm_stderr": 0.04077494709252627
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.754601226993865,
"acc_stderr": 0.03380939813943354,
"acc_norm": 0.754601226993865,
"acc_norm_stderr": 0.03380939813943354
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.44642857142857145,
"acc_stderr": 0.04718471485219588,
"acc_norm": 0.44642857142857145,
"acc_norm_stderr": 0.04718471485219588
},
"harness|hendrycksTest-management|5": {
"acc": 0.7766990291262136,
"acc_stderr": 0.04123553189891431,
"acc_norm": 0.7766990291262136,
"acc_norm_stderr": 0.04123553189891431
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8760683760683761,
"acc_stderr": 0.02158649400128137,
"acc_norm": 0.8760683760683761,
"acc_norm_stderr": 0.02158649400128137
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.67,
"acc_stderr": 0.047258156262526094,
"acc_norm": 0.67,
"acc_norm_stderr": 0.047258156262526094
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8301404853128991,
"acc_stderr": 0.013428186370608311,
"acc_norm": 0.8301404853128991,
"acc_norm_stderr": 0.013428186370608311
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7283236994219653,
"acc_stderr": 0.02394851290546835,
"acc_norm": 0.7283236994219653,
"acc_norm_stderr": 0.02394851290546835
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.43798882681564244,
"acc_stderr": 0.016593394227564843,
"acc_norm": 0.43798882681564244,
"acc_norm_stderr": 0.016593394227564843
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7058823529411765,
"acc_stderr": 0.026090162504279056,
"acc_norm": 0.7058823529411765,
"acc_norm_stderr": 0.026090162504279056
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7138263665594855,
"acc_stderr": 0.025670259242188936,
"acc_norm": 0.7138263665594855,
"acc_norm_stderr": 0.025670259242188936
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7469135802469136,
"acc_stderr": 0.024191808600712995,
"acc_norm": 0.7469135802469136,
"acc_norm_stderr": 0.024191808600712995
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.4858156028368794,
"acc_stderr": 0.02981549448368206,
"acc_norm": 0.4858156028368794,
"acc_norm_stderr": 0.02981549448368206
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.47196870925684486,
"acc_stderr": 0.012750151802922436,
"acc_norm": 0.47196870925684486,
"acc_norm_stderr": 0.012750151802922436
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6801470588235294,
"acc_stderr": 0.028332959514031208,
"acc_norm": 0.6801470588235294,
"acc_norm_stderr": 0.028332959514031208
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6666666666666666,
"acc_stderr": 0.019070985589687495,
"acc_norm": 0.6666666666666666,
"acc_norm_stderr": 0.019070985589687495
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6545454545454545,
"acc_stderr": 0.04554619617541054,
"acc_norm": 0.6545454545454545,
"acc_norm_stderr": 0.04554619617541054
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7387755102040816,
"acc_stderr": 0.02812342933514278,
"acc_norm": 0.7387755102040816,
"acc_norm_stderr": 0.02812342933514278
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8258706467661692,
"acc_stderr": 0.026814951200421603,
"acc_norm": 0.8258706467661692,
"acc_norm_stderr": 0.026814951200421603
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.88,
"acc_stderr": 0.03265986323710906,
"acc_norm": 0.88,
"acc_norm_stderr": 0.03265986323710906
},
"harness|hendrycksTest-virology|5": {
"acc": 0.572289156626506,
"acc_stderr": 0.038515976837185335,
"acc_norm": 0.572289156626506,
"acc_norm_stderr": 0.038515976837185335
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8245614035087719,
"acc_stderr": 0.02917088550072767,
"acc_norm": 0.8245614035087719,
"acc_norm_stderr": 0.02917088550072767
},
"harness|truthfulqa:mc|0": {
"mc1": 0.572827417380661,
"mc1_stderr": 0.017316834410963915,
"mc2": 0.6992620055732494,
"mc2_stderr": 0.015067252053266866
},
"harness|winogrande|5": {
"acc": 0.823993685872139,
"acc_stderr": 0.010703090882320705
},
"harness|gsm8k|5": {
"acc": 0.7028051554207733,
"acc_stderr": 0.012588685966624184
}
}
```
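For quick inspection outside of the `datasets` library, the results file linked above can also be read directly. The sketch below is an assumption-laden example: it assumes a local copy of that JSON file whose top level matches the dictionary shown here (the hosted file may wrap it in additional metadata).
```python
import json

# Minimal sketch (assumes a local copy of the results file shown above,
# with the same top-level structure as the dictionary printed in this card):
# print the aggregate accuracy plus each MMLU (hendrycksTest) subtask score.
with open("results_2024-01-16T01-08-26.815622.json") as f:
    results = json.load(f)

print("aggregate acc:", results["all"]["acc"])
for task, metrics in results.items():
    if task.startswith("harness|hendrycksTest"):
        print(task, metrics["acc"])
```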
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
Cosmas/FAQ-embedding | ---
license: cc-by-4.0
---
|
EddieChen372/devign_with_vul_lines | ---
dataset_info:
features:
- name: id
dtype: int32
- name: func
dtype: string
- name: target
dtype: bool
- name: project
dtype: string
- name: commit_id
dtype: string
- name: func_clean
dtype: string
- name: vul_lines
struct:
- name: code
sequence: string
- name: line_no
sequence: int64
- name: normalized_func
dtype: string
splits:
- name: validation
num_bytes: 16112369
num_examples: 2732
- name: train
num_bytes: 132054560
num_examples: 21854
- name: test
num_bytes: 16328301
num_examples: 2732
download_size: 60272537
dataset_size: 164495230
---
# Dataset Card for "devign_with_vul_lines"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
presencesw/dataset3 | ---
dataset_info:
features:
- name: question
dtype: string
- name: answer
dtype: string
- name: references
sequence: string
splits:
- name: train
num_bytes: 23125707
num_examples: 9000
download_size: 14110602
dataset_size: 23125707
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
italobovier/Vicent_Price | ---
license: apache-2.0
---
|
AdapterOcean/oasst_top1_standardized_cluster_2 | ---
dataset_info:
features:
- name: text
dtype: string
- name: conversation_id
dtype: int64
- name: embedding
sequence: float64
- name: cluster
dtype: int64
splits:
- name: train
num_bytes: 22311787
num_examples: 2175
download_size: 6788782
dataset_size: 22311787
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "oasst_top1_standardized_cluster_2"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
open-llm-leaderboard/details_shahzebnaveed__StarlingHermes-2.5-Mistral-7B-slerp | ---
pretty_name: Evaluation run of shahzebnaveed/StarlingHermes-2.5-Mistral-7B-slerp
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [shahzebnaveed/StarlingHermes-2.5-Mistral-7B-slerp](https://huggingface.co/shahzebnaveed/StarlingHermes-2.5-Mistral-7B-slerp)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configuration, each one coresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_shahzebnaveed__StarlingHermes-2.5-Mistral-7B-slerp\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-02-16T18:45:36.411445](https://huggingface.co/datasets/open-llm-leaderboard/details_shahzebnaveed__StarlingHermes-2.5-Mistral-7B-slerp/blob/main/results_2024-02-16T18-45-36.411445.json)(note\
\ that their might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6499656745636075,\n\
\ \"acc_stderr\": 0.03190074035438905,\n \"acc_norm\": 0.6509134828280464,\n\
\ \"acc_norm_stderr\": 0.032545406816765036,\n \"mc1\": 0.3402692778457772,\n\
\ \"mc1_stderr\": 0.016586304901762564,\n \"mc2\": 0.49557311062145515,\n\
\ \"mc2_stderr\": 0.015305674753451043\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.6313993174061433,\n \"acc_stderr\": 0.014097810678042198,\n\
\ \"acc_norm\": 0.6604095563139932,\n \"acc_norm_stderr\": 0.013839039762820169\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6680940051782513,\n\
\ \"acc_stderr\": 0.0046993506536956225,\n \"acc_norm\": 0.851822346146186,\n\
\ \"acc_norm_stderr\": 0.0035454991695580535\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.35,\n \"acc_stderr\": 0.0479372485441102,\n \
\ \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.0479372485441102\n },\n\
\ \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.5925925925925926,\n\
\ \"acc_stderr\": 0.04244633238353227,\n \"acc_norm\": 0.5925925925925926,\n\
\ \"acc_norm_stderr\": 0.04244633238353227\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.7171052631578947,\n \"acc_stderr\": 0.03665349695640767,\n\
\ \"acc_norm\": 0.7171052631578947,\n \"acc_norm_stderr\": 0.03665349695640767\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.59,\n\
\ \"acc_stderr\": 0.04943110704237102,\n \"acc_norm\": 0.59,\n \
\ \"acc_norm_stderr\": 0.04943110704237102\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.7018867924528301,\n \"acc_stderr\": 0.028152837942493875,\n\
\ \"acc_norm\": 0.7018867924528301,\n \"acc_norm_stderr\": 0.028152837942493875\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7777777777777778,\n\
\ \"acc_stderr\": 0.03476590104304134,\n \"acc_norm\": 0.7777777777777778,\n\
\ \"acc_norm_stderr\": 0.03476590104304134\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.46,\n \"acc_stderr\": 0.05009082659620332,\n \
\ \"acc_norm\": 0.46,\n \"acc_norm_stderr\": 0.05009082659620332\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.53,\n \"acc_stderr\": 0.050161355804659205,\n \"acc_norm\": 0.53,\n\
\ \"acc_norm_stderr\": 0.050161355804659205\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.28,\n \"acc_stderr\": 0.04512608598542126,\n \
\ \"acc_norm\": 0.28,\n \"acc_norm_stderr\": 0.04512608598542126\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6820809248554913,\n\
\ \"acc_stderr\": 0.0355068398916558,\n \"acc_norm\": 0.6820809248554913,\n\
\ \"acc_norm_stderr\": 0.0355068398916558\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.4215686274509804,\n \"acc_stderr\": 0.04913595201274498,\n\
\ \"acc_norm\": 0.4215686274509804,\n \"acc_norm_stderr\": 0.04913595201274498\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.76,\n \"acc_stderr\": 0.04292346959909284,\n \"acc_norm\": 0.76,\n\
\ \"acc_norm_stderr\": 0.04292346959909284\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.5574468085106383,\n \"acc_stderr\": 0.03246956919789958,\n\
\ \"acc_norm\": 0.5574468085106383,\n \"acc_norm_stderr\": 0.03246956919789958\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.4649122807017544,\n\
\ \"acc_stderr\": 0.046920083813689104,\n \"acc_norm\": 0.4649122807017544,\n\
\ \"acc_norm_stderr\": 0.046920083813689104\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5586206896551724,\n \"acc_stderr\": 0.04137931034482758,\n\
\ \"acc_norm\": 0.5586206896551724,\n \"acc_norm_stderr\": 0.04137931034482758\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.42063492063492064,\n \"acc_stderr\": 0.025424835086924,\n \"acc_norm\"\
: 0.42063492063492064,\n \"acc_norm_stderr\": 0.025424835086924\n },\n\
\ \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.49206349206349204,\n\
\ \"acc_stderr\": 0.044715725362943486,\n \"acc_norm\": 0.49206349206349204,\n\
\ \"acc_norm_stderr\": 0.044715725362943486\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.33,\n \"acc_stderr\": 0.047258156262526045,\n \
\ \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.047258156262526045\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\"\
: 0.8032258064516129,\n \"acc_stderr\": 0.02261640942074202,\n \"\
acc_norm\": 0.8032258064516129,\n \"acc_norm_stderr\": 0.02261640942074202\n\
\ },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\"\
: 0.49261083743842365,\n \"acc_stderr\": 0.03517603540361008,\n \"\
acc_norm\": 0.49261083743842365,\n \"acc_norm_stderr\": 0.03517603540361008\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.69,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\"\
: 0.69,\n \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7636363636363637,\n \"acc_stderr\": 0.03317505930009181,\n\
\ \"acc_norm\": 0.7636363636363637,\n \"acc_norm_stderr\": 0.03317505930009181\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.7828282828282829,\n \"acc_stderr\": 0.029376616484945633,\n \"\
acc_norm\": 0.7828282828282829,\n \"acc_norm_stderr\": 0.029376616484945633\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.9067357512953368,\n \"acc_stderr\": 0.020986854593289733,\n\
\ \"acc_norm\": 0.9067357512953368,\n \"acc_norm_stderr\": 0.020986854593289733\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.6717948717948717,\n \"acc_stderr\": 0.023807633198657266,\n\
\ \"acc_norm\": 0.6717948717948717,\n \"acc_norm_stderr\": 0.023807633198657266\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.31851851851851853,\n \"acc_stderr\": 0.02840653309060846,\n \
\ \"acc_norm\": 0.31851851851851853,\n \"acc_norm_stderr\": 0.02840653309060846\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.6974789915966386,\n \"acc_stderr\": 0.029837962388291943,\n\
\ \"acc_norm\": 0.6974789915966386,\n \"acc_norm_stderr\": 0.029837962388291943\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.32450331125827814,\n \"acc_stderr\": 0.03822746937658752,\n \"\
acc_norm\": 0.32450331125827814,\n \"acc_norm_stderr\": 0.03822746937658752\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8477064220183487,\n \"acc_stderr\": 0.015405084393157074,\n \"\
acc_norm\": 0.8477064220183487,\n \"acc_norm_stderr\": 0.015405084393157074\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.5416666666666666,\n \"acc_stderr\": 0.03398110890294636,\n \"\
acc_norm\": 0.5416666666666666,\n \"acc_norm_stderr\": 0.03398110890294636\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.8382352941176471,\n \"acc_stderr\": 0.025845017986926917,\n \"\
acc_norm\": 0.8382352941176471,\n \"acc_norm_stderr\": 0.025845017986926917\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.7932489451476793,\n \"acc_stderr\": 0.0263616516683891,\n \
\ \"acc_norm\": 0.7932489451476793,\n \"acc_norm_stderr\": 0.0263616516683891\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.7130044843049327,\n\
\ \"acc_stderr\": 0.030360379710291954,\n \"acc_norm\": 0.7130044843049327,\n\
\ \"acc_norm_stderr\": 0.030360379710291954\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.8015267175572519,\n \"acc_stderr\": 0.034981493854624734,\n\
\ \"acc_norm\": 0.8015267175572519,\n \"acc_norm_stderr\": 0.034981493854624734\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.7933884297520661,\n \"acc_stderr\": 0.03695980128098822,\n \"\
acc_norm\": 0.7933884297520661,\n \"acc_norm_stderr\": 0.03695980128098822\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7685185185185185,\n\
\ \"acc_stderr\": 0.04077494709252627,\n \"acc_norm\": 0.7685185185185185,\n\
\ \"acc_norm_stderr\": 0.04077494709252627\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7914110429447853,\n \"acc_stderr\": 0.031921934489347235,\n\
\ \"acc_norm\": 0.7914110429447853,\n \"acc_norm_stderr\": 0.031921934489347235\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.45535714285714285,\n\
\ \"acc_stderr\": 0.047268355537191,\n \"acc_norm\": 0.45535714285714285,\n\
\ \"acc_norm_stderr\": 0.047268355537191\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.8252427184466019,\n \"acc_stderr\": 0.03760178006026621,\n\
\ \"acc_norm\": 0.8252427184466019,\n \"acc_norm_stderr\": 0.03760178006026621\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8547008547008547,\n\
\ \"acc_stderr\": 0.023086635086841407,\n \"acc_norm\": 0.8547008547008547,\n\
\ \"acc_norm_stderr\": 0.023086635086841407\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.73,\n \"acc_stderr\": 0.044619604333847394,\n \
\ \"acc_norm\": 0.73,\n \"acc_norm_stderr\": 0.044619604333847394\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8250319284802043,\n\
\ \"acc_stderr\": 0.013586619219903338,\n \"acc_norm\": 0.8250319284802043,\n\
\ \"acc_norm_stderr\": 0.013586619219903338\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.7369942196531792,\n \"acc_stderr\": 0.023703099525258172,\n\
\ \"acc_norm\": 0.7369942196531792,\n \"acc_norm_stderr\": 0.023703099525258172\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.4044692737430168,\n\
\ \"acc_stderr\": 0.016414440917293147,\n \"acc_norm\": 0.4044692737430168,\n\
\ \"acc_norm_stderr\": 0.016414440917293147\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.7320261437908496,\n \"acc_stderr\": 0.025360603796242557,\n\
\ \"acc_norm\": 0.7320261437908496,\n \"acc_norm_stderr\": 0.025360603796242557\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.707395498392283,\n\
\ \"acc_stderr\": 0.02583989833487798,\n \"acc_norm\": 0.707395498392283,\n\
\ \"acc_norm_stderr\": 0.02583989833487798\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.7376543209876543,\n \"acc_stderr\": 0.02447722285613511,\n\
\ \"acc_norm\": 0.7376543209876543,\n \"acc_norm_stderr\": 0.02447722285613511\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.46808510638297873,\n \"acc_stderr\": 0.029766675075873866,\n \
\ \"acc_norm\": 0.46808510638297873,\n \"acc_norm_stderr\": 0.029766675075873866\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.47327249022164275,\n\
\ \"acc_stderr\": 0.012751977967676008,\n \"acc_norm\": 0.47327249022164275,\n\
\ \"acc_norm_stderr\": 0.012751977967676008\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.6985294117647058,\n \"acc_stderr\": 0.027875982114273168,\n\
\ \"acc_norm\": 0.6985294117647058,\n \"acc_norm_stderr\": 0.027875982114273168\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6683006535947712,\n \"acc_stderr\": 0.019047485239360378,\n \
\ \"acc_norm\": 0.6683006535947712,\n \"acc_norm_stderr\": 0.019047485239360378\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6727272727272727,\n\
\ \"acc_stderr\": 0.0449429086625209,\n \"acc_norm\": 0.6727272727272727,\n\
\ \"acc_norm_stderr\": 0.0449429086625209\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.7551020408163265,\n \"acc_stderr\": 0.02752963744017493,\n\
\ \"acc_norm\": 0.7551020408163265,\n \"acc_norm_stderr\": 0.02752963744017493\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.845771144278607,\n\
\ \"acc_stderr\": 0.025538433368578334,\n \"acc_norm\": 0.845771144278607,\n\
\ \"acc_norm_stderr\": 0.025538433368578334\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.89,\n \"acc_stderr\": 0.03144660377352203,\n \
\ \"acc_norm\": 0.89,\n \"acc_norm_stderr\": 0.03144660377352203\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5421686746987951,\n\
\ \"acc_stderr\": 0.0387862677100236,\n \"acc_norm\": 0.5421686746987951,\n\
\ \"acc_norm_stderr\": 0.0387862677100236\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8421052631578947,\n \"acc_stderr\": 0.027966785859160882,\n\
\ \"acc_norm\": 0.8421052631578947,\n \"acc_norm_stderr\": 0.027966785859160882\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.3402692778457772,\n\
\ \"mc1_stderr\": 0.016586304901762564,\n \"mc2\": 0.49557311062145515,\n\
\ \"mc2_stderr\": 0.015305674753451043\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7971586424625099,\n \"acc_stderr\": 0.011301439925936648\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.6595905989385898,\n \
\ \"acc_stderr\": 0.013052097103299099\n }\n}\n```"
repo_url: https://huggingface.co/shahzebnaveed/StarlingHermes-2.5-Mistral-7B-slerp
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_02_16T18_34_16.912824
path:
- '**/details_harness|arc:challenge|25_2024-02-16T18-34-16.912824.parquet'
- split: 2024_02_16T18_45_36.411445
path:
- '**/details_harness|arc:challenge|25_2024-02-16T18-45-36.411445.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-02-16T18-45-36.411445.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_02_16T18_34_16.912824
path:
- '**/details_harness|gsm8k|5_2024-02-16T18-34-16.912824.parquet'
- split: 2024_02_16T18_45_36.411445
path:
- '**/details_harness|gsm8k|5_2024-02-16T18-45-36.411445.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-02-16T18-45-36.411445.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_02_16T18_34_16.912824
path:
- '**/details_harness|hellaswag|10_2024-02-16T18-34-16.912824.parquet'
- split: 2024_02_16T18_45_36.411445
path:
- '**/details_harness|hellaswag|10_2024-02-16T18-45-36.411445.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-02-16T18-45-36.411445.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_02_16T18_34_16.912824
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-16T18-34-16.912824.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-16T18-34-16.912824.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-16T18-34-16.912824.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-16T18-34-16.912824.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-16T18-34-16.912824.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-16T18-34-16.912824.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-16T18-34-16.912824.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-16T18-34-16.912824.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-16T18-34-16.912824.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-16T18-34-16.912824.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-16T18-34-16.912824.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-16T18-34-16.912824.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-16T18-34-16.912824.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-16T18-34-16.912824.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-16T18-34-16.912824.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-16T18-34-16.912824.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-16T18-34-16.912824.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-16T18-34-16.912824.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-16T18-34-16.912824.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-16T18-34-16.912824.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-16T18-34-16.912824.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-16T18-34-16.912824.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-16T18-34-16.912824.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-16T18-34-16.912824.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-16T18-34-16.912824.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-16T18-34-16.912824.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-16T18-34-16.912824.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-16T18-34-16.912824.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-16T18-34-16.912824.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-16T18-34-16.912824.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-16T18-34-16.912824.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-16T18-34-16.912824.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-16T18-34-16.912824.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-16T18-34-16.912824.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-02-16T18-34-16.912824.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-16T18-34-16.912824.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-16T18-34-16.912824.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-16T18-34-16.912824.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-02-16T18-34-16.912824.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-02-16T18-34-16.912824.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-16T18-34-16.912824.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-16T18-34-16.912824.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-16T18-34-16.912824.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-16T18-34-16.912824.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-16T18-34-16.912824.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-16T18-34-16.912824.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-16T18-34-16.912824.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-16T18-34-16.912824.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-16T18-34-16.912824.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-16T18-34-16.912824.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-16T18-34-16.912824.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-16T18-34-16.912824.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-16T18-34-16.912824.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-02-16T18-34-16.912824.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-16T18-34-16.912824.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-02-16T18-34-16.912824.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-16T18-34-16.912824.parquet'
- split: 2024_02_16T18_45_36.411445
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-16T18-45-36.411445.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-16T18-45-36.411445.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-16T18-45-36.411445.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-16T18-45-36.411445.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-16T18-45-36.411445.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-16T18-45-36.411445.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-16T18-45-36.411445.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-16T18-45-36.411445.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-16T18-45-36.411445.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-16T18-45-36.411445.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-16T18-45-36.411445.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-16T18-45-36.411445.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-16T18-45-36.411445.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-16T18-45-36.411445.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-16T18-45-36.411445.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-16T18-45-36.411445.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-16T18-45-36.411445.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-16T18-45-36.411445.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-16T18-45-36.411445.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-16T18-45-36.411445.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-16T18-45-36.411445.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-16T18-45-36.411445.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-16T18-45-36.411445.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-16T18-45-36.411445.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-16T18-45-36.411445.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-16T18-45-36.411445.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-16T18-45-36.411445.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-16T18-45-36.411445.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-16T18-45-36.411445.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-16T18-45-36.411445.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-16T18-45-36.411445.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-16T18-45-36.411445.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-16T18-45-36.411445.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-16T18-45-36.411445.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-02-16T18-45-36.411445.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-16T18-45-36.411445.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-16T18-45-36.411445.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-16T18-45-36.411445.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-02-16T18-45-36.411445.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-02-16T18-45-36.411445.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-16T18-45-36.411445.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-16T18-45-36.411445.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-16T18-45-36.411445.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-16T18-45-36.411445.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-16T18-45-36.411445.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-16T18-45-36.411445.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-16T18-45-36.411445.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-16T18-45-36.411445.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-16T18-45-36.411445.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-16T18-45-36.411445.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-16T18-45-36.411445.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-16T18-45-36.411445.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-16T18-45-36.411445.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-02-16T18-45-36.411445.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-16T18-45-36.411445.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-02-16T18-45-36.411445.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-16T18-45-36.411445.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-16T18-45-36.411445.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-16T18-45-36.411445.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-16T18-45-36.411445.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-16T18-45-36.411445.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-16T18-45-36.411445.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-16T18-45-36.411445.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-16T18-45-36.411445.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-16T18-45-36.411445.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-16T18-45-36.411445.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-16T18-45-36.411445.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-16T18-45-36.411445.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-16T18-45-36.411445.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-16T18-45-36.411445.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-16T18-45-36.411445.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-16T18-45-36.411445.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-16T18-45-36.411445.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-16T18-45-36.411445.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-16T18-45-36.411445.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-16T18-45-36.411445.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-16T18-45-36.411445.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-16T18-45-36.411445.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-16T18-45-36.411445.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-16T18-45-36.411445.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-16T18-45-36.411445.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-16T18-45-36.411445.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-16T18-45-36.411445.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-16T18-45-36.411445.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-16T18-45-36.411445.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-16T18-45-36.411445.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-16T18-45-36.411445.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-16T18-45-36.411445.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-16T18-45-36.411445.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-16T18-45-36.411445.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-16T18-45-36.411445.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-02-16T18-45-36.411445.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-16T18-45-36.411445.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-16T18-45-36.411445.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-16T18-45-36.411445.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-02-16T18-45-36.411445.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-02-16T18-45-36.411445.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-16T18-45-36.411445.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-16T18-45-36.411445.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-16T18-45-36.411445.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-16T18-45-36.411445.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-16T18-45-36.411445.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-16T18-45-36.411445.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-16T18-45-36.411445.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-16T18-45-36.411445.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-16T18-45-36.411445.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-16T18-45-36.411445.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-16T18-45-36.411445.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-16T18-45-36.411445.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-16T18-45-36.411445.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-02-16T18-45-36.411445.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-16T18-45-36.411445.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-02-16T18-45-36.411445.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-16T18-45-36.411445.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_02_16T18_34_16.912824
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-16T18-34-16.912824.parquet'
- split: 2024_02_16T18_45_36.411445
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-16T18-45-36.411445.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-16T18-45-36.411445.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_02_16T18_34_16.912824
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-16T18-34-16.912824.parquet'
- split: 2024_02_16T18_45_36.411445
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-16T18-45-36.411445.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-16T18-45-36.411445.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_02_16T18_34_16.912824
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-16T18-34-16.912824.parquet'
- split: 2024_02_16T18_45_36.411445
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-16T18-45-36.411445.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-16T18-45-36.411445.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_02_16T18_34_16.912824
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-16T18-34-16.912824.parquet'
- split: 2024_02_16T18_45_36.411445
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-16T18-45-36.411445.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-16T18-45-36.411445.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_02_16T18_34_16.912824
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-16T18-34-16.912824.parquet'
- split: 2024_02_16T18_45_36.411445
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-16T18-45-36.411445.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-16T18-45-36.411445.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_02_16T18_34_16.912824
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-16T18-34-16.912824.parquet'
- split: 2024_02_16T18_45_36.411445
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-16T18-45-36.411445.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-16T18-45-36.411445.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_02_16T18_34_16.912824
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-16T18-34-16.912824.parquet'
- split: 2024_02_16T18_45_36.411445
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-16T18-45-36.411445.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-16T18-45-36.411445.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_02_16T18_34_16.912824
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-16T18-34-16.912824.parquet'
- split: 2024_02_16T18_45_36.411445
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-16T18-45-36.411445.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-16T18-45-36.411445.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_02_16T18_34_16.912824
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-16T18-34-16.912824.parquet'
- split: 2024_02_16T18_45_36.411445
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-16T18-45-36.411445.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-16T18-45-36.411445.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_02_16T18_34_16.912824
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-16T18-34-16.912824.parquet'
- split: 2024_02_16T18_45_36.411445
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-16T18-45-36.411445.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-16T18-45-36.411445.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_02_16T18_34_16.912824
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-16T18-34-16.912824.parquet'
- split: 2024_02_16T18_45_36.411445
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-16T18-45-36.411445.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-16T18-45-36.411445.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_02_16T18_34_16.912824
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-16T18-34-16.912824.parquet'
- split: 2024_02_16T18_45_36.411445
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-16T18-45-36.411445.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-16T18-45-36.411445.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_02_16T18_34_16.912824
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-16T18-34-16.912824.parquet'
- split: 2024_02_16T18_45_36.411445
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-16T18-45-36.411445.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-16T18-45-36.411445.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_02_16T18_34_16.912824
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-16T18-34-16.912824.parquet'
- split: 2024_02_16T18_45_36.411445
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-16T18-45-36.411445.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-16T18-45-36.411445.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_02_16T18_34_16.912824
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-16T18-34-16.912824.parquet'
- split: 2024_02_16T18_45_36.411445
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-16T18-45-36.411445.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-16T18-45-36.411445.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_02_16T18_34_16.912824
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-16T18-34-16.912824.parquet'
- split: 2024_02_16T18_45_36.411445
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-16T18-45-36.411445.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-16T18-45-36.411445.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_02_16T18_34_16.912824
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-16T18-34-16.912824.parquet'
- split: 2024_02_16T18_45_36.411445
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-16T18-45-36.411445.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-16T18-45-36.411445.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_02_16T18_34_16.912824
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-16T18-34-16.912824.parquet'
- split: 2024_02_16T18_45_36.411445
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-16T18-45-36.411445.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-16T18-45-36.411445.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_02_16T18_34_16.912824
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-16T18-34-16.912824.parquet'
- split: 2024_02_16T18_45_36.411445
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-16T18-45-36.411445.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-16T18-45-36.411445.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_02_16T18_34_16.912824
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-16T18-34-16.912824.parquet'
- split: 2024_02_16T18_45_36.411445
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-16T18-45-36.411445.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-16T18-45-36.411445.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_02_16T18_34_16.912824
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-16T18-34-16.912824.parquet'
- split: 2024_02_16T18_45_36.411445
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-16T18-45-36.411445.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-16T18-45-36.411445.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_02_16T18_34_16.912824
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-16T18-34-16.912824.parquet'
- split: 2024_02_16T18_45_36.411445
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-16T18-45-36.411445.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-16T18-45-36.411445.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_02_16T18_34_16.912824
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-16T18-34-16.912824.parquet'
- split: 2024_02_16T18_45_36.411445
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-16T18-45-36.411445.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-16T18-45-36.411445.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_02_16T18_34_16.912824
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-16T18-34-16.912824.parquet'
- split: 2024_02_16T18_45_36.411445
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-16T18-45-36.411445.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-16T18-45-36.411445.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_02_16T18_34_16.912824
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-16T18-34-16.912824.parquet'
- split: 2024_02_16T18_45_36.411445
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-16T18-45-36.411445.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-16T18-45-36.411445.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_02_16T18_34_16.912824
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-16T18-34-16.912824.parquet'
- split: 2024_02_16T18_45_36.411445
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-16T18-45-36.411445.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-16T18-45-36.411445.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_02_16T18_34_16.912824
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-16T18-34-16.912824.parquet'
- split: 2024_02_16T18_45_36.411445
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-16T18-45-36.411445.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-16T18-45-36.411445.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_02_16T18_34_16.912824
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-16T18-34-16.912824.parquet'
- split: 2024_02_16T18_45_36.411445
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-16T18-45-36.411445.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-16T18-45-36.411445.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_02_16T18_34_16.912824
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-16T18-34-16.912824.parquet'
- split: 2024_02_16T18_45_36.411445
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-16T18-45-36.411445.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-16T18-45-36.411445.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_02_16T18_34_16.912824
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-16T18-34-16.912824.parquet'
- split: 2024_02_16T18_45_36.411445
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-16T18-45-36.411445.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-16T18-45-36.411445.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_02_16T18_34_16.912824
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-16T18-34-16.912824.parquet'
- split: 2024_02_16T18_45_36.411445
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-16T18-45-36.411445.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-16T18-45-36.411445.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_02_16T18_34_16.912824
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-16T18-34-16.912824.parquet'
- split: 2024_02_16T18_45_36.411445
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-16T18-45-36.411445.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-16T18-45-36.411445.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_02_16T18_34_16.912824
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-16T18-34-16.912824.parquet'
- split: 2024_02_16T18_45_36.411445
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-16T18-45-36.411445.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-16T18-45-36.411445.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_02_16T18_34_16.912824
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-16T18-34-16.912824.parquet'
- split: 2024_02_16T18_45_36.411445
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-16T18-45-36.411445.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-16T18-45-36.411445.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_02_16T18_34_16.912824
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-02-16T18-34-16.912824.parquet'
- split: 2024_02_16T18_45_36.411445
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-02-16T18-45-36.411445.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-02-16T18-45-36.411445.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_02_16T18_34_16.912824
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-16T18-34-16.912824.parquet'
- split: 2024_02_16T18_45_36.411445
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-16T18-45-36.411445.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-16T18-45-36.411445.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_02_16T18_34_16.912824
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-16T18-34-16.912824.parquet'
- split: 2024_02_16T18_45_36.411445
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-16T18-45-36.411445.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-16T18-45-36.411445.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_02_16T18_34_16.912824
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-16T18-34-16.912824.parquet'
- split: 2024_02_16T18_45_36.411445
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-16T18-45-36.411445.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-16T18-45-36.411445.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_02_16T18_34_16.912824
path:
- '**/details_harness|hendrycksTest-management|5_2024-02-16T18-34-16.912824.parquet'
- split: 2024_02_16T18_45_36.411445
path:
- '**/details_harness|hendrycksTest-management|5_2024-02-16T18-45-36.411445.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-02-16T18-45-36.411445.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_02_16T18_34_16.912824
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-02-16T18-34-16.912824.parquet'
- split: 2024_02_16T18_45_36.411445
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-02-16T18-45-36.411445.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-02-16T18-45-36.411445.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_02_16T18_34_16.912824
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-16T18-34-16.912824.parquet'
- split: 2024_02_16T18_45_36.411445
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-16T18-45-36.411445.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-16T18-45-36.411445.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_02_16T18_34_16.912824
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-16T18-34-16.912824.parquet'
- split: 2024_02_16T18_45_36.411445
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-16T18-45-36.411445.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-16T18-45-36.411445.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_02_16T18_34_16.912824
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-16T18-34-16.912824.parquet'
- split: 2024_02_16T18_45_36.411445
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-16T18-45-36.411445.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-16T18-45-36.411445.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_02_16T18_34_16.912824
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-16T18-34-16.912824.parquet'
- split: 2024_02_16T18_45_36.411445
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-16T18-45-36.411445.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-16T18-45-36.411445.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_02_16T18_34_16.912824
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-16T18-34-16.912824.parquet'
- split: 2024_02_16T18_45_36.411445
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-16T18-45-36.411445.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-16T18-45-36.411445.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_02_16T18_34_16.912824
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-16T18-34-16.912824.parquet'
- split: 2024_02_16T18_45_36.411445
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-16T18-45-36.411445.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-16T18-45-36.411445.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_02_16T18_34_16.912824
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-16T18-34-16.912824.parquet'
- split: 2024_02_16T18_45_36.411445
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-16T18-45-36.411445.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-16T18-45-36.411445.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_02_16T18_34_16.912824
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-16T18-34-16.912824.parquet'
- split: 2024_02_16T18_45_36.411445
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-16T18-45-36.411445.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-16T18-45-36.411445.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_02_16T18_34_16.912824
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-16T18-34-16.912824.parquet'
- split: 2024_02_16T18_45_36.411445
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-16T18-45-36.411445.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-16T18-45-36.411445.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_02_16T18_34_16.912824
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-16T18-34-16.912824.parquet'
- split: 2024_02_16T18_45_36.411445
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-16T18-45-36.411445.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-16T18-45-36.411445.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_02_16T18_34_16.912824
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-16T18-34-16.912824.parquet'
- split: 2024_02_16T18_45_36.411445
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-16T18-45-36.411445.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-16T18-45-36.411445.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_02_16T18_34_16.912824
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-16T18-34-16.912824.parquet'
- split: 2024_02_16T18_45_36.411445
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-16T18-45-36.411445.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-16T18-45-36.411445.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_02_16T18_34_16.912824
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-16T18-34-16.912824.parquet'
- split: 2024_02_16T18_45_36.411445
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-16T18-45-36.411445.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-16T18-45-36.411445.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_02_16T18_34_16.912824
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-02-16T18-34-16.912824.parquet'
- split: 2024_02_16T18_45_36.411445
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-02-16T18-45-36.411445.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-02-16T18-45-36.411445.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_02_16T18_34_16.912824
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-16T18-34-16.912824.parquet'
- split: 2024_02_16T18_45_36.411445
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-16T18-45-36.411445.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-16T18-45-36.411445.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_02_16T18_34_16.912824
path:
- '**/details_harness|hendrycksTest-virology|5_2024-02-16T18-34-16.912824.parquet'
- split: 2024_02_16T18_45_36.411445
path:
- '**/details_harness|hendrycksTest-virology|5_2024-02-16T18-45-36.411445.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-02-16T18-45-36.411445.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_02_16T18_34_16.912824
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-16T18-34-16.912824.parquet'
- split: 2024_02_16T18_45_36.411445
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-16T18-45-36.411445.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-16T18-45-36.411445.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_02_16T18_34_16.912824
path:
- '**/details_harness|truthfulqa:mc|0_2024-02-16T18-34-16.912824.parquet'
- split: 2024_02_16T18_45_36.411445
path:
- '**/details_harness|truthfulqa:mc|0_2024-02-16T18-45-36.411445.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-02-16T18-45-36.411445.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_02_16T18_34_16.912824
path:
- '**/details_harness|winogrande|5_2024-02-16T18-34-16.912824.parquet'
- split: 2024_02_16T18_45_36.411445
path:
- '**/details_harness|winogrande|5_2024-02-16T18-45-36.411445.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-02-16T18-45-36.411445.parquet'
- config_name: results
data_files:
- split: 2024_02_16T18_34_16.912824
path:
- results_2024-02-16T18-34-16.912824.parquet
- split: 2024_02_16T18_45_36.411445
path:
- results_2024-02-16T18-45-36.411445.parquet
- split: latest
path:
- results_2024-02-16T18-45-36.411445.parquet
---
# Dataset Card for Evaluation run of shahzebnaveed/StarlingHermes-2.5-Mistral-7B-slerp
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [shahzebnaveed/StarlingHermes-2.5-Mistral-7B-slerp](https://huggingface.co/shahzebnaveed/StarlingHermes-2.5-Mistral-7B-slerp) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_shahzebnaveed__StarlingHermes-2.5-Mistral-7B-slerp",
"harness_winogrande_5",
split="train")
```
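A minimal sketch for pulling the aggregated scores instead of the per-sample details, using the "results" configuration and the "latest" split listed in this card's configuration section (indexing into the first row is just one way to inspect it):
```python
from datasets import load_dataset

# Aggregated metrics for this model; "latest" points at the most recent run.
results = load_dataset(
    "open-llm-leaderboard/details_shahzebnaveed__StarlingHermes-2.5-Mistral-7B-slerp",
    "results",
    split="latest",
)
print(results[0])  # the row for the latest run, holding the aggregated scores shown below
```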
## Latest results
These are the [latest results from run 2024-02-16T18:45:36.411445](https://huggingface.co/datasets/open-llm-leaderboard/details_shahzebnaveed__StarlingHermes-2.5-Mistral-7B-slerp/blob/main/results_2024-02-16T18-45-36.411445.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each of them in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6499656745636075,
"acc_stderr": 0.03190074035438905,
"acc_norm": 0.6509134828280464,
"acc_norm_stderr": 0.032545406816765036,
"mc1": 0.3402692778457772,
"mc1_stderr": 0.016586304901762564,
"mc2": 0.49557311062145515,
"mc2_stderr": 0.015305674753451043
},
"harness|arc:challenge|25": {
"acc": 0.6313993174061433,
"acc_stderr": 0.014097810678042198,
"acc_norm": 0.6604095563139932,
"acc_norm_stderr": 0.013839039762820169
},
"harness|hellaswag|10": {
"acc": 0.6680940051782513,
"acc_stderr": 0.0046993506536956225,
"acc_norm": 0.851822346146186,
"acc_norm_stderr": 0.0035454991695580535
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.35,
"acc_stderr": 0.0479372485441102,
"acc_norm": 0.35,
"acc_norm_stderr": 0.0479372485441102
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.5925925925925926,
"acc_stderr": 0.04244633238353227,
"acc_norm": 0.5925925925925926,
"acc_norm_stderr": 0.04244633238353227
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.7171052631578947,
"acc_stderr": 0.03665349695640767,
"acc_norm": 0.7171052631578947,
"acc_norm_stderr": 0.03665349695640767
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.59,
"acc_stderr": 0.04943110704237102,
"acc_norm": 0.59,
"acc_norm_stderr": 0.04943110704237102
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.7018867924528301,
"acc_stderr": 0.028152837942493875,
"acc_norm": 0.7018867924528301,
"acc_norm_stderr": 0.028152837942493875
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7777777777777778,
"acc_stderr": 0.03476590104304134,
"acc_norm": 0.7777777777777778,
"acc_norm_stderr": 0.03476590104304134
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.46,
"acc_stderr": 0.05009082659620332,
"acc_norm": 0.46,
"acc_norm_stderr": 0.05009082659620332
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.53,
"acc_stderr": 0.050161355804659205,
"acc_norm": 0.53,
"acc_norm_stderr": 0.050161355804659205
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.28,
"acc_stderr": 0.04512608598542126,
"acc_norm": 0.28,
"acc_norm_stderr": 0.04512608598542126
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6820809248554913,
"acc_stderr": 0.0355068398916558,
"acc_norm": 0.6820809248554913,
"acc_norm_stderr": 0.0355068398916558
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.4215686274509804,
"acc_stderr": 0.04913595201274498,
"acc_norm": 0.4215686274509804,
"acc_norm_stderr": 0.04913595201274498
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.76,
"acc_stderr": 0.04292346959909284,
"acc_norm": 0.76,
"acc_norm_stderr": 0.04292346959909284
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5574468085106383,
"acc_stderr": 0.03246956919789958,
"acc_norm": 0.5574468085106383,
"acc_norm_stderr": 0.03246956919789958
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.4649122807017544,
"acc_stderr": 0.046920083813689104,
"acc_norm": 0.4649122807017544,
"acc_norm_stderr": 0.046920083813689104
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5586206896551724,
"acc_stderr": 0.04137931034482758,
"acc_norm": 0.5586206896551724,
"acc_norm_stderr": 0.04137931034482758
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.42063492063492064,
"acc_stderr": 0.025424835086924,
"acc_norm": 0.42063492063492064,
"acc_norm_stderr": 0.025424835086924
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.49206349206349204,
"acc_stderr": 0.044715725362943486,
"acc_norm": 0.49206349206349204,
"acc_norm_stderr": 0.044715725362943486
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.33,
"acc_stderr": 0.047258156262526045,
"acc_norm": 0.33,
"acc_norm_stderr": 0.047258156262526045
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.8032258064516129,
"acc_stderr": 0.02261640942074202,
"acc_norm": 0.8032258064516129,
"acc_norm_stderr": 0.02261640942074202
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.49261083743842365,
"acc_stderr": 0.03517603540361008,
"acc_norm": 0.49261083743842365,
"acc_norm_stderr": 0.03517603540361008
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.69,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.69,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7636363636363637,
"acc_stderr": 0.03317505930009181,
"acc_norm": 0.7636363636363637,
"acc_norm_stderr": 0.03317505930009181
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7828282828282829,
"acc_stderr": 0.029376616484945633,
"acc_norm": 0.7828282828282829,
"acc_norm_stderr": 0.029376616484945633
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.9067357512953368,
"acc_stderr": 0.020986854593289733,
"acc_norm": 0.9067357512953368,
"acc_norm_stderr": 0.020986854593289733
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6717948717948717,
"acc_stderr": 0.023807633198657266,
"acc_norm": 0.6717948717948717,
"acc_norm_stderr": 0.023807633198657266
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.31851851851851853,
"acc_stderr": 0.02840653309060846,
"acc_norm": 0.31851851851851853,
"acc_norm_stderr": 0.02840653309060846
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6974789915966386,
"acc_stderr": 0.029837962388291943,
"acc_norm": 0.6974789915966386,
"acc_norm_stderr": 0.029837962388291943
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.32450331125827814,
"acc_stderr": 0.03822746937658752,
"acc_norm": 0.32450331125827814,
"acc_norm_stderr": 0.03822746937658752
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8477064220183487,
"acc_stderr": 0.015405084393157074,
"acc_norm": 0.8477064220183487,
"acc_norm_stderr": 0.015405084393157074
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5416666666666666,
"acc_stderr": 0.03398110890294636,
"acc_norm": 0.5416666666666666,
"acc_norm_stderr": 0.03398110890294636
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8382352941176471,
"acc_stderr": 0.025845017986926917,
"acc_norm": 0.8382352941176471,
"acc_norm_stderr": 0.025845017986926917
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7932489451476793,
"acc_stderr": 0.0263616516683891,
"acc_norm": 0.7932489451476793,
"acc_norm_stderr": 0.0263616516683891
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.7130044843049327,
"acc_stderr": 0.030360379710291954,
"acc_norm": 0.7130044843049327,
"acc_norm_stderr": 0.030360379710291954
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.8015267175572519,
"acc_stderr": 0.034981493854624734,
"acc_norm": 0.8015267175572519,
"acc_norm_stderr": 0.034981493854624734
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7933884297520661,
"acc_stderr": 0.03695980128098822,
"acc_norm": 0.7933884297520661,
"acc_norm_stderr": 0.03695980128098822
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7685185185185185,
"acc_stderr": 0.04077494709252627,
"acc_norm": 0.7685185185185185,
"acc_norm_stderr": 0.04077494709252627
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7914110429447853,
"acc_stderr": 0.031921934489347235,
"acc_norm": 0.7914110429447853,
"acc_norm_stderr": 0.031921934489347235
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.45535714285714285,
"acc_stderr": 0.047268355537191,
"acc_norm": 0.45535714285714285,
"acc_norm_stderr": 0.047268355537191
},
"harness|hendrycksTest-management|5": {
"acc": 0.8252427184466019,
"acc_stderr": 0.03760178006026621,
"acc_norm": 0.8252427184466019,
"acc_norm_stderr": 0.03760178006026621
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8547008547008547,
"acc_stderr": 0.023086635086841407,
"acc_norm": 0.8547008547008547,
"acc_norm_stderr": 0.023086635086841407
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.73,
"acc_stderr": 0.044619604333847394,
"acc_norm": 0.73,
"acc_norm_stderr": 0.044619604333847394
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8250319284802043,
"acc_stderr": 0.013586619219903338,
"acc_norm": 0.8250319284802043,
"acc_norm_stderr": 0.013586619219903338
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7369942196531792,
"acc_stderr": 0.023703099525258172,
"acc_norm": 0.7369942196531792,
"acc_norm_stderr": 0.023703099525258172
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.4044692737430168,
"acc_stderr": 0.016414440917293147,
"acc_norm": 0.4044692737430168,
"acc_norm_stderr": 0.016414440917293147
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7320261437908496,
"acc_stderr": 0.025360603796242557,
"acc_norm": 0.7320261437908496,
"acc_norm_stderr": 0.025360603796242557
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.707395498392283,
"acc_stderr": 0.02583989833487798,
"acc_norm": 0.707395498392283,
"acc_norm_stderr": 0.02583989833487798
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7376543209876543,
"acc_stderr": 0.02447722285613511,
"acc_norm": 0.7376543209876543,
"acc_norm_stderr": 0.02447722285613511
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.46808510638297873,
"acc_stderr": 0.029766675075873866,
"acc_norm": 0.46808510638297873,
"acc_norm_stderr": 0.029766675075873866
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.47327249022164275,
"acc_stderr": 0.012751977967676008,
"acc_norm": 0.47327249022164275,
"acc_norm_stderr": 0.012751977967676008
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6985294117647058,
"acc_stderr": 0.027875982114273168,
"acc_norm": 0.6985294117647058,
"acc_norm_stderr": 0.027875982114273168
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6683006535947712,
"acc_stderr": 0.019047485239360378,
"acc_norm": 0.6683006535947712,
"acc_norm_stderr": 0.019047485239360378
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6727272727272727,
"acc_stderr": 0.0449429086625209,
"acc_norm": 0.6727272727272727,
"acc_norm_stderr": 0.0449429086625209
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7551020408163265,
"acc_stderr": 0.02752963744017493,
"acc_norm": 0.7551020408163265,
"acc_norm_stderr": 0.02752963744017493
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.845771144278607,
"acc_stderr": 0.025538433368578334,
"acc_norm": 0.845771144278607,
"acc_norm_stderr": 0.025538433368578334
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.89,
"acc_stderr": 0.03144660377352203,
"acc_norm": 0.89,
"acc_norm_stderr": 0.03144660377352203
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5421686746987951,
"acc_stderr": 0.0387862677100236,
"acc_norm": 0.5421686746987951,
"acc_norm_stderr": 0.0387862677100236
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8421052631578947,
"acc_stderr": 0.027966785859160882,
"acc_norm": 0.8421052631578947,
"acc_norm_stderr": 0.027966785859160882
},
"harness|truthfulqa:mc|0": {
"mc1": 0.3402692778457772,
"mc1_stderr": 0.016586304901762564,
"mc2": 0.49557311062145515,
"mc2_stderr": 0.015305674753451043
},
"harness|winogrande|5": {
"acc": 0.7971586424625099,
"acc_stderr": 0.011301439925936648
},
"harness|gsm8k|5": {
"acc": 0.6595905989385898,
"acc_stderr": 0.013052097103299099
}
}
```
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
malhajar/LessWrong-Amplify-Instruct-tr | ---
dataset_info:
features:
- name: source
dtype: string
- name: conversation
dtype: string
- name: conversation-turkish
dtype: string
splits:
- name: train
num_bytes: 15904797
num_examples: 663
download_size: 8637588
dataset_size: 15904797
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
nthakur/msmarco-passage-sampled-10k | ---
language:
- en
license: cc-by-sa-3.0
task_categories:
- text-retrieval
source_datasets:
- Tevatron/msmarco-passage
---
# nthakur/msmarco-passage-sampled-10k
This is a set of 10k randomly sampled training pairs from the Tevatron [msmarco-passage](https://huggingface.co/datasets/Tevatron/msmarco-passage) dataset, intended for debugging and for training models on a smaller subset of the MS MARCO training data.
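The pairs can be loaded directly with the Hugging Face `datasets` library. A minimal sketch (assuming a single `train` split; the column layout is expected to mirror the upstream Tevatron/msmarco-passage schema, so check `column_names` before relying on specific fields):
```python
from datasets import load_dataset

# Load the 10k sampled training pairs (sketch; assumes a "train" split).
dataset = load_dataset("nthakur/msmarco-passage-sampled-10k", split="train")

print(len(dataset))           # expected: ~10k pairs
print(dataset.column_names)   # verify the schema before use
print(dataset[0])             # inspect one training pair
```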
## Citing & Authors
Have a look at [Tevatron](https://github.com/texttron/tevatron).
<!--- Describe where people can find more information --> |
datahrvoje/twitter_dataset_1713039372 | ---
dataset_info:
features:
- name: id
dtype: string
- name: tweet_content
dtype: string
- name: user_name
dtype: string
- name: user_id
dtype: string
- name: created_at
dtype: string
- name: url
dtype: string
- name: favourite_count
dtype: int64
- name: scraped_at
dtype: string
- name: image_urls
dtype: string
splits:
- name: train
num_bytes: 20518
num_examples: 45
download_size: 12011
dataset_size: 20518
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
Dampish/MPTE_dante | ---
dataset_info:
features:
- name: output
dtype: string
- name: instruction
dtype: string
- name: input
dtype: string
- name: input_ids
sequence: int32
- name: attention_mask
sequence: int8
splits:
- name: train
num_bytes: 3116308
num_examples: 300
download_size: 885170
dataset_size: 3116308
---
# Dataset Card for "MPTE_dante"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
SonJS/test | ---
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 2486
num_examples: 12
download_size: 2630
dataset_size: 2486
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
CyberHarem/ahri_leagueoflegends | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of ahri (League of Legends)
This is the dataset of ahri (League of Legends), containing 500 images and their tags.
The core tags of this character are `animal_ears, fox_ears, long_hair, breasts, facial_mark, tail, fox_tail, large_breasts, yellow_eyes, multiple_tails, black_hair, bangs`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:----------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 500 | 828.13 MiB | [Download](https://huggingface.co/datasets/CyberHarem/ahri_leagueoflegends/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 500 | 472.93 MiB | [Download](https://huggingface.co/datasets/CyberHarem/ahri_leagueoflegends/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 1180 | 937.74 MiB | [Download](https://huggingface.co/datasets/CyberHarem/ahri_leagueoflegends/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 500 | 729.20 MiB | [Download](https://huggingface.co/datasets/CyberHarem/ahri_leagueoflegends/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 1180 | 1.28 GiB | [Download](https://huggingface.co/datasets/CyberHarem/ahri_leagueoflegends/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
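The IMG+TXT packages listed above can be fetched directly from this repository. Below is a minimal sketch (the archive layout, i.e. one `.txt` tag file next to each image, is an assumption based on the package type):
```python
import os
import zipfile

from huggingface_hub import hf_hub_download

# download one of the IMG+TXT packages listed above (here: the 800px variant)
zip_file = hf_hub_download(
    repo_id='CyberHarem/ahri_leagueoflegends',
    repo_type='dataset',
    filename='dataset-800.zip',
)

# extract files to your directory
dataset_dir = 'dataset_800'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
    zf.extractall(dataset_dir)

# assumption: each image is paired with a same-named .txt file holding its tags
for name in sorted(os.listdir(dataset_dir)):
    if name.endswith('.txt'):
        with open(os.path.join(dataset_dir, name), 'r', encoding='utf-8') as f:
            print(name, '->', f.read().strip())
        break
```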
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need this, just run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/ahri_leagueoflegends',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering results; some outfits may be mined from these clusters.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 17 |  |  |  |  |  | 1girl, alternate_costume, alternate_hair_color, looking_at_viewer, solo, cleavage, smile, whisker_markings, blonde_hair, peaked_cap, belt, cosplay, idol, open_jacket, short_shorts, hat_bow, legwear_under_shorts, epaulettes, zipper, heart_necklace, long_sleeves, black_pantyhose, signature, swept_bangs, cowboy_shot, headset, one_eye_closed, open_mouth, very_long_hair, artist_name, brown_pantyhose, standing |
| 1 | 13 |  |  |  |  |  | 1girl, bare_shoulders, detached_sleeves, korean_clothes, solo, whisker_markings, cleavage, energy_ball, looking_at_viewer, parted_lips |
| 2 | 7 |  |  |  |  |  | 1girl, bare_shoulders, cleavage, detached_sleeves, korean_clothes, simple_background, solo, white_background, whisker_markings |
| 3 | 10 |  |  |  |  |  | 1girl, blonde_hair, k/da_(league_of_legends), solo, whisker_markings, hairclip, looking_at_viewer, midriff, black_skirt, blue_eyes, crop_top, parted_lips, pink_hair, artist_name, bow, juliet_sleeves, multicolored_hair, navel, black_thighhighs, makeup, smile |
| 4 | 43 |  |  |  |  |  | 1girl, blonde_hair, heart, k/da_(league_of_legends), solo, bracelet, looking_at_viewer, choker, whisker_markings, bare_shoulders, cleavage, earrings, idol, swept_bangs, black_thighhighs, smile, parted_lips, leotard, makeup |
| 5 | 5 |  |  |  |  |  | 1girl, cleavage, looking_at_viewer, navel, red_bikini, solo, whisker_markings, collarbone, sitting, smile, artist_name, medium_breasts, nail_polish, nose, parted_lips, signature, slit_pupils, very_long_hair |
| 6 | 13 |  |  |  |  |  | 1girl, bare_shoulders, cleavage, hair_bell, looking_at_viewer, pink_hair, solo, whisker_markings, smile, blue_eyes, kimono, animal_ear_fluff, hair_between_eyes, hair_ribbon, off_shoulder, pink_nails, upper_body, blush, choker, fingernails, low_neckline, nail_polish |
| 7 | 8 |  |  |  |  |  | 1girl, bare_shoulders, hair_ornament, skirt, solo, star_guardian_(league_of_legends), blonde_hair, detached_sleeves, magical_girl, looking_at_viewer, white_thighhighs, heart, purple_eyes, choker, medium_breasts, zettai_ryouiki, fox_girl, full_body, high_heels, one_eye_closed, parted_lips, pink_hair, smile, star_(symbol), thigh_boots |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | alternate_costume | alternate_hair_color | looking_at_viewer | solo | cleavage | smile | whisker_markings | blonde_hair | peaked_cap | belt | cosplay | idol | open_jacket | short_shorts | hat_bow | legwear_under_shorts | epaulettes | zipper | heart_necklace | long_sleeves | black_pantyhose | signature | swept_bangs | cowboy_shot | headset | one_eye_closed | open_mouth | very_long_hair | artist_name | brown_pantyhose | standing | bare_shoulders | detached_sleeves | korean_clothes | energy_ball | parted_lips | simple_background | white_background | k/da_(league_of_legends) | hairclip | midriff | black_skirt | blue_eyes | crop_top | pink_hair | bow | juliet_sleeves | multicolored_hair | navel | black_thighhighs | makeup | heart | bracelet | choker | earrings | leotard | red_bikini | collarbone | sitting | medium_breasts | nail_polish | nose | slit_pupils | hair_bell | kimono | animal_ear_fluff | hair_between_eyes | hair_ribbon | off_shoulder | pink_nails | upper_body | blush | fingernails | low_neckline | hair_ornament | skirt | star_guardian_(league_of_legends) | magical_girl | white_thighhighs | purple_eyes | zettai_ryouiki | fox_girl | full_body | high_heels | star_(symbol) | thigh_boots |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:--------------------|:-----------------------|:--------------------|:-------|:-----------|:--------|:-------------------|:--------------|:-------------|:-------|:----------|:-------|:--------------|:---------------|:----------|:-----------------------|:-------------|:---------|:-----------------|:---------------|:------------------|:------------|:--------------|:--------------|:----------|:-----------------|:-------------|:-----------------|:--------------|:------------------|:-----------|:-----------------|:-------------------|:-----------------|:--------------|:--------------|:--------------------|:-------------------|:---------------------------|:-----------|:----------|:--------------|:------------|:-----------|:------------|:------|:-----------------|:--------------------|:--------|:-------------------|:---------|:--------|:-----------|:---------|:-----------|:----------|:-------------|:-------------|:----------|:-----------------|:--------------|:-------|:--------------|:------------|:---------|:-------------------|:--------------------|:--------------|:---------------|:-------------|:-------------|:--------|:--------------|:---------------|:----------------|:--------|:------------------------------------|:---------------|:-------------------|:--------------|:-----------------|:-----------|:------------|:-------------|:----------------|:--------------|
| 0 | 17 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 1 | 13 |  |  |  |  |  | X | | | X | X | X | | X | | | | | | | | | | | | | | | | | | | | | | | | | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 2 | 7 |  |  |  |  |  | X | | | | X | X | | X | | | | | | | | | | | | | | | | | | | | | | | | | X | X | X | | | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 3 | 10 |  |  |  |  |  | X | | | X | X | | X | X | X | | | | | | | | | | | | | | | | | | | | | X | | | | | | | X | | | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 4 | 43 |  |  |  |  |  | X | | | X | X | X | X | X | X | | | | X | | | | | | | | | | | X | | | | | | | | | X | | | | X | | | X | | | | | | | | | | | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 5 | 5 |  |  |  |  |  | X | | | X | X | X | X | X | | | | | | | | | | | | | | | X | | | | | | X | X | | | | | | | X | | | | | | | | | | | | | X | | | | | | | | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | |
| 6 | 13 |  |  |  |  |  | X | | | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | X | | | | | | | | | | | X | | X | | | | | | | | | X | | | | | | | X | | | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | |
| 7 | 8 |  |  |  |  |  | X | | | X | X | | X | | X | | | | | | | | | | | | | | | | | | X | | | | | | X | X | | | X | | | | | | | | | X | | | | | | | X | | X | | | | | | X | | | | | | | | | | | | | | | X | X | X | X | X | X | X | X | X | X | X | X |
|
income/arguana-top-20-gen-queries | ---
annotations_creators: []
language_creators: []
language:
- en
license:
- cc-by-sa-4.0
multilinguality:
- monolingual
paperswithcode_id: beir
pretty_name: BEIR Benchmark
size_categories:
msmarco:
- 1M<n<10M
trec-covid:
- 100k<n<1M
nfcorpus:
- 1K<n<10K
nq:
- 1M<n<10M
hotpotqa:
- 1M<n<10M
fiqa:
- 10K<n<100K
arguana:
- 1K<n<10K
touche-2020:
- 100K<n<1M
cqadupstack:
- 100K<n<1M
quora:
- 100K<n<1M
dbpedia:
- 1M<n<10M
scidocs:
- 10K<n<100K
fever:
- 1M<n<10M
climate-fever:
- 1M<n<10M
scifact:
- 1K<n<10K
source_datasets: []
task_categories:
- text-retrieval
---
# ArguAna: Top-20 generated queries (BEIR Benchmark)
This HF dataset contains the top-20 synthetic queries generated for each passage in the ArguAna dataset from the BEIR benchmark.
- DocT5query model used: [BeIR/query-gen-msmarco-t5-base-v1](https://huggingface.co/BeIR/query-gen-msmarco-t5-base-v1)
- id (str): unique document id in ArguAna in the BEIR benchmark (`corpus.jsonl`).
- Questions generated: 20
- Code used for generation: [evaluate_anserini_docT5query_parallel.py](https://github.com/beir-cellar/beir/blob/main/examples/retrieval/evaluation/sparse/evaluate_anserini_docT5query_parallel.py)
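As a quick way to inspect the generated queries, the dataset can be loaded with the Hugging Face `datasets` library. A minimal sketch (split and column names other than the `id` field documented above are assumptions, so print them first):
```python
from datasets import load_dataset

# Load the top-20 generated queries per passage (sketch only).
ds = load_dataset("income/arguana-top-20-gen-queries")

print(ds)                         # available splits and columns
first_split = list(ds.keys())[0]
print(ds[first_split][0])         # one row: document id plus its generated queries
```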
Below is the original dataset card for the BEIR benchmark.
# Dataset Card for BEIR Benchmark
## Table of Contents
- [Dataset Description](#dataset-description)
- [Dataset Summary](#dataset-summary)
- [Supported Tasks and Leaderboards](#supported-tasks-and-leaderboards)
- [Languages](#languages)
- [Dataset Structure](#dataset-structure)
- [Data Instances](#data-instances)
- [Data Fields](#data-fields)
- [Data Splits](#data-splits)
- [Dataset Creation](#dataset-creation)
- [Curation Rationale](#curation-rationale)
- [Source Data](#source-data)
- [Annotations](#annotations)
- [Personal and Sensitive Information](#personal-and-sensitive-information)
- [Considerations for Using the Data](#considerations-for-using-the-data)
- [Social Impact of Dataset](#social-impact-of-dataset)
- [Discussion of Biases](#discussion-of-biases)
- [Other Known Limitations](#other-known-limitations)
- [Additional Information](#additional-information)
- [Dataset Curators](#dataset-curators)
- [Licensing Information](#licensing-information)
- [Citation Information](#citation-information)
- [Contributions](#contributions)
## Dataset Description
- **Homepage:** https://github.com/UKPLab/beir
- **Repository:** https://github.com/UKPLab/beir
- **Paper:** https://openreview.net/forum?id=wCu6T5xFjeJ
- **Leaderboard:** https://docs.google.com/spreadsheets/d/1L8aACyPaXrL8iEelJLGqlMqXKPX2oSP_R10pZoy77Ns
- **Point of Contact:** nandan.thakur@uwaterloo.ca
### Dataset Summary
BEIR is a heterogeneous benchmark that has been built from 18 diverse datasets representing 9 information retrieval tasks:
- Fact-checking: [FEVER](http://fever.ai), [Climate-FEVER](http://climatefever.ai), [SciFact](https://github.com/allenai/scifact)
- Question-Answering: [NQ](https://ai.google.com/research/NaturalQuestions), [HotpotQA](https://hotpotqa.github.io), [FiQA-2018](https://sites.google.com/view/fiqa/)
- Bio-Medical IR: [TREC-COVID](https://ir.nist.gov/covidSubmit/index.html), [BioASQ](http://bioasq.org), [NFCorpus](https://www.cl.uni-heidelberg.de/statnlpgroup/nfcorpus/)
- News Retrieval: [TREC-NEWS](https://trec.nist.gov/data/news2019.html), [Robust04](https://trec.nist.gov/data/robust/04.guidelines.html)
- Argument Retrieval: [Touche-2020](https://webis.de/events/touche-20/shared-task-1.html), [ArguAna](http://argumentation.bplaced.net/arguana/data)
- Duplicate Question Retrieval: [Quora](https://www.quora.com/q/quoradata/First-Quora-Dataset-Release-Question-Pairs), [CqaDupstack](http://nlp.cis.unimelb.edu.au/resources/cqadupstack/)
- Citation-Prediction: [SCIDOCS](https://allenai.org/data/scidocs)
- Tweet Retrieval: [Signal-1M](https://research.signal-ai.com/datasets/signal1m-tweetir.html)
- Entity Retrieval: [DBPedia](https://github.com/iai-group/DBpedia-Entity/)
All these datasets have been preprocessed and can be used for your experiments.
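As a minimal, hedged sketch, one of these datasets (e.g. `arguana`) can be downloaded and loaded with the `beir` package; the helper names follow the public beir examples and the download URL is the one listed in the Data Splits table below:
```python
from beir import util
from beir.datasets.data_loader import GenericDataLoader

# Download and unzip a preprocessed BEIR dataset (here: ArguAna) -- sketch only.
url = "https://public.ukp.informatik.tu-darmstadt.de/thakur/BEIR/datasets/arguana.zip"
data_path = util.download_and_unzip(url, "datasets")

# Load corpus, queries and qrels for the test split.
corpus, queries, qrels = GenericDataLoader(data_folder=data_path).load(split="test")
print(len(corpus), len(queries), len(qrels))
```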
### Supported Tasks and Leaderboards
The dataset supports a leaderboard that evaluates models against task-specific metrics such as F1 or EM, as well as their ability to retrieve supporting information from Wikipedia.
The current best performing models can be found [here](https://eval.ai/web/challenges/challenge-page/689/leaderboard/).
### Languages
All tasks are in English (`en`).
## Dataset Structure
All BEIR datasets must contain a corpus, queries and qrels (relevance judgments file). They must be in the following format:
- `corpus` file: a `.jsonl` file (jsonlines) that contains a list of dictionaries, each with three fields `_id` with unique document identifier, `title` with document title (optional) and `text` with document paragraph or passage. For example: `{"_id": "doc1", "title": "Albert Einstein", "text": "Albert Einstein was a German-born...."}`
- `queries` file: a `.jsonl` file (jsonlines) that contains a list of dictionaries, each with two fields `_id` with unique query identifier and `text` with query text. For example: `{"_id": "q1", "text": "Who developed the mass-energy equivalence formula?"}`
- `qrels` file: a `.tsv` file (tab-separated) that contains three columns, i.e. the `query-id`, `corpus-id` and `score`, in this order. Keep the first row as a header. For example: `q1 doc1 1`
### Data Instances
A high level example of any beir dataset:
```python
corpus = {
"doc1" : {
"title": "Albert Einstein",
"text": "Albert Einstein was a German-born theoretical physicist. who developed the theory of relativity, \
one of the two pillars of modern physics (alongside quantum mechanics). His work is also known for \
its influence on the philosophy of science. He is best known to the general public for his mass–energy \
equivalence formula E = mc2, which has been dubbed 'the world's most famous equation'. He received the 1921 \
Nobel Prize in Physics 'for his services to theoretical physics, and especially for his discovery of the law \
of the photoelectric effect', a pivotal step in the development of quantum theory."
},
"doc2" : {
"title": "", # Keep title an empty string if not present
"text": "Wheat beer is a top-fermented beer which is brewed with a large proportion of wheat relative to the amount of \
malted barley. The two main varieties are German Weißbier and Belgian witbier; other types include Lambic (made\
with wild yeast), Berliner Weisse (a cloudy, sour beer), and Gose (a sour, salty beer)."
},
}
queries = {
"q1" : "Who developed the mass-energy equivalence formula?",
"q2" : "Which beer is brewed with a large proportion of wheat?"
}
qrels = {
"q1" : {"doc1": 1},
"q2" : {"doc2": 1},
}
```
### Data Fields
Examples from all configurations have the following features:
### Corpus
- `corpus`: a `dict` feature representing the document title and passage text, made up of:
- `_id`: a `string` feature representing the unique document id
- `title`: a `string` feature, denoting the title of the document.
- `text`: a `string` feature, denoting the text of the document.
### Queries
- `queries`: a `dict` feature representing the query, made up of:
- `_id`: a `string` feature representing the unique query id
- `text`: a `string` feature, denoting the text of the query.
### Qrels
- `qrels`: a `dict` feature representing the query document relevance judgements, made up of:
- `_id`: a `string` feature representing the query id
- `_id`: a `string` feature, denoting the document id.
- `score`: an `int32` feature, denoting the relevance judgement between query and document.
### Data Splits
| Dataset | Website| BEIR-Name | Type | Queries | Corpus | Rel D/Q | Down-load | md5 |
| -------- | -----| ---------| --------- | ----------- | ---------| ---------| :----------: | :------:|
| MSMARCO | [Homepage](https://microsoft.github.io/msmarco/)| ``msmarco`` | ``train``<br>``dev``<br>``test``| 6,980 | 8.84M | 1.1 | [Link](https://public.ukp.informatik.tu-darmstadt.de/thakur/BEIR/datasets/msmarco.zip) | ``444067daf65d982533ea17ebd59501e4`` |
| TREC-COVID | [Homepage](https://ir.nist.gov/covidSubmit/index.html)| ``trec-covid``| ``test``| 50| 171K| 493.5 | [Link](https://public.ukp.informatik.tu-darmstadt.de/thakur/BEIR/datasets/trec-covid.zip) | ``ce62140cb23feb9becf6270d0d1fe6d1`` |
| NFCorpus | [Homepage](https://www.cl.uni-heidelberg.de/statnlpgroup/nfcorpus/) | ``nfcorpus`` | ``train``<br>``dev``<br>``test``| 323 | 3.6K | 38.2 | [Link](https://public.ukp.informatik.tu-darmstadt.de/thakur/BEIR/datasets/nfcorpus.zip) | ``a89dba18a62ef92f7d323ec890a0d38d`` |
| BioASQ | [Homepage](http://bioasq.org) | ``bioasq``| ``train``<br>``test`` | 500 | 14.91M | 8.05 | No | [How to Reproduce?](https://github.com/UKPLab/beir/blob/main/examples/dataset#2-bioasq) |
| NQ | [Homepage](https://ai.google.com/research/NaturalQuestions) | ``nq``| ``train``<br>``test``| 3,452 | 2.68M | 1.2 | [Link](https://public.ukp.informatik.tu-darmstadt.de/thakur/BEIR/datasets/nq.zip) | ``d4d3d2e48787a744b6f6e691ff534307`` |
| HotpotQA | [Homepage](https://hotpotqa.github.io) | ``hotpotqa``| ``train``<br>``dev``<br>``test``| 7,405 | 5.23M | 2.0 | [Link](https://public.ukp.informatik.tu-darmstadt.de/thakur/BEIR/datasets/hotpotqa.zip) | ``f412724f78b0d91183a0e86805e16114`` |
| FiQA-2018 | [Homepage](https://sites.google.com/view/fiqa/) | ``fiqa`` | ``train``<br>``dev``<br>``test``| 648 | 57K | 2.6 | [Link](https://public.ukp.informatik.tu-darmstadt.de/thakur/BEIR/datasets/fiqa.zip) | ``17918ed23cd04fb15047f73e6c3bd9d9`` |
| Signal-1M(RT) | [Homepage](https://research.signal-ai.com/datasets/signal1m-tweetir.html)| ``signal1m`` | ``test``| 97 | 2.86M | 19.6 | No | [How to Reproduce?](https://github.com/UKPLab/beir/blob/main/examples/dataset#4-signal-1m) |
| TREC-NEWS | [Homepage](https://trec.nist.gov/data/news2019.html) | ``trec-news`` | ``test``| 57 | 595K | 19.6 | No | [How to Reproduce?](https://github.com/UKPLab/beir/blob/main/examples/dataset#1-trec-news) |
| ArguAna | [Homepage](http://argumentation.bplaced.net/arguana/data) | ``arguana``| ``test`` | 1,406 | 8.67K | 1.0 | [Link](https://public.ukp.informatik.tu-darmstadt.de/thakur/BEIR/datasets/arguana.zip) | ``8ad3e3c2a5867cdced806d6503f29b99`` |
| Touche-2020| [Homepage](https://webis.de/events/touche-20/shared-task-1.html) | ``webis-touche2020``| ``test``| 49 | 382K | 19.0 | [Link](https://public.ukp.informatik.tu-darmstadt.de/thakur/BEIR/datasets/webis-touche2020.zip) | ``46f650ba5a527fc69e0a6521c5a23563`` |
| CQADupstack| [Homepage](http://nlp.cis.unimelb.edu.au/resources/cqadupstack/) | ``cqadupstack``| ``test``| 13,145 | 457K | 1.4 | [Link](https://public.ukp.informatik.tu-darmstadt.de/thakur/BEIR/datasets/cqadupstack.zip) | ``4e41456d7df8ee7760a7f866133bda78`` |
| Quora| [Homepage](https://www.quora.com/q/quoradata/First-Quora-Dataset-Release-Question-Pairs) | ``quora``| ``dev``<br>``test``| 10,000 | 523K | 1.6 | [Link](https://public.ukp.informatik.tu-darmstadt.de/thakur/BEIR/datasets/quora.zip) | ``18fb154900ba42a600f84b839c173167`` |
| DBPedia | [Homepage](https://github.com/iai-group/DBpedia-Entity/) | ``dbpedia-entity``| ``dev``<br>``test``| 400 | 4.63M | 38.2 | [Link](https://public.ukp.informatik.tu-darmstadt.de/thakur/BEIR/datasets/dbpedia-entity.zip) | ``c2a39eb420a3164af735795df012ac2c`` |
| SCIDOCS| [Homepage](https://allenai.org/data/scidocs) | ``scidocs``| ``test``| 1,000 | 25K | 4.9 | [Link](https://public.ukp.informatik.tu-darmstadt.de/thakur/BEIR/datasets/scidocs.zip) | ``38121350fc3a4d2f48850f6aff52e4a9`` |
| FEVER | [Homepage](http://fever.ai) | ``fever``| ``train``<br>``dev``<br>``test``| 6,666 | 5.42M | 1.2| [Link](https://public.ukp.informatik.tu-darmstadt.de/thakur/BEIR/datasets/fever.zip) | ``5a818580227bfb4b35bb6fa46d9b6c03`` |
| Climate-FEVER| [Homepage](http://climatefever.ai) | ``climate-fever``|``test``| 1,535 | 5.42M | 3.0 | [Link](https://public.ukp.informatik.tu-darmstadt.de/thakur/BEIR/datasets/climate-fever.zip) | ``8b66f0a9126c521bae2bde127b4dc99d`` |
| SciFact| [Homepage](https://github.com/allenai/scifact) | ``scifact``| ``train``<br>``test``| 300 | 5K | 1.1 | [Link](https://public.ukp.informatik.tu-darmstadt.de/thakur/BEIR/datasets/scifact.zip) | ``5f7d1de60b170fc8027bb7898e2efca1`` |
| Robust04 | [Homepage](https://trec.nist.gov/data/robust/04.guidelines.html) | ``robust04``| ``test``| 249 | 528K | 69.9 | No | [How to Reproduce?](https://github.com/UKPLab/beir/blob/main/examples/dataset#3-robust04) |
## Dataset Creation
### Curation Rationale
[Needs More Information]
### Source Data
#### Initial Data Collection and Normalization
[Needs More Information]
#### Who are the source language producers?
[Needs More Information]
### Annotations
#### Annotation process
[Needs More Information]
#### Who are the annotators?
[Needs More Information]
### Personal and Sensitive Information
[Needs More Information]
## Considerations for Using the Data
### Social Impact of Dataset
[Needs More Information]
### Discussion of Biases
[Needs More Information]
### Other Known Limitations
[Needs More Information]
## Additional Information
### Dataset Curators
[Needs More Information]
### Licensing Information
[Needs More Information]
### Citation Information
Cite as:
```
@inproceedings{
thakur2021beir,
title={{BEIR}: A Heterogeneous Benchmark for Zero-shot Evaluation of Information Retrieval Models},
author={Nandan Thakur and Nils Reimers and Andreas R{\"u}ckl{\'e} and Abhishek Srivastava and Iryna Gurevych},
booktitle={Thirty-fifth Conference on Neural Information Processing Systems Datasets and Benchmarks Track (Round 2)},
year={2021},
url={https://openreview.net/forum?id=wCu6T5xFjeJ}
}
```
### Contributions
Thanks to [@Nthakur20](https://github.com/Nthakur20) for adding this dataset. |
open-llm-leaderboard/details_Intel__neural-chat-7b-v3-3-Slerp | ---
pretty_name: Evaluation run of Intel/neural-chat-7b-v3-3-Slerp
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [Intel/neural-chat-7b-v3-3-Slerp](https://huggingface.co/Intel/neural-chat-7b-v3-3-Slerp)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configuration, each one coresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Intel__neural-chat-7b-v3-3-Slerp\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-12-10T17:57:49.451204](https://huggingface.co/datasets/open-llm-leaderboard/details_Intel__neural-chat-7b-v3-3-Slerp/blob/main/results_2023-12-10T17-57-49.451204.json)(note\
\ that their might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6272160356239721,\n\
\ \"acc_stderr\": 0.03276418695667091,\n \"acc_norm\": 0.6266234292162511,\n\
\ \"acc_norm_stderr\": 0.03344601323704533,\n \"mc1\": 0.47368421052631576,\n\
\ \"mc1_stderr\": 0.017479241161975526,\n \"mc2\": 0.6319769000319811,\n\
\ \"mc2_stderr\": 0.0150681826970418\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.6467576791808873,\n \"acc_stderr\": 0.013967822714840055,\n\
\ \"acc_norm\": 0.6663822525597269,\n \"acc_norm_stderr\": 0.013778687054176536\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6664011153156741,\n\
\ \"acc_stderr\": 0.0047053471376996185,\n \"acc_norm\": 0.8543118900617407,\n\
\ \"acc_norm_stderr\": 0.003520722505332094\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.27,\n \"acc_stderr\": 0.044619604333847394,\n \
\ \"acc_norm\": 0.27,\n \"acc_norm_stderr\": 0.044619604333847394\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6074074074074074,\n\
\ \"acc_stderr\": 0.0421850621536888,\n \"acc_norm\": 0.6074074074074074,\n\
\ \"acc_norm_stderr\": 0.0421850621536888\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.6513157894736842,\n \"acc_stderr\": 0.03878139888797612,\n\
\ \"acc_norm\": 0.6513157894736842,\n \"acc_norm_stderr\": 0.03878139888797612\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.56,\n\
\ \"acc_stderr\": 0.04988876515698589,\n \"acc_norm\": 0.56,\n \
\ \"acc_norm_stderr\": 0.04988876515698589\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.6867924528301886,\n \"acc_stderr\": 0.028544793319055326,\n\
\ \"acc_norm\": 0.6867924528301886,\n \"acc_norm_stderr\": 0.028544793319055326\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7083333333333334,\n\
\ \"acc_stderr\": 0.038009680605548594,\n \"acc_norm\": 0.7083333333333334,\n\
\ \"acc_norm_stderr\": 0.038009680605548594\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.48,\n \"acc_stderr\": 0.050211673156867795,\n \
\ \"acc_norm\": 0.48,\n \"acc_norm_stderr\": 0.050211673156867795\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"\
acc\": 0.57,\n \"acc_stderr\": 0.04975698519562428,\n \"acc_norm\"\
: 0.57,\n \"acc_norm_stderr\": 0.04975698519562428\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.36,\n \"acc_stderr\": 0.048241815132442176,\n \
\ \"acc_norm\": 0.36,\n \"acc_norm_stderr\": 0.048241815132442176\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6473988439306358,\n\
\ \"acc_stderr\": 0.036430371689585475,\n \"acc_norm\": 0.6473988439306358,\n\
\ \"acc_norm_stderr\": 0.036430371689585475\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.4411764705882353,\n \"acc_stderr\": 0.049406356306056595,\n\
\ \"acc_norm\": 0.4411764705882353,\n \"acc_norm_stderr\": 0.049406356306056595\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.77,\n \"acc_stderr\": 0.04229525846816506,\n \"acc_norm\": 0.77,\n\
\ \"acc_norm_stderr\": 0.04229525846816506\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.5531914893617021,\n \"acc_stderr\": 0.0325005368436584,\n\
\ \"acc_norm\": 0.5531914893617021,\n \"acc_norm_stderr\": 0.0325005368436584\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.43859649122807015,\n\
\ \"acc_stderr\": 0.04668000738510455,\n \"acc_norm\": 0.43859649122807015,\n\
\ \"acc_norm_stderr\": 0.04668000738510455\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5517241379310345,\n \"acc_stderr\": 0.04144311810878152,\n\
\ \"acc_norm\": 0.5517241379310345,\n \"acc_norm_stderr\": 0.04144311810878152\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.37566137566137564,\n \"acc_stderr\": 0.024942368931159798,\n \"\
acc_norm\": 0.37566137566137564,\n \"acc_norm_stderr\": 0.024942368931159798\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.3968253968253968,\n\
\ \"acc_stderr\": 0.043758884927270605,\n \"acc_norm\": 0.3968253968253968,\n\
\ \"acc_norm_stderr\": 0.043758884927270605\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.35,\n \"acc_stderr\": 0.04793724854411019,\n \
\ \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.04793724854411019\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7419354838709677,\n\
\ \"acc_stderr\": 0.02489246917246283,\n \"acc_norm\": 0.7419354838709677,\n\
\ \"acc_norm_stderr\": 0.02489246917246283\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.4630541871921182,\n \"acc_stderr\": 0.035083705204426656,\n\
\ \"acc_norm\": 0.4630541871921182,\n \"acc_norm_stderr\": 0.035083705204426656\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.66,\n \"acc_stderr\": 0.04760952285695237,\n \"acc_norm\"\
: 0.66,\n \"acc_norm_stderr\": 0.04760952285695237\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7515151515151515,\n \"acc_stderr\": 0.033744026441394036,\n\
\ \"acc_norm\": 0.7515151515151515,\n \"acc_norm_stderr\": 0.033744026441394036\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.7676767676767676,\n \"acc_stderr\": 0.030088629490217487,\n \"\
acc_norm\": 0.7676767676767676,\n \"acc_norm_stderr\": 0.030088629490217487\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.8704663212435233,\n \"acc_stderr\": 0.024233532297758733,\n\
\ \"acc_norm\": 0.8704663212435233,\n \"acc_norm_stderr\": 0.024233532297758733\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.617948717948718,\n \"acc_stderr\": 0.024635549163908234,\n \
\ \"acc_norm\": 0.617948717948718,\n \"acc_norm_stderr\": 0.024635549163908234\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.337037037037037,\n \"acc_stderr\": 0.028820884666253255,\n \
\ \"acc_norm\": 0.337037037037037,\n \"acc_norm_stderr\": 0.028820884666253255\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.6386554621848739,\n \"acc_stderr\": 0.03120469122515002,\n \
\ \"acc_norm\": 0.6386554621848739,\n \"acc_norm_stderr\": 0.03120469122515002\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.304635761589404,\n \"acc_stderr\": 0.03757949922943343,\n \"acc_norm\"\
: 0.304635761589404,\n \"acc_norm_stderr\": 0.03757949922943343\n },\n\
\ \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.8293577981651377,\n\
\ \"acc_stderr\": 0.016129271025099867,\n \"acc_norm\": 0.8293577981651377,\n\
\ \"acc_norm_stderr\": 0.016129271025099867\n },\n \"harness|hendrycksTest-high_school_statistics|5\"\
: {\n \"acc\": 0.5416666666666666,\n \"acc_stderr\": 0.03398110890294636,\n\
\ \"acc_norm\": 0.5416666666666666,\n \"acc_norm_stderr\": 0.03398110890294636\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.7843137254901961,\n \"acc_stderr\": 0.028867431449849316,\n \"\
acc_norm\": 0.7843137254901961,\n \"acc_norm_stderr\": 0.028867431449849316\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.7805907172995781,\n \"acc_stderr\": 0.026939106581553945,\n \
\ \"acc_norm\": 0.7805907172995781,\n \"acc_norm_stderr\": 0.026939106581553945\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6591928251121076,\n\
\ \"acc_stderr\": 0.0318114974705536,\n \"acc_norm\": 0.6591928251121076,\n\
\ \"acc_norm_stderr\": 0.0318114974705536\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.7175572519083969,\n \"acc_stderr\": 0.03948406125768361,\n\
\ \"acc_norm\": 0.7175572519083969,\n \"acc_norm_stderr\": 0.03948406125768361\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.7768595041322314,\n \"acc_stderr\": 0.03800754475228733,\n \"\
acc_norm\": 0.7768595041322314,\n \"acc_norm_stderr\": 0.03800754475228733\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7314814814814815,\n\
\ \"acc_stderr\": 0.042844679680521934,\n \"acc_norm\": 0.7314814814814815,\n\
\ \"acc_norm_stderr\": 0.042844679680521934\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7055214723926381,\n \"acc_stderr\": 0.03581165790474082,\n\
\ \"acc_norm\": 0.7055214723926381,\n \"acc_norm_stderr\": 0.03581165790474082\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.44642857142857145,\n\
\ \"acc_stderr\": 0.04718471485219588,\n \"acc_norm\": 0.44642857142857145,\n\
\ \"acc_norm_stderr\": 0.04718471485219588\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7572815533980582,\n \"acc_stderr\": 0.04245022486384495,\n\
\ \"acc_norm\": 0.7572815533980582,\n \"acc_norm_stderr\": 0.04245022486384495\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8632478632478633,\n\
\ \"acc_stderr\": 0.022509033937077802,\n \"acc_norm\": 0.8632478632478633,\n\
\ \"acc_norm_stderr\": 0.022509033937077802\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.71,\n \"acc_stderr\": 0.045604802157206845,\n \
\ \"acc_norm\": 0.71,\n \"acc_norm_stderr\": 0.045604802157206845\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8020434227330779,\n\
\ \"acc_stderr\": 0.014248873549217575,\n \"acc_norm\": 0.8020434227330779,\n\
\ \"acc_norm_stderr\": 0.014248873549217575\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.7052023121387283,\n \"acc_stderr\": 0.02454761779480383,\n\
\ \"acc_norm\": 0.7052023121387283,\n \"acc_norm_stderr\": 0.02454761779480383\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.4491620111731844,\n\
\ \"acc_stderr\": 0.01663583834163192,\n \"acc_norm\": 0.4491620111731844,\n\
\ \"acc_norm_stderr\": 0.01663583834163192\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.6928104575163399,\n \"acc_stderr\": 0.02641560191438898,\n\
\ \"acc_norm\": 0.6928104575163399,\n \"acc_norm_stderr\": 0.02641560191438898\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6784565916398714,\n\
\ \"acc_stderr\": 0.026527724079528872,\n \"acc_norm\": 0.6784565916398714,\n\
\ \"acc_norm_stderr\": 0.026527724079528872\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.7006172839506173,\n \"acc_stderr\": 0.025483115601195455,\n\
\ \"acc_norm\": 0.7006172839506173,\n \"acc_norm_stderr\": 0.025483115601195455\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.45390070921985815,\n \"acc_stderr\": 0.02970045324729146,\n \
\ \"acc_norm\": 0.45390070921985815,\n \"acc_norm_stderr\": 0.02970045324729146\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4256844850065189,\n\
\ \"acc_stderr\": 0.012628393551811947,\n \"acc_norm\": 0.4256844850065189,\n\
\ \"acc_norm_stderr\": 0.012628393551811947\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.6323529411764706,\n \"acc_stderr\": 0.029289413409403192,\n\
\ \"acc_norm\": 0.6323529411764706,\n \"acc_norm_stderr\": 0.029289413409403192\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6290849673202614,\n \"acc_stderr\": 0.019542101564854125,\n \
\ \"acc_norm\": 0.6290849673202614,\n \"acc_norm_stderr\": 0.019542101564854125\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6727272727272727,\n\
\ \"acc_stderr\": 0.0449429086625209,\n \"acc_norm\": 0.6727272727272727,\n\
\ \"acc_norm_stderr\": 0.0449429086625209\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.7346938775510204,\n \"acc_stderr\": 0.028263889943784596,\n\
\ \"acc_norm\": 0.7346938775510204,\n \"acc_norm_stderr\": 0.028263889943784596\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8258706467661692,\n\
\ \"acc_stderr\": 0.026814951200421606,\n \"acc_norm\": 0.8258706467661692,\n\
\ \"acc_norm_stderr\": 0.026814951200421606\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.83,\n \"acc_stderr\": 0.0377525168068637,\n \
\ \"acc_norm\": 0.83,\n \"acc_norm_stderr\": 0.0377525168068637\n },\n\
\ \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5180722891566265,\n\
\ \"acc_stderr\": 0.03889951252827216,\n \"acc_norm\": 0.5180722891566265,\n\
\ \"acc_norm_stderr\": 0.03889951252827216\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8245614035087719,\n \"acc_stderr\": 0.02917088550072767,\n\
\ \"acc_norm\": 0.8245614035087719,\n \"acc_norm_stderr\": 0.02917088550072767\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.47368421052631576,\n\
\ \"mc1_stderr\": 0.017479241161975526,\n \"mc2\": 0.6319769000319811,\n\
\ \"mc2_stderr\": 0.0150681826970418\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7971586424625099,\n \"acc_stderr\": 0.011301439925936662\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.6997725549658832,\n \
\ \"acc_stderr\": 0.01262542315228303\n }\n}\n```"
repo_url: https://huggingface.co/Intel/neural-chat-7b-v3-3-Slerp
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_12_10T17_57_49.451204
path:
- '**/details_harness|arc:challenge|25_2023-12-10T17-57-49.451204.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-12-10T17-57-49.451204.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_12_10T17_57_49.451204
path:
- '**/details_harness|gsm8k|5_2023-12-10T17-57-49.451204.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-12-10T17-57-49.451204.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_12_10T17_57_49.451204
path:
- '**/details_harness|hellaswag|10_2023-12-10T17-57-49.451204.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-12-10T17-57-49.451204.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_12_10T17_57_49.451204
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-10T17-57-49.451204.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-10T17-57-49.451204.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-10T17-57-49.451204.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-10T17-57-49.451204.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-10T17-57-49.451204.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-10T17-57-49.451204.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-10T17-57-49.451204.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-10T17-57-49.451204.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-10T17-57-49.451204.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-10T17-57-49.451204.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-10T17-57-49.451204.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-10T17-57-49.451204.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-10T17-57-49.451204.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-10T17-57-49.451204.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-10T17-57-49.451204.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-10T17-57-49.451204.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-10T17-57-49.451204.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-10T17-57-49.451204.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-10T17-57-49.451204.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-10T17-57-49.451204.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-10T17-57-49.451204.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-10T17-57-49.451204.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-10T17-57-49.451204.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-10T17-57-49.451204.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-10T17-57-49.451204.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-10T17-57-49.451204.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-10T17-57-49.451204.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-10T17-57-49.451204.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-10T17-57-49.451204.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-10T17-57-49.451204.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-10T17-57-49.451204.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-10T17-57-49.451204.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-10T17-57-49.451204.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-10T17-57-49.451204.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-12-10T17-57-49.451204.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-10T17-57-49.451204.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-10T17-57-49.451204.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-10T17-57-49.451204.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-12-10T17-57-49.451204.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-12-10T17-57-49.451204.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-10T17-57-49.451204.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-10T17-57-49.451204.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-10T17-57-49.451204.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-10T17-57-49.451204.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-10T17-57-49.451204.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-10T17-57-49.451204.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-10T17-57-49.451204.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-10T17-57-49.451204.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-10T17-57-49.451204.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-10T17-57-49.451204.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-10T17-57-49.451204.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-10T17-57-49.451204.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-10T17-57-49.451204.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-12-10T17-57-49.451204.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-10T17-57-49.451204.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-12-10T17-57-49.451204.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-10T17-57-49.451204.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-10T17-57-49.451204.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-10T17-57-49.451204.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-10T17-57-49.451204.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-10T17-57-49.451204.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-10T17-57-49.451204.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-10T17-57-49.451204.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-10T17-57-49.451204.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-10T17-57-49.451204.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-10T17-57-49.451204.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-10T17-57-49.451204.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-10T17-57-49.451204.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-10T17-57-49.451204.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-10T17-57-49.451204.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-10T17-57-49.451204.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-10T17-57-49.451204.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-10T17-57-49.451204.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-10T17-57-49.451204.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-10T17-57-49.451204.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-10T17-57-49.451204.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-10T17-57-49.451204.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-10T17-57-49.451204.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-10T17-57-49.451204.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-10T17-57-49.451204.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-10T17-57-49.451204.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-10T17-57-49.451204.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-10T17-57-49.451204.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-10T17-57-49.451204.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-10T17-57-49.451204.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-10T17-57-49.451204.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-10T17-57-49.451204.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-10T17-57-49.451204.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-10T17-57-49.451204.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-10T17-57-49.451204.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-10T17-57-49.451204.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-12-10T17-57-49.451204.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-10T17-57-49.451204.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-10T17-57-49.451204.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-10T17-57-49.451204.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-12-10T17-57-49.451204.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-12-10T17-57-49.451204.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-10T17-57-49.451204.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-10T17-57-49.451204.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-10T17-57-49.451204.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-10T17-57-49.451204.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-10T17-57-49.451204.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-10T17-57-49.451204.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-10T17-57-49.451204.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-10T17-57-49.451204.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-10T17-57-49.451204.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-10T17-57-49.451204.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-10T17-57-49.451204.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-10T17-57-49.451204.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-10T17-57-49.451204.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-12-10T17-57-49.451204.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-10T17-57-49.451204.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-12-10T17-57-49.451204.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-10T17-57-49.451204.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_12_10T17_57_49.451204
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-10T17-57-49.451204.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-10T17-57-49.451204.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_12_10T17_57_49.451204
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-10T17-57-49.451204.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-10T17-57-49.451204.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_12_10T17_57_49.451204
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-10T17-57-49.451204.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-10T17-57-49.451204.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_12_10T17_57_49.451204
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-10T17-57-49.451204.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-10T17-57-49.451204.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_12_10T17_57_49.451204
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-10T17-57-49.451204.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-10T17-57-49.451204.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_12_10T17_57_49.451204
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-10T17-57-49.451204.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-10T17-57-49.451204.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_12_10T17_57_49.451204
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-10T17-57-49.451204.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-10T17-57-49.451204.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_12_10T17_57_49.451204
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-10T17-57-49.451204.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-10T17-57-49.451204.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_12_10T17_57_49.451204
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-10T17-57-49.451204.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-10T17-57-49.451204.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_12_10T17_57_49.451204
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-10T17-57-49.451204.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-10T17-57-49.451204.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_12_10T17_57_49.451204
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-10T17-57-49.451204.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-10T17-57-49.451204.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_12_10T17_57_49.451204
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-10T17-57-49.451204.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-10T17-57-49.451204.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_12_10T17_57_49.451204
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-10T17-57-49.451204.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-10T17-57-49.451204.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_12_10T17_57_49.451204
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-10T17-57-49.451204.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-10T17-57-49.451204.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_12_10T17_57_49.451204
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-10T17-57-49.451204.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-10T17-57-49.451204.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_12_10T17_57_49.451204
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-10T17-57-49.451204.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-10T17-57-49.451204.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_12_10T17_57_49.451204
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-10T17-57-49.451204.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-10T17-57-49.451204.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_12_10T17_57_49.451204
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-10T17-57-49.451204.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-10T17-57-49.451204.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_12_10T17_57_49.451204
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-10T17-57-49.451204.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-10T17-57-49.451204.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_12_10T17_57_49.451204
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-10T17-57-49.451204.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-10T17-57-49.451204.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_12_10T17_57_49.451204
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-10T17-57-49.451204.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-10T17-57-49.451204.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_12_10T17_57_49.451204
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-10T17-57-49.451204.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-10T17-57-49.451204.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_12_10T17_57_49.451204
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-10T17-57-49.451204.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-10T17-57-49.451204.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_12_10T17_57_49.451204
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-10T17-57-49.451204.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-10T17-57-49.451204.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_12_10T17_57_49.451204
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-10T17-57-49.451204.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-10T17-57-49.451204.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_12_10T17_57_49.451204
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-10T17-57-49.451204.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-10T17-57-49.451204.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_12_10T17_57_49.451204
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-10T17-57-49.451204.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-10T17-57-49.451204.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_12_10T17_57_49.451204
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-10T17-57-49.451204.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-10T17-57-49.451204.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_12_10T17_57_49.451204
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-10T17-57-49.451204.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-10T17-57-49.451204.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_12_10T17_57_49.451204
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-10T17-57-49.451204.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-10T17-57-49.451204.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_12_10T17_57_49.451204
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-10T17-57-49.451204.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-10T17-57-49.451204.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_12_10T17_57_49.451204
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-10T17-57-49.451204.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-10T17-57-49.451204.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_12_10T17_57_49.451204
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-10T17-57-49.451204.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-10T17-57-49.451204.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_12_10T17_57_49.451204
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-10T17-57-49.451204.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-10T17-57-49.451204.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_12_10T17_57_49.451204
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-12-10T17-57-49.451204.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-12-10T17-57-49.451204.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_12_10T17_57_49.451204
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-10T17-57-49.451204.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-10T17-57-49.451204.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_12_10T17_57_49.451204
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-10T17-57-49.451204.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-10T17-57-49.451204.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_12_10T17_57_49.451204
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-10T17-57-49.451204.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-10T17-57-49.451204.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_12_10T17_57_49.451204
path:
- '**/details_harness|hendrycksTest-management|5_2023-12-10T17-57-49.451204.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-12-10T17-57-49.451204.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_12_10T17_57_49.451204
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-12-10T17-57-49.451204.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-12-10T17-57-49.451204.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_12_10T17_57_49.451204
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-10T17-57-49.451204.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-10T17-57-49.451204.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_12_10T17_57_49.451204
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-10T17-57-49.451204.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-10T17-57-49.451204.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_12_10T17_57_49.451204
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-10T17-57-49.451204.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-10T17-57-49.451204.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_12_10T17_57_49.451204
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-10T17-57-49.451204.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-10T17-57-49.451204.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_12_10T17_57_49.451204
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-10T17-57-49.451204.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-10T17-57-49.451204.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_12_10T17_57_49.451204
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-10T17-57-49.451204.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-10T17-57-49.451204.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_12_10T17_57_49.451204
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-10T17-57-49.451204.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-10T17-57-49.451204.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_12_10T17_57_49.451204
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-10T17-57-49.451204.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-10T17-57-49.451204.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_12_10T17_57_49.451204
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-10T17-57-49.451204.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-10T17-57-49.451204.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_12_10T17_57_49.451204
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-10T17-57-49.451204.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-10T17-57-49.451204.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_12_10T17_57_49.451204
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-10T17-57-49.451204.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-10T17-57-49.451204.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_12_10T17_57_49.451204
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-10T17-57-49.451204.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-10T17-57-49.451204.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_12_10T17_57_49.451204
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-10T17-57-49.451204.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-10T17-57-49.451204.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_12_10T17_57_49.451204
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-12-10T17-57-49.451204.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-12-10T17-57-49.451204.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_12_10T17_57_49.451204
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-10T17-57-49.451204.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-10T17-57-49.451204.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_12_10T17_57_49.451204
path:
- '**/details_harness|hendrycksTest-virology|5_2023-12-10T17-57-49.451204.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-12-10T17-57-49.451204.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_12_10T17_57_49.451204
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-10T17-57-49.451204.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-10T17-57-49.451204.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_12_10T17_57_49.451204
path:
- '**/details_harness|truthfulqa:mc|0_2023-12-10T17-57-49.451204.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-12-10T17-57-49.451204.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_12_10T17_57_49.451204
path:
- '**/details_harness|winogrande|5_2023-12-10T17-57-49.451204.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-12-10T17-57-49.451204.parquet'
- config_name: results
data_files:
- split: 2023_12_10T17_57_49.451204
path:
- results_2023-12-10T17-57-49.451204.parquet
- split: latest
path:
- results_2023-12-10T17-57-49.451204.parquet
---
# Dataset Card for Evaluation run of Intel/neural-chat-7b-v3-3-Slerp
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/Intel/neural-chat-7b-v3-3-Slerp
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [Intel/neural-chat-7b-v3-3-Slerp](https://huggingface.co/Intel/neural-chat-7b-v3-3-Slerp) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_Intel__neural-chat-7b-v3-3-Slerp",
"harness_winogrande_5",
split="train")
```
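You can also load a specific configuration together with the "latest" split listed in the YAML above; for instance, the aggregated metrics live in the "results" configuration and each task has its own `harness_*` configuration. A minimal sketch (config and split names taken from the configs section of this card):
```python
from datasets import load_dataset

# Aggregated metrics for the most recent evaluation run
results = load_dataset(
    "open-llm-leaderboard/details_Intel__neural-chat-7b-v3-3-Slerp",
    "results",
    split="latest",
)

# Per-sample details for a single task, e.g. 5-shot GSM8K
gsm8k_details = load_dataset(
    "open-llm-leaderboard/details_Intel__neural-chat-7b-v3-3-Slerp",
    "harness_gsm8k_5",
    split="latest",
)
```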
## Latest results
These are the [latest results from run 2023-12-10T17:57:49.451204](https://huggingface.co/datasets/open-llm-leaderboard/details_Intel__neural-chat-7b-v3-3-Slerp/blob/main/results_2023-12-10T17-57-49.451204.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6272160356239721,
"acc_stderr": 0.03276418695667091,
"acc_norm": 0.6266234292162511,
"acc_norm_stderr": 0.03344601323704533,
"mc1": 0.47368421052631576,
"mc1_stderr": 0.017479241161975526,
"mc2": 0.6319769000319811,
"mc2_stderr": 0.0150681826970418
},
"harness|arc:challenge|25": {
"acc": 0.6467576791808873,
"acc_stderr": 0.013967822714840055,
"acc_norm": 0.6663822525597269,
"acc_norm_stderr": 0.013778687054176536
},
"harness|hellaswag|10": {
"acc": 0.6664011153156741,
"acc_stderr": 0.0047053471376996185,
"acc_norm": 0.8543118900617407,
"acc_norm_stderr": 0.003520722505332094
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.27,
"acc_stderr": 0.044619604333847394,
"acc_norm": 0.27,
"acc_norm_stderr": 0.044619604333847394
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6074074074074074,
"acc_stderr": 0.0421850621536888,
"acc_norm": 0.6074074074074074,
"acc_norm_stderr": 0.0421850621536888
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6513157894736842,
"acc_stderr": 0.03878139888797612,
"acc_norm": 0.6513157894736842,
"acc_norm_stderr": 0.03878139888797612
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.56,
"acc_stderr": 0.04988876515698589,
"acc_norm": 0.56,
"acc_norm_stderr": 0.04988876515698589
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6867924528301886,
"acc_stderr": 0.028544793319055326,
"acc_norm": 0.6867924528301886,
"acc_norm_stderr": 0.028544793319055326
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7083333333333334,
"acc_stderr": 0.038009680605548594,
"acc_norm": 0.7083333333333334,
"acc_norm_stderr": 0.038009680605548594
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.48,
"acc_stderr": 0.050211673156867795,
"acc_norm": 0.48,
"acc_norm_stderr": 0.050211673156867795
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.57,
"acc_stderr": 0.04975698519562428,
"acc_norm": 0.57,
"acc_norm_stderr": 0.04975698519562428
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.36,
"acc_stderr": 0.048241815132442176,
"acc_norm": 0.36,
"acc_norm_stderr": 0.048241815132442176
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6473988439306358,
"acc_stderr": 0.036430371689585475,
"acc_norm": 0.6473988439306358,
"acc_norm_stderr": 0.036430371689585475
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.4411764705882353,
"acc_stderr": 0.049406356306056595,
"acc_norm": 0.4411764705882353,
"acc_norm_stderr": 0.049406356306056595
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.77,
"acc_stderr": 0.04229525846816506,
"acc_norm": 0.77,
"acc_norm_stderr": 0.04229525846816506
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5531914893617021,
"acc_stderr": 0.0325005368436584,
"acc_norm": 0.5531914893617021,
"acc_norm_stderr": 0.0325005368436584
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.43859649122807015,
"acc_stderr": 0.04668000738510455,
"acc_norm": 0.43859649122807015,
"acc_norm_stderr": 0.04668000738510455
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5517241379310345,
"acc_stderr": 0.04144311810878152,
"acc_norm": 0.5517241379310345,
"acc_norm_stderr": 0.04144311810878152
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.37566137566137564,
"acc_stderr": 0.024942368931159798,
"acc_norm": 0.37566137566137564,
"acc_norm_stderr": 0.024942368931159798
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.3968253968253968,
"acc_stderr": 0.043758884927270605,
"acc_norm": 0.3968253968253968,
"acc_norm_stderr": 0.043758884927270605
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.35,
"acc_stderr": 0.04793724854411019,
"acc_norm": 0.35,
"acc_norm_stderr": 0.04793724854411019
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7419354838709677,
"acc_stderr": 0.02489246917246283,
"acc_norm": 0.7419354838709677,
"acc_norm_stderr": 0.02489246917246283
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.4630541871921182,
"acc_stderr": 0.035083705204426656,
"acc_norm": 0.4630541871921182,
"acc_norm_stderr": 0.035083705204426656
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.66,
"acc_stderr": 0.04760952285695237,
"acc_norm": 0.66,
"acc_norm_stderr": 0.04760952285695237
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7515151515151515,
"acc_stderr": 0.033744026441394036,
"acc_norm": 0.7515151515151515,
"acc_norm_stderr": 0.033744026441394036
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7676767676767676,
"acc_stderr": 0.030088629490217487,
"acc_norm": 0.7676767676767676,
"acc_norm_stderr": 0.030088629490217487
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8704663212435233,
"acc_stderr": 0.024233532297758733,
"acc_norm": 0.8704663212435233,
"acc_norm_stderr": 0.024233532297758733
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.617948717948718,
"acc_stderr": 0.024635549163908234,
"acc_norm": 0.617948717948718,
"acc_norm_stderr": 0.024635549163908234
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.337037037037037,
"acc_stderr": 0.028820884666253255,
"acc_norm": 0.337037037037037,
"acc_norm_stderr": 0.028820884666253255
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6386554621848739,
"acc_stderr": 0.03120469122515002,
"acc_norm": 0.6386554621848739,
"acc_norm_stderr": 0.03120469122515002
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.304635761589404,
"acc_stderr": 0.03757949922943343,
"acc_norm": 0.304635761589404,
"acc_norm_stderr": 0.03757949922943343
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8293577981651377,
"acc_stderr": 0.016129271025099867,
"acc_norm": 0.8293577981651377,
"acc_norm_stderr": 0.016129271025099867
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5416666666666666,
"acc_stderr": 0.03398110890294636,
"acc_norm": 0.5416666666666666,
"acc_norm_stderr": 0.03398110890294636
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7843137254901961,
"acc_stderr": 0.028867431449849316,
"acc_norm": 0.7843137254901961,
"acc_norm_stderr": 0.028867431449849316
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7805907172995781,
"acc_stderr": 0.026939106581553945,
"acc_norm": 0.7805907172995781,
"acc_norm_stderr": 0.026939106581553945
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6591928251121076,
"acc_stderr": 0.0318114974705536,
"acc_norm": 0.6591928251121076,
"acc_norm_stderr": 0.0318114974705536
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7175572519083969,
"acc_stderr": 0.03948406125768361,
"acc_norm": 0.7175572519083969,
"acc_norm_stderr": 0.03948406125768361
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7768595041322314,
"acc_stderr": 0.03800754475228733,
"acc_norm": 0.7768595041322314,
"acc_norm_stderr": 0.03800754475228733
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7314814814814815,
"acc_stderr": 0.042844679680521934,
"acc_norm": 0.7314814814814815,
"acc_norm_stderr": 0.042844679680521934
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7055214723926381,
"acc_stderr": 0.03581165790474082,
"acc_norm": 0.7055214723926381,
"acc_norm_stderr": 0.03581165790474082
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.44642857142857145,
"acc_stderr": 0.04718471485219588,
"acc_norm": 0.44642857142857145,
"acc_norm_stderr": 0.04718471485219588
},
"harness|hendrycksTest-management|5": {
"acc": 0.7572815533980582,
"acc_stderr": 0.04245022486384495,
"acc_norm": 0.7572815533980582,
"acc_norm_stderr": 0.04245022486384495
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8632478632478633,
"acc_stderr": 0.022509033937077802,
"acc_norm": 0.8632478632478633,
"acc_norm_stderr": 0.022509033937077802
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.71,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.71,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8020434227330779,
"acc_stderr": 0.014248873549217575,
"acc_norm": 0.8020434227330779,
"acc_norm_stderr": 0.014248873549217575
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7052023121387283,
"acc_stderr": 0.02454761779480383,
"acc_norm": 0.7052023121387283,
"acc_norm_stderr": 0.02454761779480383
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.4491620111731844,
"acc_stderr": 0.01663583834163192,
"acc_norm": 0.4491620111731844,
"acc_norm_stderr": 0.01663583834163192
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.6928104575163399,
"acc_stderr": 0.02641560191438898,
"acc_norm": 0.6928104575163399,
"acc_norm_stderr": 0.02641560191438898
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6784565916398714,
"acc_stderr": 0.026527724079528872,
"acc_norm": 0.6784565916398714,
"acc_norm_stderr": 0.026527724079528872
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7006172839506173,
"acc_stderr": 0.025483115601195455,
"acc_norm": 0.7006172839506173,
"acc_norm_stderr": 0.025483115601195455
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.45390070921985815,
"acc_stderr": 0.02970045324729146,
"acc_norm": 0.45390070921985815,
"acc_norm_stderr": 0.02970045324729146
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4256844850065189,
"acc_stderr": 0.012628393551811947,
"acc_norm": 0.4256844850065189,
"acc_norm_stderr": 0.012628393551811947
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6323529411764706,
"acc_stderr": 0.029289413409403192,
"acc_norm": 0.6323529411764706,
"acc_norm_stderr": 0.029289413409403192
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6290849673202614,
"acc_stderr": 0.019542101564854125,
"acc_norm": 0.6290849673202614,
"acc_norm_stderr": 0.019542101564854125
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6727272727272727,
"acc_stderr": 0.0449429086625209,
"acc_norm": 0.6727272727272727,
"acc_norm_stderr": 0.0449429086625209
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7346938775510204,
"acc_stderr": 0.028263889943784596,
"acc_norm": 0.7346938775510204,
"acc_norm_stderr": 0.028263889943784596
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8258706467661692,
"acc_stderr": 0.026814951200421606,
"acc_norm": 0.8258706467661692,
"acc_norm_stderr": 0.026814951200421606
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.83,
"acc_stderr": 0.0377525168068637,
"acc_norm": 0.83,
"acc_norm_stderr": 0.0377525168068637
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5180722891566265,
"acc_stderr": 0.03889951252827216,
"acc_norm": 0.5180722891566265,
"acc_norm_stderr": 0.03889951252827216
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8245614035087719,
"acc_stderr": 0.02917088550072767,
"acc_norm": 0.8245614035087719,
"acc_norm_stderr": 0.02917088550072767
},
"harness|truthfulqa:mc|0": {
"mc1": 0.47368421052631576,
"mc1_stderr": 0.017479241161975526,
"mc2": 0.6319769000319811,
"mc2_stderr": 0.0150681826970418
},
"harness|winogrande|5": {
"acc": 0.7971586424625099,
"acc_stderr": 0.011301439925936662
},
"harness|gsm8k|5": {
"acc": 0.6997725549658832,
"acc_stderr": 0.01262542315228303
}
}
```
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
acloudfan/wikismall-nsp | ---
license: apache-2.0
dataset_info:
features:
- name: id_1
dtype: string
- name: sentence_1
dtype: string
- name: id_2
dtype: string
- name: sentence_2
dtype: string
- name: label
dtype: int64
splits:
- name: train
num_bytes: 4155032
num_examples: 10000
download_size: 2933037
dataset_size: 4155032
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
AnonymousPaperSubmissions/Testing_raw_data | ---
license: mit
---
|
AdapterOcean/med_alpaca_standardized_cluster_59 | ---
dataset_info:
features:
- name: text
dtype: string
- name: conversation_id
dtype: int64
- name: embedding
sequence: float64
- name: cluster
dtype: int64
splits:
- name: train
num_bytes: 75122168
num_examples: 7748
download_size: 21890686
dataset_size: 75122168
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "med_alpaca_standardized_cluster_59"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
EgilKarlsen/PKDD_RoBERTa_FT | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
dataset_info:
features:
- name: '0'
dtype: float32
- name: '1'
dtype: float32
- name: '2'
dtype: float32
- name: '3'
dtype: float32
- name: '4'
dtype: float32
- name: '5'
dtype: float32
- name: '6'
dtype: float32
- name: '7'
dtype: float32
- name: '8'
dtype: float32
- name: '9'
dtype: float32
- name: '10'
dtype: float32
- name: '11'
dtype: float32
- name: '12'
dtype: float32
- name: '13'
dtype: float32
- name: '14'
dtype: float32
- name: '15'
dtype: float32
- name: '16'
dtype: float32
- name: '17'
dtype: float32
- name: '18'
dtype: float32
- name: '19'
dtype: float32
- name: '20'
dtype: float32
- name: '21'
dtype: float32
- name: '22'
dtype: float32
- name: '23'
dtype: float32
- name: '24'
dtype: float32
- name: '25'
dtype: float32
- name: '26'
dtype: float32
- name: '27'
dtype: float32
- name: '28'
dtype: float32
- name: '29'
dtype: float32
- name: '30'
dtype: float32
- name: '31'
dtype: float32
- name: '32'
dtype: float32
- name: '33'
dtype: float32
- name: '34'
dtype: float32
- name: '35'
dtype: float32
- name: '36'
dtype: float32
- name: '37'
dtype: float32
- name: '38'
dtype: float32
- name: '39'
dtype: float32
- name: '40'
dtype: float32
- name: '41'
dtype: float32
- name: '42'
dtype: float32
- name: '43'
dtype: float32
- name: '44'
dtype: float32
- name: '45'
dtype: float32
- name: '46'
dtype: float32
- name: '47'
dtype: float32
- name: '48'
dtype: float32
- name: '49'
dtype: float32
- name: '50'
dtype: float32
- name: '51'
dtype: float32
- name: '52'
dtype: float32
- name: '53'
dtype: float32
- name: '54'
dtype: float32
- name: '55'
dtype: float32
- name: '56'
dtype: float32
- name: '57'
dtype: float32
- name: '58'
dtype: float32
- name: '59'
dtype: float32
- name: '60'
dtype: float32
- name: '61'
dtype: float32
- name: '62'
dtype: float32
- name: '63'
dtype: float32
- name: '64'
dtype: float32
- name: '65'
dtype: float32
- name: '66'
dtype: float32
- name: '67'
dtype: float32
- name: '68'
dtype: float32
- name: '69'
dtype: float32
- name: '70'
dtype: float32
- name: '71'
dtype: float32
- name: '72'
dtype: float32
- name: '73'
dtype: float32
- name: '74'
dtype: float32
- name: '75'
dtype: float32
- name: '76'
dtype: float32
- name: '77'
dtype: float32
- name: '78'
dtype: float32
- name: '79'
dtype: float32
- name: '80'
dtype: float32
- name: '81'
dtype: float32
- name: '82'
dtype: float32
- name: '83'
dtype: float32
- name: '84'
dtype: float32
- name: '85'
dtype: float32
- name: '86'
dtype: float32
- name: '87'
dtype: float32
- name: '88'
dtype: float32
- name: '89'
dtype: float32
- name: '90'
dtype: float32
- name: '91'
dtype: float32
- name: '92'
dtype: float32
- name: '93'
dtype: float32
- name: '94'
dtype: float32
- name: '95'
dtype: float32
- name: '96'
dtype: float32
- name: '97'
dtype: float32
- name: '98'
dtype: float32
- name: '99'
dtype: float32
- name: '100'
dtype: float32
- name: '101'
dtype: float32
- name: '102'
dtype: float32
- name: '103'
dtype: float32
- name: '104'
dtype: float32
- name: '105'
dtype: float32
- name: '106'
dtype: float32
- name: '107'
dtype: float32
- name: '108'
dtype: float32
- name: '109'
dtype: float32
- name: '110'
dtype: float32
- name: '111'
dtype: float32
- name: '112'
dtype: float32
- name: '113'
dtype: float32
- name: '114'
dtype: float32
- name: '115'
dtype: float32
- name: '116'
dtype: float32
- name: '117'
dtype: float32
- name: '118'
dtype: float32
- name: '119'
dtype: float32
- name: '120'
dtype: float32
- name: '121'
dtype: float32
- name: '122'
dtype: float32
- name: '123'
dtype: float32
- name: '124'
dtype: float32
- name: '125'
dtype: float32
- name: '126'
dtype: float32
- name: '127'
dtype: float32
- name: '128'
dtype: float32
- name: '129'
dtype: float32
- name: '130'
dtype: float32
- name: '131'
dtype: float32
- name: '132'
dtype: float32
- name: '133'
dtype: float32
- name: '134'
dtype: float32
- name: '135'
dtype: float32
- name: '136'
dtype: float32
- name: '137'
dtype: float32
- name: '138'
dtype: float32
- name: '139'
dtype: float32
- name: '140'
dtype: float32
- name: '141'
dtype: float32
- name: '142'
dtype: float32
- name: '143'
dtype: float32
- name: '144'
dtype: float32
- name: '145'
dtype: float32
- name: '146'
dtype: float32
- name: '147'
dtype: float32
- name: '148'
dtype: float32
- name: '149'
dtype: float32
- name: '150'
dtype: float32
- name: '151'
dtype: float32
- name: '152'
dtype: float32
- name: '153'
dtype: float32
- name: '154'
dtype: float32
- name: '155'
dtype: float32
- name: '156'
dtype: float32
- name: '157'
dtype: float32
- name: '158'
dtype: float32
- name: '159'
dtype: float32
- name: '160'
dtype: float32
- name: '161'
dtype: float32
- name: '162'
dtype: float32
- name: '163'
dtype: float32
- name: '164'
dtype: float32
- name: '165'
dtype: float32
- name: '166'
dtype: float32
- name: '167'
dtype: float32
- name: '168'
dtype: float32
- name: '169'
dtype: float32
- name: '170'
dtype: float32
- name: '171'
dtype: float32
- name: '172'
dtype: float32
- name: '173'
dtype: float32
- name: '174'
dtype: float32
- name: '175'
dtype: float32
- name: '176'
dtype: float32
- name: '177'
dtype: float32
- name: '178'
dtype: float32
- name: '179'
dtype: float32
- name: '180'
dtype: float32
- name: '181'
dtype: float32
- name: '182'
dtype: float32
- name: '183'
dtype: float32
- name: '184'
dtype: float32
- name: '185'
dtype: float32
- name: '186'
dtype: float32
- name: '187'
dtype: float32
- name: '188'
dtype: float32
- name: '189'
dtype: float32
- name: '190'
dtype: float32
- name: '191'
dtype: float32
- name: '192'
dtype: float32
- name: '193'
dtype: float32
- name: '194'
dtype: float32
- name: '195'
dtype: float32
- name: '196'
dtype: float32
- name: '197'
dtype: float32
- name: '198'
dtype: float32
- name: '199'
dtype: float32
- name: '200'
dtype: float32
- name: '201'
dtype: float32
- name: '202'
dtype: float32
- name: '203'
dtype: float32
- name: '204'
dtype: float32
- name: '205'
dtype: float32
- name: '206'
dtype: float32
- name: '207'
dtype: float32
- name: '208'
dtype: float32
- name: '209'
dtype: float32
- name: '210'
dtype: float32
- name: '211'
dtype: float32
- name: '212'
dtype: float32
- name: '213'
dtype: float32
- name: '214'
dtype: float32
- name: '215'
dtype: float32
- name: '216'
dtype: float32
- name: '217'
dtype: float32
- name: '218'
dtype: float32
- name: '219'
dtype: float32
- name: '220'
dtype: float32
- name: '221'
dtype: float32
- name: '222'
dtype: float32
- name: '223'
dtype: float32
- name: '224'
dtype: float32
- name: '225'
dtype: float32
- name: '226'
dtype: float32
- name: '227'
dtype: float32
- name: '228'
dtype: float32
- name: '229'
dtype: float32
- name: '230'
dtype: float32
- name: '231'
dtype: float32
- name: '232'
dtype: float32
- name: '233'
dtype: float32
- name: '234'
dtype: float32
- name: '235'
dtype: float32
- name: '236'
dtype: float32
- name: '237'
dtype: float32
- name: '238'
dtype: float32
- name: '239'
dtype: float32
- name: '240'
dtype: float32
- name: '241'
dtype: float32
- name: '242'
dtype: float32
- name: '243'
dtype: float32
- name: '244'
dtype: float32
- name: '245'
dtype: float32
- name: '246'
dtype: float32
- name: '247'
dtype: float32
- name: '248'
dtype: float32
- name: '249'
dtype: float32
- name: '250'
dtype: float32
- name: '251'
dtype: float32
- name: '252'
dtype: float32
- name: '253'
dtype: float32
- name: '254'
dtype: float32
- name: '255'
dtype: float32
- name: '256'
dtype: float32
- name: '257'
dtype: float32
- name: '258'
dtype: float32
- name: '259'
dtype: float32
- name: '260'
dtype: float32
- name: '261'
dtype: float32
- name: '262'
dtype: float32
- name: '263'
dtype: float32
- name: '264'
dtype: float32
- name: '265'
dtype: float32
- name: '266'
dtype: float32
- name: '267'
dtype: float32
- name: '268'
dtype: float32
- name: '269'
dtype: float32
- name: '270'
dtype: float32
- name: '271'
dtype: float32
- name: '272'
dtype: float32
- name: '273'
dtype: float32
- name: '274'
dtype: float32
- name: '275'
dtype: float32
- name: '276'
dtype: float32
- name: '277'
dtype: float32
- name: '278'
dtype: float32
- name: '279'
dtype: float32
- name: '280'
dtype: float32
- name: '281'
dtype: float32
- name: '282'
dtype: float32
- name: '283'
dtype: float32
- name: '284'
dtype: float32
- name: '285'
dtype: float32
- name: '286'
dtype: float32
- name: '287'
dtype: float32
- name: '288'
dtype: float32
- name: '289'
dtype: float32
- name: '290'
dtype: float32
- name: '291'
dtype: float32
- name: '292'
dtype: float32
- name: '293'
dtype: float32
- name: '294'
dtype: float32
- name: '295'
dtype: float32
- name: '296'
dtype: float32
- name: '297'
dtype: float32
- name: '298'
dtype: float32
- name: '299'
dtype: float32
- name: '300'
dtype: float32
- name: '301'
dtype: float32
- name: '302'
dtype: float32
- name: '303'
dtype: float32
- name: '304'
dtype: float32
- name: '305'
dtype: float32
- name: '306'
dtype: float32
- name: '307'
dtype: float32
- name: '308'
dtype: float32
- name: '309'
dtype: float32
- name: '310'
dtype: float32
- name: '311'
dtype: float32
- name: '312'
dtype: float32
- name: '313'
dtype: float32
- name: '314'
dtype: float32
- name: '315'
dtype: float32
- name: '316'
dtype: float32
- name: '317'
dtype: float32
- name: '318'
dtype: float32
- name: '319'
dtype: float32
- name: '320'
dtype: float32
- name: '321'
dtype: float32
- name: '322'
dtype: float32
- name: '323'
dtype: float32
- name: '324'
dtype: float32
- name: '325'
dtype: float32
- name: '326'
dtype: float32
- name: '327'
dtype: float32
- name: '328'
dtype: float32
- name: '329'
dtype: float32
- name: '330'
dtype: float32
- name: '331'
dtype: float32
- name: '332'
dtype: float32
- name: '333'
dtype: float32
- name: '334'
dtype: float32
- name: '335'
dtype: float32
- name: '336'
dtype: float32
- name: '337'
dtype: float32
- name: '338'
dtype: float32
- name: '339'
dtype: float32
- name: '340'
dtype: float32
- name: '341'
dtype: float32
- name: '342'
dtype: float32
- name: '343'
dtype: float32
- name: '344'
dtype: float32
- name: '345'
dtype: float32
- name: '346'
dtype: float32
- name: '347'
dtype: float32
- name: '348'
dtype: float32
- name: '349'
dtype: float32
- name: '350'
dtype: float32
- name: '351'
dtype: float32
- name: '352'
dtype: float32
- name: '353'
dtype: float32
- name: '354'
dtype: float32
- name: '355'
dtype: float32
- name: '356'
dtype: float32
- name: '357'
dtype: float32
- name: '358'
dtype: float32
- name: '359'
dtype: float32
- name: '360'
dtype: float32
- name: '361'
dtype: float32
- name: '362'
dtype: float32
- name: '363'
dtype: float32
- name: '364'
dtype: float32
- name: '365'
dtype: float32
- name: '366'
dtype: float32
- name: '367'
dtype: float32
- name: '368'
dtype: float32
- name: '369'
dtype: float32
- name: '370'
dtype: float32
- name: '371'
dtype: float32
- name: '372'
dtype: float32
- name: '373'
dtype: float32
- name: '374'
dtype: float32
- name: '375'
dtype: float32
- name: '376'
dtype: float32
- name: '377'
dtype: float32
- name: '378'
dtype: float32
- name: '379'
dtype: float32
- name: '380'
dtype: float32
- name: '381'
dtype: float32
- name: '382'
dtype: float32
- name: '383'
dtype: float32
- name: '384'
dtype: float32
- name: '385'
dtype: float32
- name: '386'
dtype: float32
- name: '387'
dtype: float32
- name: '388'
dtype: float32
- name: '389'
dtype: float32
- name: '390'
dtype: float32
- name: '391'
dtype: float32
- name: '392'
dtype: float32
- name: '393'
dtype: float32
- name: '394'
dtype: float32
- name: '395'
dtype: float32
- name: '396'
dtype: float32
- name: '397'
dtype: float32
- name: '398'
dtype: float32
- name: '399'
dtype: float32
- name: '400'
dtype: float32
- name: '401'
dtype: float32
- name: '402'
dtype: float32
- name: '403'
dtype: float32
- name: '404'
dtype: float32
- name: '405'
dtype: float32
- name: '406'
dtype: float32
- name: '407'
dtype: float32
- name: '408'
dtype: float32
- name: '409'
dtype: float32
- name: '410'
dtype: float32
- name: '411'
dtype: float32
- name: '412'
dtype: float32
- name: '413'
dtype: float32
- name: '414'
dtype: float32
- name: '415'
dtype: float32
- name: '416'
dtype: float32
- name: '417'
dtype: float32
- name: '418'
dtype: float32
- name: '419'
dtype: float32
- name: '420'
dtype: float32
- name: '421'
dtype: float32
- name: '422'
dtype: float32
- name: '423'
dtype: float32
- name: '424'
dtype: float32
- name: '425'
dtype: float32
- name: '426'
dtype: float32
- name: '427'
dtype: float32
- name: '428'
dtype: float32
- name: '429'
dtype: float32
- name: '430'
dtype: float32
- name: '431'
dtype: float32
- name: '432'
dtype: float32
- name: '433'
dtype: float32
- name: '434'
dtype: float32
- name: '435'
dtype: float32
- name: '436'
dtype: float32
- name: '437'
dtype: float32
- name: '438'
dtype: float32
- name: '439'
dtype: float32
- name: '440'
dtype: float32
- name: '441'
dtype: float32
- name: '442'
dtype: float32
- name: '443'
dtype: float32
- name: '444'
dtype: float32
- name: '445'
dtype: float32
- name: '446'
dtype: float32
- name: '447'
dtype: float32
- name: '448'
dtype: float32
- name: '449'
dtype: float32
- name: '450'
dtype: float32
- name: '451'
dtype: float32
- name: '452'
dtype: float32
- name: '453'
dtype: float32
- name: '454'
dtype: float32
- name: '455'
dtype: float32
- name: '456'
dtype: float32
- name: '457'
dtype: float32
- name: '458'
dtype: float32
- name: '459'
dtype: float32
- name: '460'
dtype: float32
- name: '461'
dtype: float32
- name: '462'
dtype: float32
- name: '463'
dtype: float32
- name: '464'
dtype: float32
- name: '465'
dtype: float32
- name: '466'
dtype: float32
- name: '467'
dtype: float32
- name: '468'
dtype: float32
- name: '469'
dtype: float32
- name: '470'
dtype: float32
- name: '471'
dtype: float32
- name: '472'
dtype: float32
- name: '473'
dtype: float32
- name: '474'
dtype: float32
- name: '475'
dtype: float32
- name: '476'
dtype: float32
- name: '477'
dtype: float32
- name: '478'
dtype: float32
- name: '479'
dtype: float32
- name: '480'
dtype: float32
- name: '481'
dtype: float32
- name: '482'
dtype: float32
- name: '483'
dtype: float32
- name: '484'
dtype: float32
- name: '485'
dtype: float32
- name: '486'
dtype: float32
- name: '487'
dtype: float32
- name: '488'
dtype: float32
- name: '489'
dtype: float32
- name: '490'
dtype: float32
- name: '491'
dtype: float32
- name: '492'
dtype: float32
- name: '493'
dtype: float32
- name: '494'
dtype: float32
- name: '495'
dtype: float32
- name: '496'
dtype: float32
- name: '497'
dtype: float32
- name: '498'
dtype: float32
- name: '499'
dtype: float32
- name: '500'
dtype: float32
- name: '501'
dtype: float32
- name: '502'
dtype: float32
- name: '503'
dtype: float32
- name: '504'
dtype: float32
- name: '505'
dtype: float32
- name: '506'
dtype: float32
- name: '507'
dtype: float32
- name: '508'
dtype: float32
- name: '509'
dtype: float32
- name: '510'
dtype: float32
- name: '511'
dtype: float32
- name: '512'
dtype: float32
- name: '513'
dtype: float32
- name: '514'
dtype: float32
- name: '515'
dtype: float32
- name: '516'
dtype: float32
- name: '517'
dtype: float32
- name: '518'
dtype: float32
- name: '519'
dtype: float32
- name: '520'
dtype: float32
- name: '521'
dtype: float32
- name: '522'
dtype: float32
- name: '523'
dtype: float32
- name: '524'
dtype: float32
- name: '525'
dtype: float32
- name: '526'
dtype: float32
- name: '527'
dtype: float32
- name: '528'
dtype: float32
- name: '529'
dtype: float32
- name: '530'
dtype: float32
- name: '531'
dtype: float32
- name: '532'
dtype: float32
- name: '533'
dtype: float32
- name: '534'
dtype: float32
- name: '535'
dtype: float32
- name: '536'
dtype: float32
- name: '537'
dtype: float32
- name: '538'
dtype: float32
- name: '539'
dtype: float32
- name: '540'
dtype: float32
- name: '541'
dtype: float32
- name: '542'
dtype: float32
- name: '543'
dtype: float32
- name: '544'
dtype: float32
- name: '545'
dtype: float32
- name: '546'
dtype: float32
- name: '547'
dtype: float32
- name: '548'
dtype: float32
- name: '549'
dtype: float32
- name: '550'
dtype: float32
- name: '551'
dtype: float32
- name: '552'
dtype: float32
- name: '553'
dtype: float32
- name: '554'
dtype: float32
- name: '555'
dtype: float32
- name: '556'
dtype: float32
- name: '557'
dtype: float32
- name: '558'
dtype: float32
- name: '559'
dtype: float32
- name: '560'
dtype: float32
- name: '561'
dtype: float32
- name: '562'
dtype: float32
- name: '563'
dtype: float32
- name: '564'
dtype: float32
- name: '565'
dtype: float32
- name: '566'
dtype: float32
- name: '567'
dtype: float32
- name: '568'
dtype: float32
- name: '569'
dtype: float32
- name: '570'
dtype: float32
- name: '571'
dtype: float32
- name: '572'
dtype: float32
- name: '573'
dtype: float32
- name: '574'
dtype: float32
- name: '575'
dtype: float32
- name: '576'
dtype: float32
- name: '577'
dtype: float32
- name: '578'
dtype: float32
- name: '579'
dtype: float32
- name: '580'
dtype: float32
- name: '581'
dtype: float32
- name: '582'
dtype: float32
- name: '583'
dtype: float32
- name: '584'
dtype: float32
- name: '585'
dtype: float32
- name: '586'
dtype: float32
- name: '587'
dtype: float32
- name: '588'
dtype: float32
- name: '589'
dtype: float32
- name: '590'
dtype: float32
- name: '591'
dtype: float32
- name: '592'
dtype: float32
- name: '593'
dtype: float32
- name: '594'
dtype: float32
- name: '595'
dtype: float32
- name: '596'
dtype: float32
- name: '597'
dtype: float32
- name: '598'
dtype: float32
- name: '599'
dtype: float32
- name: '600'
dtype: float32
- name: '601'
dtype: float32
- name: '602'
dtype: float32
- name: '603'
dtype: float32
- name: '604'
dtype: float32
- name: '605'
dtype: float32
- name: '606'
dtype: float32
- name: '607'
dtype: float32
- name: '608'
dtype: float32
- name: '609'
dtype: float32
- name: '610'
dtype: float32
- name: '611'
dtype: float32
- name: '612'
dtype: float32
- name: '613'
dtype: float32
- name: '614'
dtype: float32
- name: '615'
dtype: float32
- name: '616'
dtype: float32
- name: '617'
dtype: float32
- name: '618'
dtype: float32
- name: '619'
dtype: float32
- name: '620'
dtype: float32
- name: '621'
dtype: float32
- name: '622'
dtype: float32
- name: '623'
dtype: float32
- name: '624'
dtype: float32
- name: '625'
dtype: float32
- name: '626'
dtype: float32
- name: '627'
dtype: float32
- name: '628'
dtype: float32
- name: '629'
dtype: float32
- name: '630'
dtype: float32
- name: '631'
dtype: float32
- name: '632'
dtype: float32
- name: '633'
dtype: float32
- name: '634'
dtype: float32
- name: '635'
dtype: float32
- name: '636'
dtype: float32
- name: '637'
dtype: float32
- name: '638'
dtype: float32
- name: '639'
dtype: float32
- name: '640'
dtype: float32
- name: '641'
dtype: float32
- name: '642'
dtype: float32
- name: '643'
dtype: float32
- name: '644'
dtype: float32
- name: '645'
dtype: float32
- name: '646'
dtype: float32
- name: '647'
dtype: float32
- name: '648'
dtype: float32
- name: '649'
dtype: float32
- name: '650'
dtype: float32
- name: '651'
dtype: float32
- name: '652'
dtype: float32
- name: '653'
dtype: float32
- name: '654'
dtype: float32
- name: '655'
dtype: float32
- name: '656'
dtype: float32
- name: '657'
dtype: float32
- name: '658'
dtype: float32
- name: '659'
dtype: float32
- name: '660'
dtype: float32
- name: '661'
dtype: float32
- name: '662'
dtype: float32
- name: '663'
dtype: float32
- name: '664'
dtype: float32
- name: '665'
dtype: float32
- name: '666'
dtype: float32
- name: '667'
dtype: float32
- name: '668'
dtype: float32
- name: '669'
dtype: float32
- name: '670'
dtype: float32
- name: '671'
dtype: float32
- name: '672'
dtype: float32
- name: '673'
dtype: float32
- name: '674'
dtype: float32
- name: '675'
dtype: float32
- name: '676'
dtype: float32
- name: '677'
dtype: float32
- name: '678'
dtype: float32
- name: '679'
dtype: float32
- name: '680'
dtype: float32
- name: '681'
dtype: float32
- name: '682'
dtype: float32
- name: '683'
dtype: float32
- name: '684'
dtype: float32
- name: '685'
dtype: float32
- name: '686'
dtype: float32
- name: '687'
dtype: float32
- name: '688'
dtype: float32
- name: '689'
dtype: float32
- name: '690'
dtype: float32
- name: '691'
dtype: float32
- name: '692'
dtype: float32
- name: '693'
dtype: float32
- name: '694'
dtype: float32
- name: '695'
dtype: float32
- name: '696'
dtype: float32
- name: '697'
dtype: float32
- name: '698'
dtype: float32
- name: '699'
dtype: float32
- name: '700'
dtype: float32
- name: '701'
dtype: float32
- name: '702'
dtype: float32
- name: '703'
dtype: float32
- name: '704'
dtype: float32
- name: '705'
dtype: float32
- name: '706'
dtype: float32
- name: '707'
dtype: float32
- name: '708'
dtype: float32
- name: '709'
dtype: float32
- name: '710'
dtype: float32
- name: '711'
dtype: float32
- name: '712'
dtype: float32
- name: '713'
dtype: float32
- name: '714'
dtype: float32
- name: '715'
dtype: float32
- name: '716'
dtype: float32
- name: '717'
dtype: float32
- name: '718'
dtype: float32
- name: '719'
dtype: float32
- name: '720'
dtype: float32
- name: '721'
dtype: float32
- name: '722'
dtype: float32
- name: '723'
dtype: float32
- name: '724'
dtype: float32
- name: '725'
dtype: float32
- name: '726'
dtype: float32
- name: '727'
dtype: float32
- name: '728'
dtype: float32
- name: '729'
dtype: float32
- name: '730'
dtype: float32
- name: '731'
dtype: float32
- name: '732'
dtype: float32
- name: '733'
dtype: float32
- name: '734'
dtype: float32
- name: '735'
dtype: float32
- name: '736'
dtype: float32
- name: '737'
dtype: float32
- name: '738'
dtype: float32
- name: '739'
dtype: float32
- name: '740'
dtype: float32
- name: '741'
dtype: float32
- name: '742'
dtype: float32
- name: '743'
dtype: float32
- name: '744'
dtype: float32
- name: '745'
dtype: float32
- name: '746'
dtype: float32
- name: '747'
dtype: float32
- name: '748'
dtype: float32
- name: '749'
dtype: float32
- name: '750'
dtype: float32
- name: '751'
dtype: float32
- name: '752'
dtype: float32
- name: '753'
dtype: float32
- name: '754'
dtype: float32
- name: '755'
dtype: float32
- name: '756'
dtype: float32
- name: '757'
dtype: float32
- name: '758'
dtype: float32
- name: '759'
dtype: float32
- name: '760'
dtype: float32
- name: '761'
dtype: float32
- name: '762'
dtype: float32
- name: '763'
dtype: float32
- name: '764'
dtype: float32
- name: '765'
dtype: float32
- name: '766'
dtype: float32
- name: '767'
dtype: float32
- name: label
dtype: string
splits:
- name: train
num_bytes: 115608885
num_examples: 37500
- name: test
num_bytes: 38536331
num_examples: 12500
download_size: 211881037
dataset_size: 154145216
---
# Dataset Card for "PKDD_RoBERTa_FT"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
robertritz/mongolian_news | ---
dataset_info:
features:
- name: title
dtype: string
- name: content
dtype: string
splits:
- name: train
num_bytes: 678934529
num_examples: 136049
download_size: 302886208
dataset_size: 678934529
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
task_categories:
- summarization
language:
- mn
pretty_name: Online Mongolian News
size_categories:
- 100K<n<1M
---
# Online Mongolian News Dataset
This dataset was scraped from an online news portal in Mongolia. It contains news stories and their headlines. It is well suited to a summarization task (generating headlines from story content), as in the sketch below.
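A minimal, illustrative sketch (not part of the original card) of setting up that headline-generation task, assuming the `train` split and the `title`/`content` fields listed in the metadata above:

```python
from datasets import load_dataset

# Load the news articles and turn each (content, title) pair into a
# (document, summary) pair for a standard summarization setup.
ds = load_dataset("robertritz/mongolian_news", split="train")

def to_pairs(example):
    return {"document": example["content"], "summary": example["title"]}

pairs = ds.map(to_pairs, remove_columns=["title", "content"])
print(pairs[0]["summary"])  # the headline acts as the reference summary
```

From there, any seq2seq summarization recipe can consume the `document`/`summary` columns. |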
Pav17/T3-gen-dataset-2 | ---
dataset_info:
features:
- name: task_id
dtype: int32
- name: text
dtype: string
- name: code
dtype: string
- name: test_list
sequence: string
- name: test_setup_code
dtype: string
- name: challenge_test_list
sequence: string
- name: input
dtype: string
splits:
- name: train
num_bytes: 573358
num_examples: 374
- name: test
num_bytes: 778786
num_examples: 500
- name: validation
num_bytes: 137424
num_examples: 90
- name: prompt
num_bytes: 15176
num_examples: 10
download_size: 437403
dataset_size: 1504744
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
- split: validation
path: data/validation-*
- split: prompt
path: data/prompt-*
---
|
mohammadhossein/SemEvalTask8_SubTaskAMono | ---
dataset_info:
features:
- name: text
dtype: string
- name: label
dtype: int64
- name: model
dtype: string
- name: source
dtype: string
- name: id
dtype: int64
splits:
- name: train
num_bytes: 339239865
num_examples: 119757
- name: dev
num_bytes: 10543757
num_examples: 5000
download_size: 193819342
dataset_size: 349783622
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: dev
path: data/dev-*
---
|
Lollitor/FineTuneDatasetProtein | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
dataset_info:
features:
- name: -logKd/Ki
dtype: float64
- name: inputs
dtype: string
- name: __index_level_0__
dtype: int64
splits:
- name: train
num_bytes: 7703267
num_examples: 11063
- name: validation
num_bytes: 852750
num_examples: 1230
download_size: 4201529
dataset_size: 8556017
---
# Dataset Card for "FineTuneDatasetProtein"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
RuiqianLi/Li_singlish | ---
license: apache-2.0
---
training dataset:
Dataset({
features: ['id', 'audio', 'file', 'text'],
num_rows: 2700
})
{'id': '0',
'audio': {'path': '/root/.cache/huggingface/datasets/downloads/extracted/73016598ed29609d09a2c3c087d4e70e73dc549331efa2117aa6ec012d1ace35/singlish/train/0.wav', 'array': array([-9.1552734e-05, 2.7465820e-04, 8.2397461e-04, ...,
-1.3732910e-03, -3.9672852e-04, -7.6293945e-04], dtype=float32), 'sampling_rate': 16000},
'text': 'a group of boys then challenged him to climb over the railing and stand on the parapet below',
'file': '/root/.cache/huggingface/datasets/downloads/extracted/73016598ed29609d09a2c3c087d4e70e73dc549331efa2117aa6ec012d1ace35/singlish/train/0.wav'
}
<class 'datasets.arrow_dataset.Dataset'>
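A rough loading sketch (an assumption, not from the original card — it presumes the repository loads directly with the `datasets` library and exposes the `train` split shown above):

```python
from datasets import load_dataset, Audio

# Hypothetical loading sketch; the repo id comes from this card, but whether it
# loads without a custom script is an assumption.
ds = load_dataset("RuiqianLi/Li_singlish", split="train")

# Decode audio at the 16 kHz rate shown in the example record above.
ds = ds.cast_column("audio", Audio(sampling_rate=16_000))

sample = ds[0]
print(sample["text"])                    # transcript
print(sample["audio"]["sampling_rate"])  # 16000
print(len(sample["audio"]["array"]))     # number of waveform samples
```

Each record pairs a 16 kHz waveform with its transcript, which is the shape most ASR fine-tuning recipes expect. |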
rjac/all-the-news-2-1-Component-one | ---
annotations_creators:
- Andrew Thompson
- components.one
language:
- en
---
# 2.7 million news articles and essays
## Table of Contents
- [Dataset Description](#dataset-description)
## Dataset Description
2.7 million news articles and essays from 27 American publications. Includes date, title, publication, article text, publication name, year, month, and URL (for some). Articles mostly span from 2016 to early 2020.
- Type: CSV
- Size: 3.4 GB compressed, 8.8 GB uncompressed
- Created by: Andrew Thompson
- Date added: 4/3/2020
- Date modified: 4/3/2020
- Source: [Components One dataset (2.7 million articles)](https://components.one/datasets/all-the-news-2-news-articles-dataset)
- Date downloaded and processed: 19/6/2022
- The header was modified to use the respective column names
- Row number 2,324,812 was removed (a chunked-read sketch follows below)
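A hypothetical sketch of working with a CSV of this size (the filename and the `article` column name below are assumptions, not part of the card) — reading in chunks keeps memory bounded:

```python
import pandas as pd

# Stream the ~8.8 GB CSV in chunks rather than loading it all at once.
# "all-the-news-2-1.csv" and the "article" column name are assumed here;
# check the actual header of the downloaded file.
n_rows = 0
for chunk in pd.read_csv("all-the-news-2-1.csv", chunksize=100_000):
    chunk = chunk.dropna(subset=["article"])  # keep rows that have a body
    n_rows += len(chunk)

print(n_rows)
```

This is only an illustration of chunked reading; adapt the column names to the modified header described above. |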
manhvh2601/DATN_20191956_Train | ---
license: apache-2.0
dataset_info:
features:
- name: STT
dtype: int64
- name: Name
dtype: string
- name: Audio
dtype: audio
- name: Text
dtype: string
splits:
- name: train
num_bytes: 271741516.75
num_examples: 1675
download_size: 238306910
dataset_size: 271741516.75
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
open-llm-leaderboard/details_perlthoughts__Falkor-8x7B-MoE | ---
pretty_name: Evaluation run of perlthoughts/Falkor-8x7B-MoE
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [perlthoughts/Falkor-8x7B-MoE](https://huggingface.co/perlthoughts/Falkor-8x7B-MoE)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configuration, each one coresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_perlthoughts__Falkor-8x7B-MoE\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-12-16T21:48:58.361135](https://huggingface.co/datasets/open-llm-leaderboard/details_perlthoughts__Falkor-8x7B-MoE/blob/main/results_2023-12-16T21-48-58.361135.json)(note\
\ that their might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6436792323952398,\n\
\ \"acc_stderr\": 0.0322025695700015,\n \"acc_norm\": 0.6451930910788192,\n\
\ \"acc_norm_stderr\": 0.03285268737187084,\n \"mc1\": 0.36474908200734396,\n\
\ \"mc1_stderr\": 0.01685096106172012,\n \"mc2\": 0.5350238552317648,\n\
\ \"mc2_stderr\": 0.015383683041808177\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.6356655290102389,\n \"acc_stderr\": 0.014063260279882419,\n\
\ \"acc_norm\": 0.6629692832764505,\n \"acc_norm_stderr\": 0.013813476652902276\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6665006970722963,\n\
\ \"acc_stderr\": 0.004704996294145036,\n \"acc_norm\": 0.8503286197968533,\n\
\ \"acc_norm_stderr\": 0.0035601991854865575\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.36,\n \"acc_stderr\": 0.04824181513244218,\n \
\ \"acc_norm\": 0.36,\n \"acc_norm_stderr\": 0.04824181513244218\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6370370370370371,\n\
\ \"acc_stderr\": 0.04153948404742398,\n \"acc_norm\": 0.6370370370370371,\n\
\ \"acc_norm_stderr\": 0.04153948404742398\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.6973684210526315,\n \"acc_stderr\": 0.037385206761196686,\n\
\ \"acc_norm\": 0.6973684210526315,\n \"acc_norm_stderr\": 0.037385206761196686\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.56,\n\
\ \"acc_stderr\": 0.04988876515698589,\n \"acc_norm\": 0.56,\n \
\ \"acc_norm_stderr\": 0.04988876515698589\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.7358490566037735,\n \"acc_stderr\": 0.027134291628741702,\n\
\ \"acc_norm\": 0.7358490566037735,\n \"acc_norm_stderr\": 0.027134291628741702\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7847222222222222,\n\
\ \"acc_stderr\": 0.03437079344106135,\n \"acc_norm\": 0.7847222222222222,\n\
\ \"acc_norm_stderr\": 0.03437079344106135\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.45,\n \"acc_stderr\": 0.05,\n \"acc_norm\"\
: 0.45,\n \"acc_norm_stderr\": 0.05\n },\n \"harness|hendrycksTest-college_computer_science|5\"\
: {\n \"acc\": 0.52,\n \"acc_stderr\": 0.050211673156867795,\n \
\ \"acc_norm\": 0.52,\n \"acc_norm_stderr\": 0.050211673156867795\n \
\ },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\"\
: 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.3,\n\
\ \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-college_medicine|5\"\
: {\n \"acc\": 0.6705202312138728,\n \"acc_stderr\": 0.03583901754736412,\n\
\ \"acc_norm\": 0.6705202312138728,\n \"acc_norm_stderr\": 0.03583901754736412\n\
\ },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.37254901960784315,\n\
\ \"acc_stderr\": 0.048108401480826346,\n \"acc_norm\": 0.37254901960784315,\n\
\ \"acc_norm_stderr\": 0.048108401480826346\n },\n \"harness|hendrycksTest-computer_security|5\"\
: {\n \"acc\": 0.75,\n \"acc_stderr\": 0.04351941398892446,\n \
\ \"acc_norm\": 0.75,\n \"acc_norm_stderr\": 0.04351941398892446\n \
\ },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.5446808510638298,\n\
\ \"acc_stderr\": 0.03255525359340355,\n \"acc_norm\": 0.5446808510638298,\n\
\ \"acc_norm_stderr\": 0.03255525359340355\n },\n \"harness|hendrycksTest-econometrics|5\"\
: {\n \"acc\": 0.4649122807017544,\n \"acc_stderr\": 0.046920083813689104,\n\
\ \"acc_norm\": 0.4649122807017544,\n \"acc_norm_stderr\": 0.046920083813689104\n\
\ },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\"\
: 0.5793103448275863,\n \"acc_stderr\": 0.0411391498118926,\n \"acc_norm\"\
: 0.5793103448275863,\n \"acc_norm_stderr\": 0.0411391498118926\n },\n\
\ \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.3915343915343915,\n\
\ \"acc_stderr\": 0.02513809138885111,\n \"acc_norm\": 0.3915343915343915,\n\
\ \"acc_norm_stderr\": 0.02513809138885111\n },\n \"harness|hendrycksTest-formal_logic|5\"\
: {\n \"acc\": 0.42063492063492064,\n \"acc_stderr\": 0.04415438226743744,\n\
\ \"acc_norm\": 0.42063492063492064,\n \"acc_norm_stderr\": 0.04415438226743744\n\
\ },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.36,\n\
\ \"acc_stderr\": 0.04824181513244218,\n \"acc_norm\": 0.36,\n \
\ \"acc_norm_stderr\": 0.04824181513244218\n },\n \"harness|hendrycksTest-high_school_biology|5\"\
: {\n \"acc\": 0.7870967741935484,\n \"acc_stderr\": 0.02328766512726853,\n\
\ \"acc_norm\": 0.7870967741935484,\n \"acc_norm_stderr\": 0.02328766512726853\n\
\ },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\"\
: 0.541871921182266,\n \"acc_stderr\": 0.03505630140785741,\n \"acc_norm\"\
: 0.541871921182266,\n \"acc_norm_stderr\": 0.03505630140785741\n },\n\
\ \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\"\
: 0.67,\n \"acc_stderr\": 0.04725815626252607,\n \"acc_norm\": 0.67,\n\
\ \"acc_norm_stderr\": 0.04725815626252607\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7818181818181819,\n \"acc_stderr\": 0.03225078108306289,\n\
\ \"acc_norm\": 0.7818181818181819,\n \"acc_norm_stderr\": 0.03225078108306289\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.8080808080808081,\n \"acc_stderr\": 0.02805779167298902,\n \"\
acc_norm\": 0.8080808080808081,\n \"acc_norm_stderr\": 0.02805779167298902\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.8963730569948186,\n \"acc_stderr\": 0.02199531196364424,\n\
\ \"acc_norm\": 0.8963730569948186,\n \"acc_norm_stderr\": 0.02199531196364424\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.6538461538461539,\n \"acc_stderr\": 0.024121125416941197,\n\
\ \"acc_norm\": 0.6538461538461539,\n \"acc_norm_stderr\": 0.024121125416941197\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.3333333333333333,\n \"acc_stderr\": 0.028742040903948485,\n \
\ \"acc_norm\": 0.3333333333333333,\n \"acc_norm_stderr\": 0.028742040903948485\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.7142857142857143,\n \"acc_stderr\": 0.029344572500634353,\n\
\ \"acc_norm\": 0.7142857142857143,\n \"acc_norm_stderr\": 0.029344572500634353\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.33112582781456956,\n \"acc_stderr\": 0.038425817186598696,\n \"\
acc_norm\": 0.33112582781456956,\n \"acc_norm_stderr\": 0.038425817186598696\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8311926605504587,\n \"acc_stderr\": 0.01606005626853033,\n \"\
acc_norm\": 0.8311926605504587,\n \"acc_norm_stderr\": 0.01606005626853033\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.5138888888888888,\n \"acc_stderr\": 0.03408655867977749,\n \"\
acc_norm\": 0.5138888888888888,\n \"acc_norm_stderr\": 0.03408655867977749\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.7794117647058824,\n \"acc_stderr\": 0.02910225438967408,\n \"\
acc_norm\": 0.7794117647058824,\n \"acc_norm_stderr\": 0.02910225438967408\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.810126582278481,\n \"acc_stderr\": 0.025530100460233494,\n \
\ \"acc_norm\": 0.810126582278481,\n \"acc_norm_stderr\": 0.025530100460233494\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.695067264573991,\n\
\ \"acc_stderr\": 0.030898610882477515,\n \"acc_norm\": 0.695067264573991,\n\
\ \"acc_norm_stderr\": 0.030898610882477515\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.7557251908396947,\n \"acc_stderr\": 0.03768335959728744,\n\
\ \"acc_norm\": 0.7557251908396947,\n \"acc_norm_stderr\": 0.03768335959728744\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.7768595041322314,\n \"acc_stderr\": 0.03800754475228732,\n \"\
acc_norm\": 0.7768595041322314,\n \"acc_norm_stderr\": 0.03800754475228732\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7685185185185185,\n\
\ \"acc_stderr\": 0.04077494709252627,\n \"acc_norm\": 0.7685185185185185,\n\
\ \"acc_norm_stderr\": 0.04077494709252627\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.754601226993865,\n \"acc_stderr\": 0.03380939813943354,\n\
\ \"acc_norm\": 0.754601226993865,\n \"acc_norm_stderr\": 0.03380939813943354\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.4732142857142857,\n\
\ \"acc_stderr\": 0.047389751192741546,\n \"acc_norm\": 0.4732142857142857,\n\
\ \"acc_norm_stderr\": 0.047389751192741546\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7961165048543689,\n \"acc_stderr\": 0.0398913985953177,\n\
\ \"acc_norm\": 0.7961165048543689,\n \"acc_norm_stderr\": 0.0398913985953177\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8717948717948718,\n\
\ \"acc_stderr\": 0.021901905115073325,\n \"acc_norm\": 0.8717948717948718,\n\
\ \"acc_norm_stderr\": 0.021901905115073325\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.73,\n \"acc_stderr\": 0.044619604333847394,\n \
\ \"acc_norm\": 0.73,\n \"acc_norm_stderr\": 0.044619604333847394\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8109833971902938,\n\
\ \"acc_stderr\": 0.014000791294407003,\n \"acc_norm\": 0.8109833971902938,\n\
\ \"acc_norm_stderr\": 0.014000791294407003\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.7225433526011561,\n \"acc_stderr\": 0.024105712607754307,\n\
\ \"acc_norm\": 0.7225433526011561,\n \"acc_norm_stderr\": 0.024105712607754307\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.37318435754189944,\n\
\ \"acc_stderr\": 0.016175692013381954,\n \"acc_norm\": 0.37318435754189944,\n\
\ \"acc_norm_stderr\": 0.016175692013381954\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.738562091503268,\n \"acc_stderr\": 0.025160998214292456,\n\
\ \"acc_norm\": 0.738562091503268,\n \"acc_norm_stderr\": 0.025160998214292456\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7266881028938906,\n\
\ \"acc_stderr\": 0.02531176597542612,\n \"acc_norm\": 0.7266881028938906,\n\
\ \"acc_norm_stderr\": 0.02531176597542612\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.7253086419753086,\n \"acc_stderr\": 0.02483605786829467,\n\
\ \"acc_norm\": 0.7253086419753086,\n \"acc_norm_stderr\": 0.02483605786829467\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.48936170212765956,\n \"acc_stderr\": 0.029820747191422473,\n \
\ \"acc_norm\": 0.48936170212765956,\n \"acc_norm_stderr\": 0.029820747191422473\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4758800521512386,\n\
\ \"acc_stderr\": 0.01275536872286394,\n \"acc_norm\": 0.4758800521512386,\n\
\ \"acc_norm_stderr\": 0.01275536872286394\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.6764705882352942,\n \"acc_stderr\": 0.028418208619406755,\n\
\ \"acc_norm\": 0.6764705882352942,\n \"acc_norm_stderr\": 0.028418208619406755\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6781045751633987,\n \"acc_stderr\": 0.018901015322093092,\n \
\ \"acc_norm\": 0.6781045751633987,\n \"acc_norm_stderr\": 0.018901015322093092\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6818181818181818,\n\
\ \"acc_stderr\": 0.044612721759105085,\n \"acc_norm\": 0.6818181818181818,\n\
\ \"acc_norm_stderr\": 0.044612721759105085\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.7061224489795919,\n \"acc_stderr\": 0.02916273841024977,\n\
\ \"acc_norm\": 0.7061224489795919,\n \"acc_norm_stderr\": 0.02916273841024977\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.835820895522388,\n\
\ \"acc_stderr\": 0.026193923544454125,\n \"acc_norm\": 0.835820895522388,\n\
\ \"acc_norm_stderr\": 0.026193923544454125\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.86,\n \"acc_stderr\": 0.03487350880197771,\n \
\ \"acc_norm\": 0.86,\n \"acc_norm_stderr\": 0.03487350880197771\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5542168674698795,\n\
\ \"acc_stderr\": 0.03869543323472101,\n \"acc_norm\": 0.5542168674698795,\n\
\ \"acc_norm_stderr\": 0.03869543323472101\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8245614035087719,\n \"acc_stderr\": 0.029170885500727682,\n\
\ \"acc_norm\": 0.8245614035087719,\n \"acc_norm_stderr\": 0.029170885500727682\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.36474908200734396,\n\
\ \"mc1_stderr\": 0.01685096106172012,\n \"mc2\": 0.5350238552317648,\n\
\ \"mc2_stderr\": 0.015383683041808177\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.8018942383583267,\n \"acc_stderr\": 0.011201862744487048\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.6072782410917361,\n \
\ \"acc_stderr\": 0.013451745349586569\n }\n}\n```"
repo_url: https://huggingface.co/perlthoughts/Falkor-8x7B-MoE
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_12_16T21_48_58.361135
path:
- '**/details_harness|arc:challenge|25_2023-12-16T21-48-58.361135.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-12-16T21-48-58.361135.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_12_16T21_48_58.361135
path:
- '**/details_harness|gsm8k|5_2023-12-16T21-48-58.361135.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-12-16T21-48-58.361135.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_12_16T21_48_58.361135
path:
- '**/details_harness|hellaswag|10_2023-12-16T21-48-58.361135.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-12-16T21-48-58.361135.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_12_16T21_48_58.361135
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-16T21-48-58.361135.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-16T21-48-58.361135.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-16T21-48-58.361135.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-16T21-48-58.361135.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-16T21-48-58.361135.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-16T21-48-58.361135.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-16T21-48-58.361135.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-16T21-48-58.361135.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-16T21-48-58.361135.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-16T21-48-58.361135.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-16T21-48-58.361135.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-16T21-48-58.361135.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-16T21-48-58.361135.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-16T21-48-58.361135.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-16T21-48-58.361135.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-16T21-48-58.361135.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-16T21-48-58.361135.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-16T21-48-58.361135.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-16T21-48-58.361135.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-16T21-48-58.361135.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-16T21-48-58.361135.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-16T21-48-58.361135.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-16T21-48-58.361135.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-16T21-48-58.361135.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-16T21-48-58.361135.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-16T21-48-58.361135.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-16T21-48-58.361135.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-16T21-48-58.361135.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-16T21-48-58.361135.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-16T21-48-58.361135.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-16T21-48-58.361135.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-16T21-48-58.361135.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-16T21-48-58.361135.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-16T21-48-58.361135.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-12-16T21-48-58.361135.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-16T21-48-58.361135.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-16T21-48-58.361135.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-16T21-48-58.361135.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-12-16T21-48-58.361135.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-12-16T21-48-58.361135.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-16T21-48-58.361135.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-16T21-48-58.361135.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-16T21-48-58.361135.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-16T21-48-58.361135.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-16T21-48-58.361135.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-16T21-48-58.361135.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-16T21-48-58.361135.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-16T21-48-58.361135.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-16T21-48-58.361135.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-16T21-48-58.361135.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-16T21-48-58.361135.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-16T21-48-58.361135.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-16T21-48-58.361135.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-12-16T21-48-58.361135.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-16T21-48-58.361135.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-12-16T21-48-58.361135.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-16T21-48-58.361135.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-16T21-48-58.361135.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-16T21-48-58.361135.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-16T21-48-58.361135.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-16T21-48-58.361135.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-16T21-48-58.361135.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-16T21-48-58.361135.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-16T21-48-58.361135.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-16T21-48-58.361135.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-16T21-48-58.361135.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-16T21-48-58.361135.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-16T21-48-58.361135.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-16T21-48-58.361135.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-16T21-48-58.361135.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-16T21-48-58.361135.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-16T21-48-58.361135.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-16T21-48-58.361135.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-16T21-48-58.361135.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-16T21-48-58.361135.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-16T21-48-58.361135.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-16T21-48-58.361135.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-16T21-48-58.361135.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-16T21-48-58.361135.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-16T21-48-58.361135.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-16T21-48-58.361135.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-16T21-48-58.361135.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-16T21-48-58.361135.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-16T21-48-58.361135.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-16T21-48-58.361135.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-16T21-48-58.361135.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-16T21-48-58.361135.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-16T21-48-58.361135.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-16T21-48-58.361135.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-16T21-48-58.361135.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-16T21-48-58.361135.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-12-16T21-48-58.361135.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-16T21-48-58.361135.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-16T21-48-58.361135.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-16T21-48-58.361135.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-12-16T21-48-58.361135.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-12-16T21-48-58.361135.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-16T21-48-58.361135.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-16T21-48-58.361135.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-16T21-48-58.361135.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-16T21-48-58.361135.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-16T21-48-58.361135.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-16T21-48-58.361135.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-16T21-48-58.361135.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-16T21-48-58.361135.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-16T21-48-58.361135.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-16T21-48-58.361135.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-16T21-48-58.361135.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-16T21-48-58.361135.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-16T21-48-58.361135.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-12-16T21-48-58.361135.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-16T21-48-58.361135.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-12-16T21-48-58.361135.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-16T21-48-58.361135.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_12_16T21_48_58.361135
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-16T21-48-58.361135.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-16T21-48-58.361135.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_12_16T21_48_58.361135
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-16T21-48-58.361135.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-16T21-48-58.361135.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_12_16T21_48_58.361135
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-16T21-48-58.361135.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-16T21-48-58.361135.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_12_16T21_48_58.361135
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-16T21-48-58.361135.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-16T21-48-58.361135.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_12_16T21_48_58.361135
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-16T21-48-58.361135.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-16T21-48-58.361135.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_12_16T21_48_58.361135
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-16T21-48-58.361135.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-16T21-48-58.361135.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_12_16T21_48_58.361135
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-16T21-48-58.361135.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-16T21-48-58.361135.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_12_16T21_48_58.361135
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-16T21-48-58.361135.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-16T21-48-58.361135.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_12_16T21_48_58.361135
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-16T21-48-58.361135.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-16T21-48-58.361135.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_12_16T21_48_58.361135
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-16T21-48-58.361135.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-16T21-48-58.361135.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_12_16T21_48_58.361135
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-16T21-48-58.361135.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-16T21-48-58.361135.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_12_16T21_48_58.361135
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-16T21-48-58.361135.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-16T21-48-58.361135.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_12_16T21_48_58.361135
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-16T21-48-58.361135.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-16T21-48-58.361135.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_12_16T21_48_58.361135
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-16T21-48-58.361135.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-16T21-48-58.361135.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_12_16T21_48_58.361135
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-16T21-48-58.361135.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-16T21-48-58.361135.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_12_16T21_48_58.361135
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-16T21-48-58.361135.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-16T21-48-58.361135.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_12_16T21_48_58.361135
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-16T21-48-58.361135.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-16T21-48-58.361135.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_12_16T21_48_58.361135
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-16T21-48-58.361135.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-16T21-48-58.361135.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_12_16T21_48_58.361135
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-16T21-48-58.361135.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-16T21-48-58.361135.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_12_16T21_48_58.361135
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-16T21-48-58.361135.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-16T21-48-58.361135.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_12_16T21_48_58.361135
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-16T21-48-58.361135.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-16T21-48-58.361135.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_12_16T21_48_58.361135
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-16T21-48-58.361135.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-16T21-48-58.361135.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_12_16T21_48_58.361135
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-16T21-48-58.361135.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-16T21-48-58.361135.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_12_16T21_48_58.361135
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-16T21-48-58.361135.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-16T21-48-58.361135.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_12_16T21_48_58.361135
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-16T21-48-58.361135.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-16T21-48-58.361135.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_12_16T21_48_58.361135
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-16T21-48-58.361135.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-16T21-48-58.361135.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_12_16T21_48_58.361135
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-16T21-48-58.361135.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-16T21-48-58.361135.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_12_16T21_48_58.361135
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-16T21-48-58.361135.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-16T21-48-58.361135.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_12_16T21_48_58.361135
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-16T21-48-58.361135.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-16T21-48-58.361135.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_12_16T21_48_58.361135
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-16T21-48-58.361135.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-16T21-48-58.361135.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_12_16T21_48_58.361135
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-16T21-48-58.361135.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-16T21-48-58.361135.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_12_16T21_48_58.361135
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-16T21-48-58.361135.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-16T21-48-58.361135.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_12_16T21_48_58.361135
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-16T21-48-58.361135.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-16T21-48-58.361135.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_12_16T21_48_58.361135
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-16T21-48-58.361135.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-16T21-48-58.361135.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_12_16T21_48_58.361135
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-12-16T21-48-58.361135.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-12-16T21-48-58.361135.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_12_16T21_48_58.361135
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-16T21-48-58.361135.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-16T21-48-58.361135.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_12_16T21_48_58.361135
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-16T21-48-58.361135.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-16T21-48-58.361135.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_12_16T21_48_58.361135
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-16T21-48-58.361135.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-16T21-48-58.361135.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_12_16T21_48_58.361135
path:
- '**/details_harness|hendrycksTest-management|5_2023-12-16T21-48-58.361135.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-12-16T21-48-58.361135.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_12_16T21_48_58.361135
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-12-16T21-48-58.361135.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-12-16T21-48-58.361135.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_12_16T21_48_58.361135
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-16T21-48-58.361135.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-16T21-48-58.361135.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_12_16T21_48_58.361135
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-16T21-48-58.361135.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-16T21-48-58.361135.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_12_16T21_48_58.361135
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-16T21-48-58.361135.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-16T21-48-58.361135.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_12_16T21_48_58.361135
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-16T21-48-58.361135.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-16T21-48-58.361135.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_12_16T21_48_58.361135
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-16T21-48-58.361135.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-16T21-48-58.361135.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_12_16T21_48_58.361135
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-16T21-48-58.361135.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-16T21-48-58.361135.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_12_16T21_48_58.361135
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-16T21-48-58.361135.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-16T21-48-58.361135.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_12_16T21_48_58.361135
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-16T21-48-58.361135.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-16T21-48-58.361135.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_12_16T21_48_58.361135
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-16T21-48-58.361135.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-16T21-48-58.361135.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_12_16T21_48_58.361135
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-16T21-48-58.361135.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-16T21-48-58.361135.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_12_16T21_48_58.361135
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-16T21-48-58.361135.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-16T21-48-58.361135.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_12_16T21_48_58.361135
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-16T21-48-58.361135.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-16T21-48-58.361135.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_12_16T21_48_58.361135
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-16T21-48-58.361135.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-16T21-48-58.361135.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_12_16T21_48_58.361135
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-12-16T21-48-58.361135.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-12-16T21-48-58.361135.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_12_16T21_48_58.361135
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-16T21-48-58.361135.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-16T21-48-58.361135.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_12_16T21_48_58.361135
path:
- '**/details_harness|hendrycksTest-virology|5_2023-12-16T21-48-58.361135.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-12-16T21-48-58.361135.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_12_16T21_48_58.361135
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-16T21-48-58.361135.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-16T21-48-58.361135.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_12_16T21_48_58.361135
path:
- '**/details_harness|truthfulqa:mc|0_2023-12-16T21-48-58.361135.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-12-16T21-48-58.361135.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_12_16T21_48_58.361135
path:
- '**/details_harness|winogrande|5_2023-12-16T21-48-58.361135.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-12-16T21-48-58.361135.parquet'
- config_name: results
data_files:
- split: 2023_12_16T21_48_58.361135
path:
- results_2023-12-16T21-48-58.361135.parquet
- split: latest
path:
- results_2023-12-16T21-48-58.361135.parquet
---
# Dataset Card for Evaluation run of perlthoughts/Falkor-8x7B-MoE
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [perlthoughts/Falkor-8x7B-MoE](https://huggingface.co/perlthoughts/Falkor-8x7B-MoE) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_perlthoughts__Falkor-8x7B-MoE",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2023-12-16T21:48:58.361135](https://huggingface.co/datasets/open-llm-leaderboard/details_perlthoughts__Falkor-8x7B-MoE/blob/main/results_2023-12-16T21-48-58.361135.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6436792323952398,
"acc_stderr": 0.0322025695700015,
"acc_norm": 0.6451930910788192,
"acc_norm_stderr": 0.03285268737187084,
"mc1": 0.36474908200734396,
"mc1_stderr": 0.01685096106172012,
"mc2": 0.5350238552317648,
"mc2_stderr": 0.015383683041808177
},
"harness|arc:challenge|25": {
"acc": 0.6356655290102389,
"acc_stderr": 0.014063260279882419,
"acc_norm": 0.6629692832764505,
"acc_norm_stderr": 0.013813476652902276
},
"harness|hellaswag|10": {
"acc": 0.6665006970722963,
"acc_stderr": 0.004704996294145036,
"acc_norm": 0.8503286197968533,
"acc_norm_stderr": 0.0035601991854865575
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.36,
"acc_stderr": 0.04824181513244218,
"acc_norm": 0.36,
"acc_norm_stderr": 0.04824181513244218
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6370370370370371,
"acc_stderr": 0.04153948404742398,
"acc_norm": 0.6370370370370371,
"acc_norm_stderr": 0.04153948404742398
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6973684210526315,
"acc_stderr": 0.037385206761196686,
"acc_norm": 0.6973684210526315,
"acc_norm_stderr": 0.037385206761196686
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.56,
"acc_stderr": 0.04988876515698589,
"acc_norm": 0.56,
"acc_norm_stderr": 0.04988876515698589
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.7358490566037735,
"acc_stderr": 0.027134291628741702,
"acc_norm": 0.7358490566037735,
"acc_norm_stderr": 0.027134291628741702
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7847222222222222,
"acc_stderr": 0.03437079344106135,
"acc_norm": 0.7847222222222222,
"acc_norm_stderr": 0.03437079344106135
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.45,
"acc_stderr": 0.05,
"acc_norm": 0.45,
"acc_norm_stderr": 0.05
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.52,
"acc_stderr": 0.050211673156867795,
"acc_norm": 0.52,
"acc_norm_stderr": 0.050211673156867795
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6705202312138728,
"acc_stderr": 0.03583901754736412,
"acc_norm": 0.6705202312138728,
"acc_norm_stderr": 0.03583901754736412
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.37254901960784315,
"acc_stderr": 0.048108401480826346,
"acc_norm": 0.37254901960784315,
"acc_norm_stderr": 0.048108401480826346
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.75,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.75,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5446808510638298,
"acc_stderr": 0.03255525359340355,
"acc_norm": 0.5446808510638298,
"acc_norm_stderr": 0.03255525359340355
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.4649122807017544,
"acc_stderr": 0.046920083813689104,
"acc_norm": 0.4649122807017544,
"acc_norm_stderr": 0.046920083813689104
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5793103448275863,
"acc_stderr": 0.0411391498118926,
"acc_norm": 0.5793103448275863,
"acc_norm_stderr": 0.0411391498118926
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.3915343915343915,
"acc_stderr": 0.02513809138885111,
"acc_norm": 0.3915343915343915,
"acc_norm_stderr": 0.02513809138885111
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.42063492063492064,
"acc_stderr": 0.04415438226743744,
"acc_norm": 0.42063492063492064,
"acc_norm_stderr": 0.04415438226743744
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.36,
"acc_stderr": 0.04824181513244218,
"acc_norm": 0.36,
"acc_norm_stderr": 0.04824181513244218
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7870967741935484,
"acc_stderr": 0.02328766512726853,
"acc_norm": 0.7870967741935484,
"acc_norm_stderr": 0.02328766512726853
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.541871921182266,
"acc_stderr": 0.03505630140785741,
"acc_norm": 0.541871921182266,
"acc_norm_stderr": 0.03505630140785741
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.67,
"acc_stderr": 0.04725815626252607,
"acc_norm": 0.67,
"acc_norm_stderr": 0.04725815626252607
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7818181818181819,
"acc_stderr": 0.03225078108306289,
"acc_norm": 0.7818181818181819,
"acc_norm_stderr": 0.03225078108306289
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.8080808080808081,
"acc_stderr": 0.02805779167298902,
"acc_norm": 0.8080808080808081,
"acc_norm_stderr": 0.02805779167298902
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8963730569948186,
"acc_stderr": 0.02199531196364424,
"acc_norm": 0.8963730569948186,
"acc_norm_stderr": 0.02199531196364424
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6538461538461539,
"acc_stderr": 0.024121125416941197,
"acc_norm": 0.6538461538461539,
"acc_norm_stderr": 0.024121125416941197
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.3333333333333333,
"acc_stderr": 0.028742040903948485,
"acc_norm": 0.3333333333333333,
"acc_norm_stderr": 0.028742040903948485
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.7142857142857143,
"acc_stderr": 0.029344572500634353,
"acc_norm": 0.7142857142857143,
"acc_norm_stderr": 0.029344572500634353
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.33112582781456956,
"acc_stderr": 0.038425817186598696,
"acc_norm": 0.33112582781456956,
"acc_norm_stderr": 0.038425817186598696
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8311926605504587,
"acc_stderr": 0.01606005626853033,
"acc_norm": 0.8311926605504587,
"acc_norm_stderr": 0.01606005626853033
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5138888888888888,
"acc_stderr": 0.03408655867977749,
"acc_norm": 0.5138888888888888,
"acc_norm_stderr": 0.03408655867977749
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7794117647058824,
"acc_stderr": 0.02910225438967408,
"acc_norm": 0.7794117647058824,
"acc_norm_stderr": 0.02910225438967408
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.810126582278481,
"acc_stderr": 0.025530100460233494,
"acc_norm": 0.810126582278481,
"acc_norm_stderr": 0.025530100460233494
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.695067264573991,
"acc_stderr": 0.030898610882477515,
"acc_norm": 0.695067264573991,
"acc_norm_stderr": 0.030898610882477515
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7557251908396947,
"acc_stderr": 0.03768335959728744,
"acc_norm": 0.7557251908396947,
"acc_norm_stderr": 0.03768335959728744
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7768595041322314,
"acc_stderr": 0.03800754475228732,
"acc_norm": 0.7768595041322314,
"acc_norm_stderr": 0.03800754475228732
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7685185185185185,
"acc_stderr": 0.04077494709252627,
"acc_norm": 0.7685185185185185,
"acc_norm_stderr": 0.04077494709252627
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.754601226993865,
"acc_stderr": 0.03380939813943354,
"acc_norm": 0.754601226993865,
"acc_norm_stderr": 0.03380939813943354
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.4732142857142857,
"acc_stderr": 0.047389751192741546,
"acc_norm": 0.4732142857142857,
"acc_norm_stderr": 0.047389751192741546
},
"harness|hendrycksTest-management|5": {
"acc": 0.7961165048543689,
"acc_stderr": 0.0398913985953177,
"acc_norm": 0.7961165048543689,
"acc_norm_stderr": 0.0398913985953177
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8717948717948718,
"acc_stderr": 0.021901905115073325,
"acc_norm": 0.8717948717948718,
"acc_norm_stderr": 0.021901905115073325
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.73,
"acc_stderr": 0.044619604333847394,
"acc_norm": 0.73,
"acc_norm_stderr": 0.044619604333847394
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8109833971902938,
"acc_stderr": 0.014000791294407003,
"acc_norm": 0.8109833971902938,
"acc_norm_stderr": 0.014000791294407003
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7225433526011561,
"acc_stderr": 0.024105712607754307,
"acc_norm": 0.7225433526011561,
"acc_norm_stderr": 0.024105712607754307
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.37318435754189944,
"acc_stderr": 0.016175692013381954,
"acc_norm": 0.37318435754189944,
"acc_norm_stderr": 0.016175692013381954
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.738562091503268,
"acc_stderr": 0.025160998214292456,
"acc_norm": 0.738562091503268,
"acc_norm_stderr": 0.025160998214292456
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7266881028938906,
"acc_stderr": 0.02531176597542612,
"acc_norm": 0.7266881028938906,
"acc_norm_stderr": 0.02531176597542612
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7253086419753086,
"acc_stderr": 0.02483605786829467,
"acc_norm": 0.7253086419753086,
"acc_norm_stderr": 0.02483605786829467
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.48936170212765956,
"acc_stderr": 0.029820747191422473,
"acc_norm": 0.48936170212765956,
"acc_norm_stderr": 0.029820747191422473
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4758800521512386,
"acc_stderr": 0.01275536872286394,
"acc_norm": 0.4758800521512386,
"acc_norm_stderr": 0.01275536872286394
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6764705882352942,
"acc_stderr": 0.028418208619406755,
"acc_norm": 0.6764705882352942,
"acc_norm_stderr": 0.028418208619406755
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6781045751633987,
"acc_stderr": 0.018901015322093092,
"acc_norm": 0.6781045751633987,
"acc_norm_stderr": 0.018901015322093092
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6818181818181818,
"acc_stderr": 0.044612721759105085,
"acc_norm": 0.6818181818181818,
"acc_norm_stderr": 0.044612721759105085
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7061224489795919,
"acc_stderr": 0.02916273841024977,
"acc_norm": 0.7061224489795919,
"acc_norm_stderr": 0.02916273841024977
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.835820895522388,
"acc_stderr": 0.026193923544454125,
"acc_norm": 0.835820895522388,
"acc_norm_stderr": 0.026193923544454125
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.86,
"acc_stderr": 0.03487350880197771,
"acc_norm": 0.86,
"acc_norm_stderr": 0.03487350880197771
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5542168674698795,
"acc_stderr": 0.03869543323472101,
"acc_norm": 0.5542168674698795,
"acc_norm_stderr": 0.03869543323472101
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8245614035087719,
"acc_stderr": 0.029170885500727682,
"acc_norm": 0.8245614035087719,
"acc_norm_stderr": 0.029170885500727682
},
"harness|truthfulqa:mc|0": {
"mc1": 0.36474908200734396,
"mc1_stderr": 0.01685096106172012,
"mc2": 0.5350238552317648,
"mc2_stderr": 0.015383683041808177
},
"harness|winogrande|5": {
"acc": 0.8018942383583267,
"acc_stderr": 0.011201862744487048
},
"harness|gsm8k|5": {
"acc": 0.6072782410917361,
"acc_stderr": 0.013451745349586569
}
}
```
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
joytafty/icdar2023vqabd-small-tables-val | ---
dataset_info:
features:
- name: image
dtype: image
splits:
- name: validation
num_bytes: 305631.0
num_examples: 19
download_size: 274240
dataset_size: 305631.0
---
# Dataset Card for "icdar2023vqabd-small-tables-val"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
jxm/robust04__gtr_base__dpr | ---
dataset_info:
features:
- name: text
dtype: string
- name: embeddings_A
sequence: float32
- name: embeddings_B
sequence: float32
splits:
- name: train
num_bytes: 922326096
num_examples: 100000
download_size: 906189329
dataset_size: 922326096
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
lj_speech | ---
annotations_creators:
- expert-generated
language_creators:
- found
language:
- en
license:
- unlicense
multilinguality:
- monolingual
paperswithcode_id: ljspeech
pretty_name: LJ Speech
size_categories:
- 10K<n<100K
source_datasets:
- original
task_categories:
- automatic-speech-recognition
- text-to-speech
- text-to-audio
task_ids: []
train-eval-index:
- config: main
task: automatic-speech-recognition
task_id: speech_recognition
splits:
train_split: train
col_mapping:
file: path
text: text
metrics:
- type: wer
name: WER
- type: cer
name: CER
dataset_info:
features:
- name: id
dtype: string
- name: audio
dtype:
audio:
sampling_rate: 22050
- name: file
dtype: string
- name: text
dtype: string
- name: normalized_text
dtype: string
config_name: main
splits:
- name: train
num_bytes: 4667022
num_examples: 13100
download_size: 2748572632
dataset_size: 4667022
---
# Dataset Card for lj_speech
## Table of Contents
- [Dataset Description](#dataset-description)
- [Dataset Summary](#dataset-summary)
- [Supported Tasks and Leaderboards](#supported-tasks-and-leaderboards)
- [Languages](#languages)
- [Dataset Structure](#dataset-structure)
- [Data Instances](#data-instances)
- [Data Fields](#data-fields)
- [Data Splits](#data-splits)
- [Dataset Creation](#dataset-creation)
- [Curation Rationale](#curation-rationale)
- [Source Data](#source-data)
- [Annotations](#annotations)
- [Personal and Sensitive Information](#personal-and-sensitive-information)
- [Considerations for Using the Data](#considerations-for-using-the-data)
- [Social Impact of Dataset](#social-impact-of-dataset)
- [Discussion of Biases](#discussion-of-biases)
- [Other Known Limitations](#other-known-limitations)
- [Additional Information](#additional-information)
- [Dataset Curators](#dataset-curators)
- [Licensing Information](#licensing-information)
- [Citation Information](#citation-information)
- [Contributions](#contributions)
## Dataset Description
- **Homepage:** [The LJ Speech Dataset](https://keithito.com/LJ-Speech-Dataset/)
- **Repository:** [N/A]
- **Paper:** [N/A]
- **Leaderboard:** [Paperswithcode Leaderboard](https://paperswithcode.com/sota/text-to-speech-synthesis-on-ljspeech)
- **Point of Contact:** [Keith Ito](mailto:kito@kito.us)
### Dataset Summary
This is a public domain speech dataset consisting of 13,100 short audio clips of a single speaker reading passages from 7 non-fiction books in English. A transcription is provided for each clip. Clips vary in length from 1 to 10 seconds and have a total length of approximately 24 hours.
The texts were published between 1884 and 1964, and are in the public domain. The audio was recorded in 2016-17 by the LibriVox project and is also in the public domain.
### Supported Tasks and Leaderboards
The dataset can be used to train a model for Automatic Speech Recognition (ASR) or Text-to-Speech (TTS).
- `automatic-speech-recognition`: An ASR model is presented with an audio file and asked to transcribe the audio file to written text.
The most common ASR evaluation metric is the word error rate (WER).
- `text-to-speech`, `text-to-audio`: A TTS model is given a written text in natural language and asked to generate a speech audio file.
A reasonable evaluation metric is the mean opinion score (MOS) of audio quality.
The dataset has an active leaderboard which can be found at https://paperswithcode.com/sota/text-to-speech-synthesis-on-ljspeech
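As a minimal sketch (not part of the original dataset card), the WER metric mentioned above can be computed with the Hugging Face `evaluate` library:
```python
# Minimal WER sketch using the Hugging Face `evaluate` library (assumed installed).
import evaluate

wer_metric = evaluate.load("wer")

references = ["in the three years between 1813 and 1816"]
predictions = ["in the three years between 1813 and 1815"]

# WER = (substitutions + insertions + deletions) / number of reference words
wer = wer_metric.compute(predictions=predictions, references=references)
print(f"WER: {wer:.3f}")  # one substituted word out of eight -> 0.125
```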
### Languages
The transcriptions and audio are in English.
## Dataset Structure
### Data Instances
A data point comprises the path to the audio file, called `file` and its transcription, called `text`.
A normalized version of the text is also provided.
```
{
'id': 'LJ002-0026',
'file': '/datasets/downloads/extracted/05bfe561f096e4c52667e3639af495226afe4e5d08763f2d76d069e7a453c543/LJSpeech-1.1/wavs/LJ002-0026.wav',
'audio': {'path': '/datasets/downloads/extracted/05bfe561f096e4c52667e3639af495226afe4e5d08763f2d76d069e7a453c543/LJSpeech-1.1/wavs/LJ002-0026.wav',
'array': array([-0.00048828, -0.00018311, -0.00137329, ..., 0.00079346,
0.00091553, 0.00085449], dtype=float32),
'sampling_rate': 22050},
'text': 'in the three years between 1813 and 1816,',
'normalized_text': 'in the three years between eighteen thirteen and eighteen sixteen,',
}
```
Each audio file is a single-channel 16-bit PCM WAV with a sample rate of 22050 Hz.
### Data Fields
- id: unique id of the data sample.
- file: a path to the downloaded audio file in .wav format.
- audio: A dictionary containing the path to the downloaded audio file, the decoded audio array, and the sampling rate. Note that when accessing the audio column: `dataset[0]["audio"]` the audio file is automatically decoded and resampled to `dataset.features["audio"].sampling_rate`. Decoding and resampling of a large number of audio files might take a significant amount of time. Thus it is important to first query the sample index before the `"audio"` column, *i.e.* `dataset[0]["audio"]` should **always** be preferred over `dataset["audio"][0]`.
- text: the transcription of the audio file.
- normalized_text: the transcription with numbers, ordinals, and monetary units expanded into full words.
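As an illustrative sketch (assuming the `datasets` library with audio support is installed), the decoding behaviour of the `audio` field described above looks like this in practice:
```python
from datasets import load_dataset

# The dataset ships a single "train" split (see "Data Splits" below).
ds = load_dataset("lj_speech", split="train")

sample = ds[0]           # query the sample index first ...
audio = sample["audio"]  # ... then access "audio": the file is decoded here
print(sample["id"], sample["normalized_text"])
print(audio["sampling_rate"])   # 22050
print(audio["array"].shape)     # 1-D waveform as a float32 numpy array
```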
### Data Splits
The dataset is not pre-split. Some statistics:
- Total Clips: 13,100
- Total Words: 225,715
- Total Characters: 1,308,678
- Total Duration: 23:55:17
- Mean Clip Duration: 6.57 sec
- Min Clip Duration: 1.11 sec
- Max Clip Duration: 10.10 sec
- Mean Words per Clip: 17.23
- Distinct Words: 13,821
## Dataset Creation
### Curation Rationale
[Needs More Information]
### Source Data
#### Initial Data Collection and Normalization
This dataset consists of excerpts from the following works:
- Morris, William, et al. Arts and Crafts Essays. 1893.
- Griffiths, Arthur. The Chronicles of Newgate, Vol. 2. 1884.
- Roosevelt, Franklin D. The Fireside Chats of Franklin Delano Roosevelt. 1933-42.
- Harland, Marion. Marion Harland's Cookery for Beginners. 1893.
- Rolt-Wheeler, Francis. The Science - History of the Universe, Vol. 5: Biology. 1910.
- Banks, Edgar J. The Seven Wonders of the Ancient World. 1916.
- President's Commission on the Assassination of President Kennedy. Report of the President's Commission on the Assassination of President Kennedy. 1964.
Some details about normalization:
- The normalized transcription has the numbers, ordinals, and monetary units expanded into full words (UTF-8)
- 19 of the transcriptions contain non-ASCII characters (for example, LJ016-0257 contains "raison d'être").
- The following abbreviations appear in the text. They may be expanded as follows:
| Abbreviation | Expansion |
|--------------|-----------|
| Mr. | Mister |
| Mrs. | Misess (*) |
| Dr. | Doctor |
| No. | Number |
| St. | Saint |
| Co. | Company |
| Jr. | Junior |
| Maj. | Major |
| Gen. | General |
| Drs. | Doctors |
| Rev. | Reverend |
| Lt. | Lieutenant |
| Hon. | Honorable |
| Sgt. | Sergeant |
| Capt. | Captain |
| Esq. | Esquire |
| Ltd. | Limited |
| Col. | Colonel |
| Ft. | Fort |
(*) there's no standard expansion for "Mrs."
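As a small illustrative sketch (not part of the official normalization tooling), the abbreviation table above can be applied with a simple dictionary-based substitution:
```python
import re

# Expansions taken from the table above; "Mrs." has no standard expansion.
ABBREVIATIONS = {
    "Mr.": "Mister", "Mrs.": "Misess", "Dr.": "Doctor", "No.": "Number",
    "St.": "Saint", "Co.": "Company", "Jr.": "Junior", "Maj.": "Major",
    "Gen.": "General", "Drs.": "Doctors", "Rev.": "Reverend", "Lt.": "Lieutenant",
    "Hon.": "Honorable", "Sgt.": "Sergeant", "Capt.": "Captain", "Esq.": "Esquire",
    "Ltd.": "Limited", "Col.": "Colonel", "Ft.": "Fort",
}

_PATTERN = re.compile("|".join(re.escape(abbr) for abbr in ABBREVIATIONS))

def expand_abbreviations(text: str) -> str:
    """Replace each abbreviation with its full-word expansion."""
    return _PATTERN.sub(lambda m: ABBREVIATIONS[m.group(0)], text)

print(expand_abbreviations("Capt. Smith met Col. Jones at Ft. Worth."))
# -> "Captain Smith met Colonel Jones at Fort Worth."
```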
#### Who are the source language producers?
[Needs More Information]
### Annotations
#### Annotation process
- The audio clips range in length from approximately 1 second to 10 seconds. They were segmented automatically based on silences in the recording. Clip boundaries generally align with sentence or clause boundaries, but not always.
- The text was matched to the audio manually, and a QA pass was done to ensure that the text accurately matched the words spoken in the audio.
#### Who are the annotators?
Recordings by Linda Johnson from LibriVox. Alignment and annotation by Keith Ito.
### Personal and Sensitive Information
The dataset consists of people who have donated their voice online. You agree to not attempt to determine the identity of speakers in this dataset.
## Considerations for Using the Data
### Social Impact of Dataset
[Needs More Information]
### Discussion of Biases
[Needs More Information]
### Other Known Limitations
- The original LibriVox recordings were distributed as 128 kbps MP3 files. As a result, they may contain artifacts introduced by the MP3 encoding.
## Additional Information
### Dataset Curators
The dataset was initially created by Keith Ito and Linda Johnson.
### Licensing Information
Public Domain ([LibriVox](https://librivox.org/pages/public-domain/))
### Citation Information
```
@misc{ljspeech17,
author = {Keith Ito and Linda Johnson},
title = {The LJ Speech Dataset},
howpublished = {\url{https://keithito.com/LJ-Speech-Dataset/}},
year = 2017
}
```
### Contributions
Thanks to [@anton-l](https://github.com/anton-l) for adding this dataset. |
Rimyy/problemMath-llama5K | ---
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 3972845
num_examples: 5000
download_size: 1734972
dataset_size: 3972845
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
Falah/fairy_girl_prompts_SDXL | ---
dataset_info:
features:
- name: prompts
dtype: string
splits:
- name: train
num_bytes: 471189912
num_examples: 1000000
download_size: 71160287
dataset_size: 471189912
---
# Dataset Card for "fairy_girl_prompts_SDXL"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
yao123/test | ---
license: apache-2.0
---
|
xDAN-datasets/ChatDoctor_chatdoctor_7k | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
dataset_info:
features:
- name: conversations_chatgpt
list:
- name: from
dtype: string
- name: value
dtype: string
- name: input
dtype: string
- name: output
dtype: string
splits:
- name: train
num_bytes: 14604774
num_examples: 7321
download_size: 8420745
dataset_size: 14604774
---
# Dataset Card for "ChatDoctor_chatdoctor_7k"
**Dataset name:**
*lavita/ChatDoctor-iCliniq*
**Original dataset source:**
*https://huggingface.co/datasets/lavita/ChatDoctor-iCliniq*
**Dataset size:**
*7.32k*
**Data generation:**
*Generated by an LLM*
**Data domain:**
*Doctor-patient dialogue*
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
AdapterOcean/python3-standardized_cluster_10_alpaca | ---
dataset_info:
features:
- name: input
dtype: string
- name: output
dtype: string
splits:
- name: train
num_bytes: 10237854
num_examples: 4115
download_size: 0
dataset_size: 10237854
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "python3-standardized_cluster_10_alpaca"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
gagan3012/arabic-ans-stance-pairwise | ---
dataset_info:
features:
- name: labels
sequence: int64
- name: sent1
sequence: string
- name: sent2
sequence: string
splits:
- name: train
num_bytes: 511126
num_examples: 1
- name: validation
num_bytes: 147950
num_examples: 1
- name: test
num_bytes: 73556
num_examples: 1
download_size: 296560
dataset_size: 732632
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
- split: test
path: data/test-*
---
|
SujinHwang/criminal-sketch-Hr | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
dataset_info:
features:
- name: image
dtype: image
- name: text
dtype: string
splits:
- name: train
num_bytes: 182934925.572
num_examples: 8071
download_size: 166876827
dataset_size: 182934925.572
---
# Dataset Card for "criminal-sketch-Hr"
This data was created by processing the original dataset available [here](https://www.aihub.or.kr/aihubdata/data/view.do?currMenu=115&topMenu=100&dataSetSn=618).
The original dataset contains text descriptions of the virtual personas and corresponding composite sketch images.
## Disclaimer
It's important to note that this dataset may not be ideal for training general-purpose text-to-image models. \
The fixed format of the text descriptions ("gender, age, face shape, hairstyle, eye features, nose features, ...") could lead to overfitting. \
In simpler terms, the model might become limited to generating images based on this specific format, but struggle with more diverse or natural language descriptions.
While acknowledging this limitation, I chose to proceed with this dataset due to the challenges associated with using natural language descriptions. \
Highly nuanced and diverse text data can be difficult for a model to learn from effectively, especially in a time-constrained training environment. \
To mitigate this, I employed a strategy of selecting specific bullet points from the original text file and merging them into grammatically correct sentences. \
This approach aimed to capture the essence of the natural descriptions while providing a more structured format for the model to learn from.
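Purely as a hypothetical sketch of that merging strategy (the attribute keys and sentence template below are illustrative, not the actual preprocessing code):
```python
# Hypothetical sketch of merging fixed-format bullet points into one sentence.
# The keys and the template are illustrative only, not the real pipeline.
def merge_bullets(attrs: dict) -> str:
    parts = [
        f"{attrs['gender']} in their {attrs['age']}",
        f"with a {attrs['face_shape']} face",
        f"{attrs['hairstyle']} hair",
        f"{attrs['eyes']} eyes",
        f"and a {attrs['nose']} nose",
    ]
    return "A " + ", ".join(parts) + "."


example = {
    "gender": "man", "age": "30s", "face_shape": "round",
    "hairstyle": "short black", "eyes": "narrow", "nose": "broad",
}
print(merge_bullets(example))
# -> "A man in their 30s, with a round face, short black hair, narrow eyes, and a broad nose."
```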
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
dearprakash/tamil_novels | ---
language:
- ta
license: cc-by-4.0
---
This dataset is an attempt to collect works that are in the public domain or shared under a Creative Commons license.
The novels are in the Tamil language and are provided in plain text format.
| Name of work | Source |
| - | - |
| கடலுக்கு அப்பால்-ப.சிங்காரம் | [அழிசி](https://www.azhisi.in/) |
| புயலிலே ஒரு தோணி - ப.சிங்காரம் | [அழிசி](https://www.azhisi.in/) |
| சத்திய சோதனை-மகாத்மா காந்தி | [அழிசி](https://www.azhisi.in/) |
| நவகாளி யாத்திரை-சாவி | [அழிசி](https://www.azhisi.in/) |
Works from the following authors will be added soon
[List of scholars whose works have been nationalized (made public domain), up to 2022](https://tamil.wiki/wiki/%E0%AE%A8%E0%AF%82%E0%AE%B2%E0%AF%8D%E0%AE%95%E0%AE%B3%E0%AF%8D_%E0%AE%A8%E0%AE%BE%E0%AE%9F%E0%AF%8D%E0%AE%9F%E0%AF%81%E0%AE%9F%E0%AF%88%E0%AE%AE%E0%AF%88)
| No. | Scholar | Year works were nationalized |
|-|-|-|
| 1 | பாரதியார் | 1967-க்கு முன் |
| 2 | சிலம்புச் செல்வர் ம.பொ.சி | 1984, 2006 |
| 3| பாவேந்தர் பாரதிதாசன் |1990|
| 4| சி.என்.அண்ணாத்துரை |1995|
| 5| பட்டுக்கோட்டை கல்யாணசுந்தரம் |1995|
| 6| தேவநேயப் பாவாணர் |1996|
| 7| மறைமலையடிகள் |1997|
| 8| திரு வி. கல்யாணசுந்தர முதலியார் |1998|
| 9| கல்கி கிருஷ்ணமூர்த்தி |1998|
| 10| கவிமணி தேசிக விநாயகம் பிள்ளை |1998|
| 11| ப. ஜீவானந்தம் |1998|
| 12| நாமக்கல் கவிஞர் வெ. இராமலிங்கம் பிள்ளை |1998|
| 13| வ.உ. சிதம்பரம் பிள்ளை |1998|
| 14| சுத்தானந்த பாரதியார் |1998|
| 15| ஏ.எஸ்.கே. ஐயங்கார் |1998|
| 16| வ. ராமசாமி ஐயங்கார் |1998|
| 17| நாவலர் சோமசுந்தர பாரதியார் |1998|
| 18| கவி கா.மு. ஷெரீப் |1998|
| 19| பரலி சு. நெல்லையப்பர் |1998|
| 20| வ.வே.சு. ஐயர் |1998|
| 21| சா. கணேசன் |1998|
| 22| ச.து.சு. யோகி |1998|
| 23| வெ. சாமிநாத சர்மா |2000|
| 24| கவிஞர் முடியரசன் |2000|
| 25| மயிலை சீனி வேங்கடசாமி |2000|
| 26| சாமி சிதம்பரனார் |2000|
| 27| கா. அப்பாத்துரை |2001|
| 28| புதுமைப்பித்தன் |2002|
| 29| கு.ப.சேது அம்மாள் |2002|
| 30| நாவலர் பண்டித ந.மு.வேங்கடசாமிநாட்டார் |2004|
| 31| க. நா. சுப்பிரமணியம் |2004|
| 32| ந. பிச்சமூர்த்தி |2004|
| 33| புலவர் குழந்தை |2006|
| 34| பரிதிமாற் கலைஞர் வி.கோ. சூரியநாராயண சாஸ்திரியார் |2006|
| 35| கா.சுப்பிரமணியப் பிள்ளை |2007|
| 36| புலவர் குலாம் காதிறு நாவலர் |2007|
| 37| தி.வை.சதாசிவப் பண்டாரத்தார் |2007|
| 38| டாக்டர். சி. இலக்குவனார் |2007|
| 39| எம். தண்டபாணி தேசிகர் |2007|
| 40| தி.ஜ. ரங்கநாதன் (தி.ஜ.ர) |2007|
| 41| நாரண துரைக்கண்ணன் |2007|
| 42| டாக்டர் மா. இராசமாணிக்கனார் |2007|
| 43| டாக்டர் வ.சு.ப. மாணிக்கம் |2007|
| 44| புலவர் கா. கோவிந்தன் |2007|
| 45| சக்தி வை. கோவிந்தன் |2007|
| 46| தெ.பொ. மீனாட்சி சுந்தரனார் |2007|
| 47| த.நா. குமாரசாமி |2007|
| 48| மாயூரம் வேதநாயகம் பிள்ளை |2007|
| 49| ம. சிங்காரவேலர் |2007|
| 50| குன்றக்குடி அடிகளார் |2007|
| 51| கி.ஆ.பெ. விசுவநாதம் |2007|
| 52| கி.வா. ஜகன்னாதன் |2007|
| 53| சு. துரைசாமி பிள்ளை |2007|
| 54| அ.ச. ஞானசம்பந்தனார் |2007|
| 55| திருக்குறளார் முனுசாமி |2007|
| 56| உவமைக்கவிஞர் சுரதா |2007|
| 57| சாவி |2007|
| 58| மாவெண்கோ என்ற வ.கோ. சண்முகம் |2007|
| 59| தீபம் நா. பார்த்தசாரதி |2007|
| 60| எஸ்.எஸ். தென்னரசு |2007|
| 61| சி.பி. சிற்றரசு |2007|
| 62| ஏ.வி.பி. ஆசைத்தம்பி |2007|
| 63| டி.கே. சீனிவாசன் |2007|
| 64| இராம. அரங்கண்ணல் |2007|
| 65| கவிஞர் வாணிதாசன் |2007|
| 66| கவிஞர் கருணானந்தம் |2007|
| 67| மருதகாசி |2007|
| 68| ஜலகண்டபுரம் ப. கண்ணன் |2007|
| 69| கவிஞர் பெரியசாமித்தூரன் |2008|
| 70| பேராசிரியர் க. வெள்ளைவாரணனார் |2008|
| 71| பண்டித க. அயோத்திதாசர் |2008|
| 72| ஆபிரகாம் பண்டிதர் |2008|
| 73| சதாவதானி செய்குத்தம்பிப் பாவலர் |2008|
| 74| டாக்டர் ரா.பி. சேதுப்பிள்ளை |2008|
| 75| ரா. ராகவையங்கார் |2008|
| 76| உடுமலை நாராயண கவி |2008|
| 77| கு.மு. அண்ணல்தங்கோ |2008|
| 78| அவ்வை தி.க. சண்முகம் |2008|
| 79| விந்தன் |2008|
| 80| லா.ச.ராமாமிர்தம் |2008|
| 81| வல்லிக்கண்ணன் |2008|
| 82| நா. வானமாமலை |2008|
| 83| கவிஞர் புதுவைச் சிவம் |2008|
| 84| அ. இராகவன் |2008|
| 85| தொ.மு.சி. ரகுநாதன் |2008|
| 86| சக்திதாசன் சுப்பிரமணியன் |2008|
| 87| டாக்டர் ந. சஞ்சீவி |2008|
| 88| முல்லை முத்தையா |2008|
| 89| கவிஞர் எஸ்.டி. சுந்தரம் |2008|
| 90| கவிஞர் மீரா |2008|
| 91| ஆ. கார்மேகக் கோனார் |2008|
| 92| புலவர் முகமது நயினார் மரைக்காயர் |2008|
| 93| சு. சமுத்திரம் |2008|
| 94| கோவை இளஞ்சேரன் |2008|
| 95| பேராசிரியர் ந. சுப்புரெட்டியார் |2008|
| 96| பாவலரேறு பெருஞ்சித்திரனார் |2008|
| 97| அழ. வள்ளியப்பா |2009|
| 98| பண்டிதமணி மு. கதிரேசன் செட்டியார் |2009|
| 99| பம்மல் சம்பந்த முதலியார் |2009|
| 100| அ. சிதம்பரநாதன் செட்டியார் |2009|
| 101| மு.சி. பூர்ணலிங்கம் பிள்ளை |2009|
| 102| தொ.மு. பாஸ்கரத் தொண்டைமான் |2009|
| 103| பாலூர் கண்ணப்ப முதலியார் |2009|
| 104| ச. அகத்தியலிங்கம் |2009|
| 105| பாவலர் நாரா. நாச்சியப்பன் |2009|
| 106| புலியூர்க் கேசிகன் |2009|
| 107| வை.மு. கோதைநாயகி |2009|
| 108| சின்ன அண்ணாமலை |2009|
| 109| என்.வி. கலைமணி |2009|
| 110| கவிஞர் முருகு சுந்தரம் |2009|
| 111| புலவர் த. கோவேந்தன் |2009|
| 112| அ.க. நவநீதகிருட்டிணன் |2009|
| 113| வடுவூர் கே. துரைசாமி ஐயங்கார் |2009|
| 114| பேரா.மு. ராகவையங்கார் |2009|
| 115| பூவை.எஸ். ஆறுமுகம் |2009|
| 116| பேரா. வையாபுரிப்பிள்ளை |2009|
| 117| ராய சொக்கலிங்கன் |2009|
| 118| ராஜம் கிருஷ்ணன் |2009|
| 119| மணவை முஸ்தபா |2010|
| 120| பேரா. அ.மு. பரமசிவானந்தம் |2010|
| 121| பேரா. அ. கிருஷ்ணமூர்த்தி |2010|
| 122| பேரா. எஸ். எம். கமால் |2010|
| 123| ப. ராமசாமி |2010|
| 124| பேரா. ரா. சீனிவாசன் |2010|
| 125| வ.சு. செங்கல்வராய பிள்ளை |2010|
| 126| கவிஞர் வெள்ளியங்காட்டான் |2010|
| 127| நெ.து. சுந்தரவடிவேலு |2010|
| 128| டாக்டர் சி. பாலசுப்பிரமணியன் |2010|
| 129| மயிலை சிவமுத்து |2010|
| 130| காழி சிவகண்ணுசாமி பிள்ளை |2010|
| 131| கே.பி.நீலமணி |2010|
| 132| கவிராஜ பண்டிதர் ஜெகவீர பாண்டியன் |2010|
| 133| அ. திருமலை முத்துசாமி |2010|
| 134| எஸ். நவராஜ் செல்லையா |2010|
| 135| பொ. திரிகூட சுந்தரம் பிள்ளை |2010|
| 136| பேரா. சுந்தர சண்முகனார் |2010|
| 137| தஞ்சை ராமையாதாஸ் |2010|
| 138| கவிஞர் தாராபாரதி |2010|
| 139| சரோஜா ராமமூர்த்தி |2010|
| 140| அ. சீனிவாசன் |2010|
| 141| ரசிகமணி டி.கே. சிதம்பரநாத முதலியார் |2010|
| 142| ஜே.ஆர். ரங்கராஜு |2010|
| 143| ஏ.கே. வேலன் |2010|
| 144| பேரா. கு. சீனிவாசன் |2010|
| 145| கு.சா. கிருஷ்ணமூர்த்தி |2011|
| 146| கா.ம. வேங்கடராமையா |2011|
| 147| முனைவர் மு.தமிழ்க்குடிமகன் |2018|
| 148| மேலாண்மை பொன்னுச்சாமி |2018|
| 149| முனைவர் பொன்.சவுரிராசன் |2018|
| 150| உளுந்தூர்பேட்டை சண்முகம் |2019|
| 151| கவிஞர் நா. காமராசன் |2019|
| 152| முனைவர் இரா.இளவரசு |2019|
| 153| அடிகளாசிரியர் |2019|
| 154| புலவர் இறைக்குருவனார் |2019|
| 155| பண்டித ம. கோபாலகிருட்டிணன் |2019|
| 156| பாபநாசம் குறள்பித்தன் |2019|
| 157| சிலம்பொலி சு. செல்லப்பன் |2021|
| 158| முனைவர் தொ.பரமசிவன் |2021|
| 159| இரா. இளங்குமரனார் |2021|
| 160| முருகேச பாகவதர் |2021|
| 161| சங்கரவள்ளி நாயகம் |2021|
| 162| புலவர் செ. இராசு |2021|
| 163| பேராசிரியர் க. அன்பழகன் |2021|
| 164| முனைவர் நாவலர் இரா. நெடுஞ்செழியன் |2021|
| 165| நெல்லை கண்ணன் |2022|
| 166| கந்தர்வன் |2022|
| 167| சோமலெ |2022|
| 168| தஞ்சை பிரகாஷ் |2022|
| 169| செ. திவான் |2022|
| 170| நா. மம்மது |2022|
| 171| விடுதலை ராசேந்திரன் |2022|
| 172| முனைவர் த. ராசையா |2022|
|
CyberHarem/micaiah_fireemblem | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of micaiah (Fire Emblem)
This is the dataset of micaiah (Fire Emblem), containing 500 images and their tags.
The core tags of this character are `long_hair, yellow_eyes, bangs, grey_hair, ribbon, hair_ribbon, half_updo, breasts, white_hair, medium_breasts`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:--------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 500 | 769.48 MiB | [Download](https://huggingface.co/datasets/CyberHarem/micaiah_fireemblem/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 500 | 397.27 MiB | [Download](https://huggingface.co/datasets/CyberHarem/micaiah_fireemblem/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 1237 | 854.64 MiB | [Download](https://huggingface.co/datasets/CyberHarem/micaiah_fireemblem/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 500 | 661.63 MiB | [Download](https://huggingface.co/datasets/CyberHarem/micaiah_fireemblem/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 1237 | 1.22 GiB | [Download](https://huggingface.co/datasets/CyberHarem/micaiah_fireemblem/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need this, just run the following code
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/micaiah_fireemblem',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering results; maybe some outfits can be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:----------------------------------|:----------------------------------|:----------------------------------|:----------------------------------|:----------------------------------|:---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 30 |  |  |  |  |  | official_alternate_costume, red_bikini, 1girl, solo, bare_shoulders, cleavage, hair_flower, navel, bikini_skirt, blue_scarf, looking_at_viewer, red_gloves, collarbone, open_mouth, bird, blush, groin, :d, front-tie_bikini_top, towel, cowboy_shot, simple_background, miniskirt, outdoors, sky, fingerless_gloves, water |
| 1 | 6 |  |  |  |  |  | 1girl, blush, nipples, solo, collarbone, groin, looking_at_viewer, navel, pussy, smile, simple_background, ass_visible_through_thighs, completely_nude, white_background |
| 2 | 7 |  |  |  |  |  | 1girl, bangle, bare_shoulders, belt, black_gloves, black_pantyhose, blue_scarf, elbow_gloves, fingerless_gloves, side_slit, simple_background, sleeveless_dress, solo, bird, boots, smile, white_background |
| 3 | 5 |  |  |  |  |  | 1girl, bangle, bare_shoulders, black_gloves, black_pantyhose, blue_scarf, cowboy_shot, elbow_gloves, fingerless_gloves, side_slit, simple_background, sleeveless_dress, solo, white_background, belt, looking_at_viewer, smile, blush, hand_on_own_chest |
| 4 | 5 |  |  |  |  |  | 1girl, bare_shoulders, black_gloves, blue_scarf, elbow_gloves, fingerless_gloves, simple_background, sleeveless_dress, solo, upper_body, bangle, smile, white_background, bird_on_hand |
| 5 | 6 |  |  |  |  |  | 1girl, bare_shoulders, blue_cape, simple_background, sleeveless_dress, solo, turtleneck_dress, bangle, looking_at_viewer, smile, elbow_gloves, fingerless_gloves, black_pantyhose |
| 6 | 21 |  |  |  |  |  | 1girl, bare_shoulders, solo, jewelry, looking_at_viewer, sleeveless_dress, smile, official_alternate_costume, simple_background, turtleneck_dress, upper_body, white_background, white_dress, flower, wedding_dress, blush, bouquet, holding, open_mouth, white_gloves |
| 7 | 6 |  |  |  |  |  | 1girl, bangle, bare_shoulders, black_dress, black_gloves, bridal_gauntlets, circlet, official_alternate_costume, side_slit, sleeveless_dress, solo, turtleneck_dress, smile, earrings, elbow_gloves, fur-trimmed_coat, looking_at_viewer, red_cape, cowboy_shot, full_body, red_coat, simple_background |
| 8 | 9 |  |  |  |  |  | 1girl, bare_shoulders, elbow_gloves, gradient_clothes, official_alternate_costume, shiny_clothes, solo, black_gloves, short_dress, looking_at_viewer, simple_background, sleeveless_dress, torn_cape, bird, hair_bow, bangle, black_dress, grey_background, pantyhose, shiny_hair, skirt, black_ribbon, smile, thigh_boots, turtleneck |
| 9 | 5 |  |  |  |  |  | 1girl, bare_shoulders, circlet, long_sleeves, red_cape, solo, official_alternate_costume, simple_background, white_background, bangle, bridal_gauntlets, detached_sleeves, full_body, open_mouth, sandals, smile, turtleneck_dress, white_dress, magic, sleeveless_dress |
| 10 | 5 |  |  |  |  |  | 1boy, 1girl, blue_scarf, blush, hetero, mosaic_censoring, penis, solo_focus, bare_shoulders, cum_in_mouth, fellatio, from_side, sleeveless_dress, upper_body, brick_wall, gloves, heart, nipples, nude, pink_background, profile, simple_background, smile, tears |
| 11 | 30 |  |  |  |  |  | 1girl, blush, nipples, 1boy, hetero, sex, solo_focus, open_mouth, vaginal, navel, penis, sweat, spread_legs, collarbone, pov, smile, large_breasts, looking_at_viewer, completely_nude, mosaic_censoring, cum_in_pussy, cowgirl_position, bed_sheet, birthmark, on_back |
| 12 | 16 |  |  |  |  |  | 1girl, yukata, butterfly_print, official_alternate_costume, solo, blush, obi, looking_at_viewer, smile, wide_sleeves, simple_background, upper_body, holding, open_mouth, twitter_username, white_background |
| 13 | 7 |  |  |  |  |  | 1girl, cleavage, looking_at_viewer, solo, cosplay, alternate_costume, blue_cape, bodystocking, covered_navel, simple_background, skin_tight, smile, bracelet, white_background, bridal_gauntlets, full_body, open_book, open_mouth |
| 14 | 9 |  |  |  |  |  | 1girl, cleavage, crop_top, looking_at_viewer, midriff, navel, short_shorts, smile, tied_shirt, alternate_costume, blush, checkered_shirt, collarbone, denim_shorts, short_sleeves, solo, beer_mug, front-tie_top, holding_cup, large_breasts, blue_shorts, no_gloves, plaid, twitter_username, cowboy_shot |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | official_alternate_costume | red_bikini | 1girl | solo | bare_shoulders | cleavage | hair_flower | navel | bikini_skirt | blue_scarf | looking_at_viewer | red_gloves | collarbone | open_mouth | bird | blush | groin | :d | front-tie_bikini_top | towel | cowboy_shot | simple_background | miniskirt | outdoors | sky | fingerless_gloves | water | nipples | pussy | smile | ass_visible_through_thighs | completely_nude | white_background | bangle | belt | black_gloves | black_pantyhose | elbow_gloves | side_slit | sleeveless_dress | boots | hand_on_own_chest | upper_body | bird_on_hand | blue_cape | turtleneck_dress | jewelry | white_dress | flower | wedding_dress | bouquet | holding | white_gloves | black_dress | bridal_gauntlets | circlet | earrings | fur-trimmed_coat | red_cape | full_body | red_coat | gradient_clothes | shiny_clothes | short_dress | torn_cape | hair_bow | grey_background | pantyhose | shiny_hair | skirt | black_ribbon | thigh_boots | turtleneck | long_sleeves | detached_sleeves | sandals | magic | 1boy | hetero | mosaic_censoring | penis | solo_focus | cum_in_mouth | fellatio | from_side | brick_wall | gloves | heart | nude | pink_background | profile | tears | sex | vaginal | sweat | spread_legs | pov | large_breasts | cum_in_pussy | cowgirl_position | bed_sheet | birthmark | on_back | yukata | butterfly_print | obi | wide_sleeves | twitter_username | cosplay | alternate_costume | bodystocking | covered_navel | skin_tight | bracelet | open_book | crop_top | midriff | short_shorts | tied_shirt | checkered_shirt | denim_shorts | short_sleeves | beer_mug | front-tie_top | holding_cup | blue_shorts | no_gloves | plaid |
|----:|----------:|:----------------------------------|:----------------------------------|:----------------------------------|:----------------------------------|:----------------------------------|:-----------------------------|:-------------|:--------|:-------|:-----------------|:-----------|:--------------|:--------|:---------------|:-------------|:--------------------|:-------------|:-------------|:-------------|:-------|:--------|:--------|:-----|:-----------------------|:--------|:--------------|:--------------------|:------------|:-----------|:------|:--------------------|:--------|:----------|:--------|:--------|:-----------------------------|:------------------|:-------------------|:---------|:-------|:---------------|:------------------|:---------------|:------------|:-------------------|:--------|:--------------------|:-------------|:---------------|:------------|:-------------------|:----------|:--------------|:---------|:----------------|:----------|:----------|:---------------|:--------------|:-------------------|:----------|:-----------|:-------------------|:-----------|:------------|:-----------|:-------------------|:----------------|:--------------|:------------|:-----------|:------------------|:------------|:-------------|:--------|:---------------|:--------------|:-------------|:---------------|:-------------------|:----------|:--------|:-------|:---------|:-------------------|:--------|:-------------|:---------------|:-----------|:------------|:-------------|:---------|:--------|:-------|:------------------|:----------|:--------|:------|:----------|:--------|:--------------|:------|:----------------|:---------------|:-------------------|:------------|:------------|:----------|:---------|:------------------|:------|:---------------|:-------------------|:----------|:--------------------|:---------------|:----------------|:-------------|:-----------|:------------|:-----------|:----------|:---------------|:-------------|:------------------|:---------------|:----------------|:-----------|:----------------|:--------------|:--------------|:------------|:--------|
| 0 | 30 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 1 | 6 |  |  |  |  |  | | | X | X | | | | X | | | X | | X | | | X | X | | | | | X | | | | | | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 2 | 7 |  |  |  |  |  | | | X | X | X | | | | | X | | | | | X | | | | | | | X | | | | X | | | | X | | | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 3 | 5 |  |  |  |  |  | | | X | X | X | | | | | X | X | | | | | X | | | | | X | X | | | | X | | | | X | | | X | X | X | X | X | X | X | X | | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 4 | 5 |  |  |  |  |  | | | X | X | X | | | | | X | | | | | | | | | | | | X | | | | X | | | | X | | | X | X | | X | | X | | X | | | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 5 | 6 |  |  |  |  |  | | | X | X | X | | | | | | X | | | | | | | | | | | X | | | | X | | | | X | | | | X | | | X | X | | X | | | | | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 6 | 21 |  |  |  |  |  | X | | X | X | X | | | | | | X | | | X | | X | | | | | | X | | | | | | | | X | | | X | | | | | | | X | | | X | | | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 7 | 6 |  |  |  |  |  | X | | X | X | X | | | | | | X | | | | | | | | | | X | X | | | | | | | | X | | | | X | | X | | X | X | X | | | | | | X | | | | | | | | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 8 | 9 |  |  |  |  |  | X | | X | X | X | | | | | | X | | | | X | | | | | | | X | | | | | | | | X | | | | X | | X | | X | | X | | | | | | | | | | | | | | X | | | | | | | | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 9 | 5 |  |  |  |  |  | X | | X | X | X | | | | | | | | | X | | | | | | | | X | | | | | | | | X | | | X | X | | | | | | X | | | | | | X | | X | | | | | | | X | X | | | X | X | | | | | | | | | | | | | | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 10 | 5 |  |  |  |  |  | | | X | | X | | | | | X | | | | | | X | | | | | | X | | | | | | X | | X | | | | | | | | | | X | | | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 11 | 30 |  |  |  |  |  | | | X | | | | | X | | | X | | X | X | | X | | | | | | | | | | | | X | | X | | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | X | X | X | X | X | | | | | | | | | | | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | |
| 12 | 16 |  |  |  |  |  | X | | X | X | | | | | | | X | | | X | | X | | | | | | X | | | | | | | | X | | | X | | | | | | | | | | X | | | | | | | | | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | |
| 13 | 7 |  |  |  |  |  | | | X | X | | X | | | | | X | | | X | | | | | | | | X | | | | | | | | X | | | X | | | | | | | | | | | | X | | | | | | | | | | X | | | | | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | X | X | X | X | X | X | X | | | | | | | | | | | | | |
| 14 | 9 |  |  |  |  |  | | | X | X | | X | | X | | | X | | X | | | X | | | | | X | | | | | | | | | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | X | | | | | | | | | | X | | X | | | | | | X | X | X | X | X | X | X | X | X | X | X | X | X |
|
flavienbwk/example-dummy-evaluation | ---
license: mit
---
|
cybfi/cyber-2007-unrated | ---
license: mit
---
|
Vanmas/PoE_data | ---
license: cc
---
|
davanstrien/quickstart-10-test | Invalid username or password. |
ezzaldeen/AraNovels | ---
license: apache-2.0
language:
- ar
size_categories:
- 1M<n<10M
---
# AraBooks
A dataset comprising OCR-scanned Arabic-language content from a vast collection of books, intended to support development and research in the Arabic language. It includes texts from a variety of sources, some translated into Arabic and others originally authored in it.
This dataset may contain noise due to the low quality of PDF files.
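For quick inspection, a minimal loading sketch with the `datasets` library is shown below; the repo id is taken from this entry, while the split name and column layout are assumptions, since the card does not document the schema.
```python
# A minimal loading sketch (not part of the original card). The repo id is the
# one for this entry; the split name and the exact columns are assumptions,
# since the card does not document the schema.
from datasets import load_dataset

# Stream to avoid downloading the full corpus up front.
ds = load_dataset("ezzaldeen/AraNovels", split="train", streaming=True)

for i, example in enumerate(ds):
    print(example)  # inspect the available columns on a few records
    if i == 2:
        break
```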
## Stats behind dataset:
- Total number of tokens (all books combined): ~**555,643**.
**Disclaimer:**
The dataset provided herein consists of scanned books and is intended strictly for research and educational purposes. The content of these scanned books may be subject to copyright protection and other intellectual property rights.
Users of this dataset bear sole responsibility for ensuring compliance with applicable copyright laws and regulations. The creator of this dataset offers no assurances regarding the accuracy, completeness, or suitability of the data for any specific purpose.
Moreover, the creator of this dataset disclaims any liability for loss, damage, or legal repercussions resulting from the use or misuse of the provided data.
By accessing and utilizing this dataset, you agree to indemnify and absolve the creators of the dataset from any claims, damages, or liabilities arising from your use of the data. |
gryffindor-ISWS/fictional_characters_raw_data_without_images | ---
license: gpl-3.0
task_categories:
- text-to-image
language:
- en
tags:
- art
size_categories:
- 1K<n<10K
--- |
Vezora/10k-Python-2048-Max | ---
license: apache-2.0
---
|
distilled-from-one-sec-cv12/chunk_89 | ---
dataset_info:
features:
- name: logits
sequence: float32
- name: mfcc
sequence:
sequence: float64
splits:
- name: train
num_bytes: 1319970928
num_examples: 257204
download_size: 1347539891
dataset_size: 1319970928
---
# Dataset Card for "chunk_89"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Typly/the_typly_email_dataset | ---
license: mit
language:
- pl
pretty_name: The Typly Email Dataset
---
The Typly Email Dataset contains 5,000 emails in Polish, collected from offices. The messages have been anonymised and pre-processed by [Typly](https://typly.app/). |
open-llm-leaderboard/details_abhinand__tamil-llama-7b-instruct-v0.2 | ---
pretty_name: Evaluation run of abhinand/tamil-llama-7b-instruct-v0.2
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [abhinand/tamil-llama-7b-instruct-v0.2](https://huggingface.co/abhinand/tamil-llama-7b-instruct-v0.2)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 2 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_abhinand__tamil-llama-7b-instruct-v0.2\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-01-23T18:30:45.482735](https://huggingface.co/datasets/open-llm-leaderboard/details_abhinand__tamil-llama-7b-instruct-v0.2/blob/main/results_2024-01-23T18-30-45.482735.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.243075402543886,\n\
\ \"acc_stderr\": 0.030069028919401566,\n \"acc_norm\": 0.24181008544813296,\n\
\ \"acc_norm_stderr\": 0.030751648835495787,\n \"mc1\": 0.30599755201958384,\n\
\ \"mc1_stderr\": 0.016132229728155055,\n \"mc2\": 0.5003889364770407,\n\
\ \"mc2_stderr\": 0.015377822106726793\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.3967576791808874,\n \"acc_stderr\": 0.014296513020180646,\n\
\ \"acc_norm\": 0.40187713310580203,\n \"acc_norm_stderr\": 0.01432726861457827\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.5311690898227445,\n\
\ \"acc_stderr\": 0.00498007670739244,\n \"acc_norm\": 0.6883091017725552,\n\
\ \"acc_norm_stderr\": 0.004622376674166701\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.22,\n \"acc_stderr\": 0.04163331998932268,\n \
\ \"acc_norm\": 0.22,\n \"acc_norm_stderr\": 0.04163331998932268\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.18518518518518517,\n\
\ \"acc_stderr\": 0.03355677216313142,\n \"acc_norm\": 0.18518518518518517,\n\
\ \"acc_norm_stderr\": 0.03355677216313142\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.17763157894736842,\n \"acc_stderr\": 0.031103182383123398,\n\
\ \"acc_norm\": 0.17763157894736842,\n \"acc_norm_stderr\": 0.031103182383123398\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.3,\n\
\ \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.3,\n \
\ \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.21509433962264152,\n \"acc_stderr\": 0.02528839450289137,\n\
\ \"acc_norm\": 0.21509433962264152,\n \"acc_norm_stderr\": 0.02528839450289137\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.2569444444444444,\n\
\ \"acc_stderr\": 0.03653946969442099,\n \"acc_norm\": 0.2569444444444444,\n\
\ \"acc_norm_stderr\": 0.03653946969442099\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.2,\n \"acc_stderr\": 0.04020151261036845,\n \
\ \"acc_norm\": 0.2,\n \"acc_norm_stderr\": 0.04020151261036845\n },\n\
\ \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.26,\n\
\ \"acc_stderr\": 0.0440844002276808,\n \"acc_norm\": 0.26,\n \
\ \"acc_norm_stderr\": 0.0440844002276808\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.21,\n \"acc_stderr\": 0.040936018074033256,\n \
\ \"acc_norm\": 0.21,\n \"acc_norm_stderr\": 0.040936018074033256\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.20809248554913296,\n\
\ \"acc_stderr\": 0.030952890217749874,\n \"acc_norm\": 0.20809248554913296,\n\
\ \"acc_norm_stderr\": 0.030952890217749874\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.21568627450980393,\n \"acc_stderr\": 0.04092563958237654,\n\
\ \"acc_norm\": 0.21568627450980393,\n \"acc_norm_stderr\": 0.04092563958237654\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.28,\n \"acc_stderr\": 0.045126085985421276,\n \"acc_norm\": 0.28,\n\
\ \"acc_norm_stderr\": 0.045126085985421276\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.26382978723404255,\n \"acc_stderr\": 0.028809989854102973,\n\
\ \"acc_norm\": 0.26382978723404255,\n \"acc_norm_stderr\": 0.028809989854102973\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.23684210526315788,\n\
\ \"acc_stderr\": 0.039994238792813365,\n \"acc_norm\": 0.23684210526315788,\n\
\ \"acc_norm_stderr\": 0.039994238792813365\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.2413793103448276,\n \"acc_stderr\": 0.03565998174135302,\n\
\ \"acc_norm\": 0.2413793103448276,\n \"acc_norm_stderr\": 0.03565998174135302\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.20899470899470898,\n \"acc_stderr\": 0.02094048156533486,\n \"\
acc_norm\": 0.20899470899470898,\n \"acc_norm_stderr\": 0.02094048156533486\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.2857142857142857,\n\
\ \"acc_stderr\": 0.04040610178208841,\n \"acc_norm\": 0.2857142857142857,\n\
\ \"acc_norm_stderr\": 0.04040610178208841\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.18,\n \"acc_stderr\": 0.038612291966536934,\n \
\ \"acc_norm\": 0.18,\n \"acc_norm_stderr\": 0.038612291966536934\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\"\
: 0.1774193548387097,\n \"acc_stderr\": 0.02173254068932927,\n \"\
acc_norm\": 0.1774193548387097,\n \"acc_norm_stderr\": 0.02173254068932927\n\
\ },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\"\
: 0.15270935960591134,\n \"acc_stderr\": 0.02530890453938063,\n \"\
acc_norm\": 0.15270935960591134,\n \"acc_norm_stderr\": 0.02530890453938063\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.25,\n \"acc_stderr\": 0.04351941398892446,\n \"acc_norm\"\
: 0.25,\n \"acc_norm_stderr\": 0.04351941398892446\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.21818181818181817,\n \"acc_stderr\": 0.03225078108306289,\n\
\ \"acc_norm\": 0.21818181818181817,\n \"acc_norm_stderr\": 0.03225078108306289\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.17676767676767677,\n \"acc_stderr\": 0.027178752639044915,\n \"\
acc_norm\": 0.17676767676767677,\n \"acc_norm_stderr\": 0.027178752639044915\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.19689119170984457,\n \"acc_stderr\": 0.028697873971860664,\n\
\ \"acc_norm\": 0.19689119170984457,\n \"acc_norm_stderr\": 0.028697873971860664\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.20256410256410257,\n \"acc_stderr\": 0.020377660970371372,\n\
\ \"acc_norm\": 0.20256410256410257,\n \"acc_norm_stderr\": 0.020377660970371372\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.2111111111111111,\n \"acc_stderr\": 0.024882116857655075,\n \
\ \"acc_norm\": 0.2111111111111111,\n \"acc_norm_stderr\": 0.024882116857655075\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.21008403361344538,\n \"acc_stderr\": 0.026461398717471874,\n\
\ \"acc_norm\": 0.21008403361344538,\n \"acc_norm_stderr\": 0.026461398717471874\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.1986754966887417,\n \"acc_stderr\": 0.03257847384436776,\n \"\
acc_norm\": 0.1986754966887417,\n \"acc_norm_stderr\": 0.03257847384436776\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.1926605504587156,\n \"acc_stderr\": 0.016909276884936094,\n \"\
acc_norm\": 0.1926605504587156,\n \"acc_norm_stderr\": 0.016909276884936094\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.1527777777777778,\n \"acc_stderr\": 0.024536326026134224,\n \"\
acc_norm\": 0.1527777777777778,\n \"acc_norm_stderr\": 0.024536326026134224\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.25,\n \"acc_stderr\": 0.03039153369274154,\n \"acc_norm\": 0.25,\n\
\ \"acc_norm_stderr\": 0.03039153369274154\n },\n \"harness|hendrycksTest-high_school_world_history|5\"\
: {\n \"acc\": 0.270042194092827,\n \"acc_stderr\": 0.028900721906293426,\n\
\ \"acc_norm\": 0.270042194092827,\n \"acc_norm_stderr\": 0.028900721906293426\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.31390134529147984,\n\
\ \"acc_stderr\": 0.031146796482972465,\n \"acc_norm\": 0.31390134529147984,\n\
\ \"acc_norm_stderr\": 0.031146796482972465\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.2595419847328244,\n \"acc_stderr\": 0.03844876139785271,\n\
\ \"acc_norm\": 0.2595419847328244,\n \"acc_norm_stderr\": 0.03844876139785271\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.2396694214876033,\n \"acc_stderr\": 0.03896878985070417,\n \"\
acc_norm\": 0.2396694214876033,\n \"acc_norm_stderr\": 0.03896878985070417\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.25925925925925924,\n\
\ \"acc_stderr\": 0.042365112580946336,\n \"acc_norm\": 0.25925925925925924,\n\
\ \"acc_norm_stderr\": 0.042365112580946336\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.22085889570552147,\n \"acc_stderr\": 0.032591773927421776,\n\
\ \"acc_norm\": 0.22085889570552147,\n \"acc_norm_stderr\": 0.032591773927421776\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.3125,\n\
\ \"acc_stderr\": 0.043994650575715215,\n \"acc_norm\": 0.3125,\n\
\ \"acc_norm_stderr\": 0.043994650575715215\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.17475728155339806,\n \"acc_stderr\": 0.037601780060266224,\n\
\ \"acc_norm\": 0.17475728155339806,\n \"acc_norm_stderr\": 0.037601780060266224\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.2905982905982906,\n\
\ \"acc_stderr\": 0.02974504857267404,\n \"acc_norm\": 0.2905982905982906,\n\
\ \"acc_norm_stderr\": 0.02974504857267404\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \
\ \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.23754789272030652,\n\
\ \"acc_stderr\": 0.015218733046150193,\n \"acc_norm\": 0.23754789272030652,\n\
\ \"acc_norm_stderr\": 0.015218733046150193\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.24855491329479767,\n \"acc_stderr\": 0.023267528432100174,\n\
\ \"acc_norm\": 0.24855491329479767,\n \"acc_norm_stderr\": 0.023267528432100174\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.23798882681564246,\n\
\ \"acc_stderr\": 0.014242630070574915,\n \"acc_norm\": 0.23798882681564246,\n\
\ \"acc_norm_stderr\": 0.014242630070574915\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.22549019607843138,\n \"acc_stderr\": 0.023929155517351284,\n\
\ \"acc_norm\": 0.22549019607843138,\n \"acc_norm_stderr\": 0.023929155517351284\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.1864951768488746,\n\
\ \"acc_stderr\": 0.02212243977248077,\n \"acc_norm\": 0.1864951768488746,\n\
\ \"acc_norm_stderr\": 0.02212243977248077\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.21604938271604937,\n \"acc_stderr\": 0.022899162918445806,\n\
\ \"acc_norm\": 0.21604938271604937,\n \"acc_norm_stderr\": 0.022899162918445806\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.23404255319148937,\n \"acc_stderr\": 0.025257861359432417,\n \
\ \"acc_norm\": 0.23404255319148937,\n \"acc_norm_stderr\": 0.025257861359432417\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.2457627118644068,\n\
\ \"acc_stderr\": 0.010996156635142692,\n \"acc_norm\": 0.2457627118644068,\n\
\ \"acc_norm_stderr\": 0.010996156635142692\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.18382352941176472,\n \"acc_stderr\": 0.023529242185193106,\n\
\ \"acc_norm\": 0.18382352941176472,\n \"acc_norm_stderr\": 0.023529242185193106\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.25,\n \"acc_stderr\": 0.01751781884501444,\n \"acc_norm\"\
: 0.25,\n \"acc_norm_stderr\": 0.01751781884501444\n },\n \"harness|hendrycksTest-public_relations|5\"\
: {\n \"acc\": 0.21818181818181817,\n \"acc_stderr\": 0.03955932861795833,\n\
\ \"acc_norm\": 0.21818181818181817,\n \"acc_norm_stderr\": 0.03955932861795833\n\
\ },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.18775510204081633,\n\
\ \"acc_stderr\": 0.02500025603954621,\n \"acc_norm\": 0.18775510204081633,\n\
\ \"acc_norm_stderr\": 0.02500025603954621\n },\n \"harness|hendrycksTest-sociology|5\"\
: {\n \"acc\": 0.24378109452736318,\n \"acc_stderr\": 0.03036049015401465,\n\
\ \"acc_norm\": 0.24378109452736318,\n \"acc_norm_stderr\": 0.03036049015401465\n\
\ },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\":\
\ 0.28,\n \"acc_stderr\": 0.04512608598542128,\n \"acc_norm\": 0.28,\n\
\ \"acc_norm_stderr\": 0.04512608598542128\n },\n \"harness|hendrycksTest-virology|5\"\
: {\n \"acc\": 0.28313253012048195,\n \"acc_stderr\": 0.03507295431370518,\n\
\ \"acc_norm\": 0.28313253012048195,\n \"acc_norm_stderr\": 0.03507295431370518\n\
\ },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.3216374269005848,\n\
\ \"acc_stderr\": 0.03582529442573122,\n \"acc_norm\": 0.3216374269005848,\n\
\ \"acc_norm_stderr\": 0.03582529442573122\n },\n \"harness|truthfulqa:mc|0\"\
: {\n \"mc1\": 0.30599755201958384,\n \"mc1_stderr\": 0.016132229728155055,\n\
\ \"mc2\": 0.5003889364770407,\n \"mc2_stderr\": 0.015377822106726793\n\
\ },\n \"harness|winogrande|5\": {\n \"acc\": 0.6677190213101816,\n\
\ \"acc_stderr\": 0.013238316554236521\n },\n \"harness|gsm8k|5\":\
\ {\n \"acc\": 0.05534495830174375,\n \"acc_stderr\": 0.006298221796179564\n\
\ }\n}\n```"
repo_url: https://huggingface.co/abhinand/tamil-llama-7b-instruct-v0.2
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_01_23T15_20_33.725071
path:
- '**/details_harness|arc:challenge|25_2024-01-23T15-20-33.725071.parquet'
- split: 2024_01_23T18_30_45.482735
path:
- '**/details_harness|arc:challenge|25_2024-01-23T18-30-45.482735.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-01-23T18-30-45.482735.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_01_23T15_20_33.725071
path:
- '**/details_harness|gsm8k|5_2024-01-23T15-20-33.725071.parquet'
- split: 2024_01_23T18_30_45.482735
path:
- '**/details_harness|gsm8k|5_2024-01-23T18-30-45.482735.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-01-23T18-30-45.482735.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_01_23T15_20_33.725071
path:
- '**/details_harness|hellaswag|10_2024-01-23T15-20-33.725071.parquet'
- split: 2024_01_23T18_30_45.482735
path:
- '**/details_harness|hellaswag|10_2024-01-23T18-30-45.482735.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-01-23T18-30-45.482735.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_01_23T15_20_33.725071
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-23T15-20-33.725071.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-23T15-20-33.725071.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-23T15-20-33.725071.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-23T15-20-33.725071.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-23T15-20-33.725071.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-23T15-20-33.725071.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-23T15-20-33.725071.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-23T15-20-33.725071.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-23T15-20-33.725071.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-23T15-20-33.725071.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-23T15-20-33.725071.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-23T15-20-33.725071.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-23T15-20-33.725071.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-23T15-20-33.725071.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-23T15-20-33.725071.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-23T15-20-33.725071.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-23T15-20-33.725071.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-23T15-20-33.725071.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-23T15-20-33.725071.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-23T15-20-33.725071.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-23T15-20-33.725071.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-23T15-20-33.725071.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-23T15-20-33.725071.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-23T15-20-33.725071.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-23T15-20-33.725071.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-23T15-20-33.725071.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-23T15-20-33.725071.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-23T15-20-33.725071.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-23T15-20-33.725071.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-23T15-20-33.725071.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-23T15-20-33.725071.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-23T15-20-33.725071.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-23T15-20-33.725071.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-23T15-20-33.725071.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-01-23T15-20-33.725071.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-23T15-20-33.725071.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-23T15-20-33.725071.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-23T15-20-33.725071.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-01-23T15-20-33.725071.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-01-23T15-20-33.725071.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-23T15-20-33.725071.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-23T15-20-33.725071.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-23T15-20-33.725071.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-23T15-20-33.725071.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-23T15-20-33.725071.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-23T15-20-33.725071.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-23T15-20-33.725071.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-23T15-20-33.725071.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-23T15-20-33.725071.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-23T15-20-33.725071.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-23T15-20-33.725071.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-23T15-20-33.725071.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-23T15-20-33.725071.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-01-23T15-20-33.725071.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-23T15-20-33.725071.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-01-23T15-20-33.725071.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-23T15-20-33.725071.parquet'
- split: 2024_01_23T18_30_45.482735
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-23T18-30-45.482735.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-23T18-30-45.482735.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-23T18-30-45.482735.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-23T18-30-45.482735.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-23T18-30-45.482735.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-23T18-30-45.482735.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-23T18-30-45.482735.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-23T18-30-45.482735.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-23T18-30-45.482735.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-23T18-30-45.482735.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-23T18-30-45.482735.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-23T18-30-45.482735.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-23T18-30-45.482735.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-23T18-30-45.482735.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-23T18-30-45.482735.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-23T18-30-45.482735.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-23T18-30-45.482735.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-23T18-30-45.482735.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-23T18-30-45.482735.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-23T18-30-45.482735.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-23T18-30-45.482735.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-23T18-30-45.482735.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-23T18-30-45.482735.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-23T18-30-45.482735.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-23T18-30-45.482735.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-23T18-30-45.482735.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-23T18-30-45.482735.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-23T18-30-45.482735.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-23T18-30-45.482735.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-23T18-30-45.482735.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-23T18-30-45.482735.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-23T18-30-45.482735.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-23T18-30-45.482735.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-23T18-30-45.482735.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-01-23T18-30-45.482735.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-23T18-30-45.482735.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-23T18-30-45.482735.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-23T18-30-45.482735.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-01-23T18-30-45.482735.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-01-23T18-30-45.482735.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-23T18-30-45.482735.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-23T18-30-45.482735.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-23T18-30-45.482735.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-23T18-30-45.482735.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-23T18-30-45.482735.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-23T18-30-45.482735.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-23T18-30-45.482735.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-23T18-30-45.482735.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-23T18-30-45.482735.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-23T18-30-45.482735.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-23T18-30-45.482735.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-23T18-30-45.482735.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-23T18-30-45.482735.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-01-23T18-30-45.482735.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-23T18-30-45.482735.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-01-23T18-30-45.482735.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-23T18-30-45.482735.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-23T18-30-45.482735.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-23T18-30-45.482735.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-23T18-30-45.482735.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-23T18-30-45.482735.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-23T18-30-45.482735.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-23T18-30-45.482735.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-23T18-30-45.482735.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-23T18-30-45.482735.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-23T18-30-45.482735.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-23T18-30-45.482735.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-23T18-30-45.482735.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-23T18-30-45.482735.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-23T18-30-45.482735.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-23T18-30-45.482735.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-23T18-30-45.482735.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-23T18-30-45.482735.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-23T18-30-45.482735.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-23T18-30-45.482735.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-23T18-30-45.482735.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-23T18-30-45.482735.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-23T18-30-45.482735.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-23T18-30-45.482735.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-23T18-30-45.482735.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-23T18-30-45.482735.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-23T18-30-45.482735.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-23T18-30-45.482735.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-23T18-30-45.482735.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-23T18-30-45.482735.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-23T18-30-45.482735.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-23T18-30-45.482735.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-23T18-30-45.482735.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-23T18-30-45.482735.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-23T18-30-45.482735.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-23T18-30-45.482735.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-01-23T18-30-45.482735.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-23T18-30-45.482735.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-23T18-30-45.482735.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-23T18-30-45.482735.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-01-23T18-30-45.482735.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-01-23T18-30-45.482735.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-23T18-30-45.482735.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-23T18-30-45.482735.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-23T18-30-45.482735.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-23T18-30-45.482735.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-23T18-30-45.482735.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-23T18-30-45.482735.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-23T18-30-45.482735.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-23T18-30-45.482735.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-23T18-30-45.482735.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-23T18-30-45.482735.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-23T18-30-45.482735.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-23T18-30-45.482735.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-23T18-30-45.482735.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-01-23T18-30-45.482735.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-23T18-30-45.482735.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-01-23T18-30-45.482735.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-23T18-30-45.482735.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_01_23T15_20_33.725071
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-23T15-20-33.725071.parquet'
- split: 2024_01_23T18_30_45.482735
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-23T18-30-45.482735.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-23T18-30-45.482735.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_01_23T15_20_33.725071
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-23T15-20-33.725071.parquet'
- split: 2024_01_23T18_30_45.482735
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-23T18-30-45.482735.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-23T18-30-45.482735.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_01_23T15_20_33.725071
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-23T15-20-33.725071.parquet'
- split: 2024_01_23T18_30_45.482735
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-23T18-30-45.482735.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-23T18-30-45.482735.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_01_23T15_20_33.725071
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-23T15-20-33.725071.parquet'
- split: 2024_01_23T18_30_45.482735
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-23T18-30-45.482735.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-23T18-30-45.482735.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_01_23T15_20_33.725071
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-23T15-20-33.725071.parquet'
- split: 2024_01_23T18_30_45.482735
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-23T18-30-45.482735.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-23T18-30-45.482735.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_01_23T15_20_33.725071
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-23T15-20-33.725071.parquet'
- split: 2024_01_23T18_30_45.482735
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-23T18-30-45.482735.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-23T18-30-45.482735.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_01_23T15_20_33.725071
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-23T15-20-33.725071.parquet'
- split: 2024_01_23T18_30_45.482735
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-23T18-30-45.482735.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-23T18-30-45.482735.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_01_23T15_20_33.725071
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-23T15-20-33.725071.parquet'
- split: 2024_01_23T18_30_45.482735
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-23T18-30-45.482735.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-23T18-30-45.482735.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_01_23T15_20_33.725071
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-23T15-20-33.725071.parquet'
- split: 2024_01_23T18_30_45.482735
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-23T18-30-45.482735.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-23T18-30-45.482735.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_01_23T15_20_33.725071
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-23T15-20-33.725071.parquet'
- split: 2024_01_23T18_30_45.482735
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-23T18-30-45.482735.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-23T18-30-45.482735.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_01_23T15_20_33.725071
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-23T15-20-33.725071.parquet'
- split: 2024_01_23T18_30_45.482735
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-23T18-30-45.482735.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-23T18-30-45.482735.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_01_23T15_20_33.725071
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-23T15-20-33.725071.parquet'
- split: 2024_01_23T18_30_45.482735
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-23T18-30-45.482735.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-23T18-30-45.482735.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_01_23T15_20_33.725071
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-23T15-20-33.725071.parquet'
- split: 2024_01_23T18_30_45.482735
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-23T18-30-45.482735.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-23T18-30-45.482735.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_01_23T15_20_33.725071
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-23T15-20-33.725071.parquet'
- split: 2024_01_23T18_30_45.482735
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-23T18-30-45.482735.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-23T18-30-45.482735.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_01_23T15_20_33.725071
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-23T15-20-33.725071.parquet'
- split: 2024_01_23T18_30_45.482735
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-23T18-30-45.482735.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-23T18-30-45.482735.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_01_23T15_20_33.725071
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-23T15-20-33.725071.parquet'
- split: 2024_01_23T18_30_45.482735
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-23T18-30-45.482735.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-23T18-30-45.482735.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_01_23T15_20_33.725071
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-23T15-20-33.725071.parquet'
- split: 2024_01_23T18_30_45.482735
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-23T18-30-45.482735.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-23T18-30-45.482735.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_01_23T15_20_33.725071
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-23T15-20-33.725071.parquet'
- split: 2024_01_23T18_30_45.482735
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-23T18-30-45.482735.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-23T18-30-45.482735.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_01_23T15_20_33.725071
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-23T15-20-33.725071.parquet'
- split: 2024_01_23T18_30_45.482735
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-23T18-30-45.482735.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-23T18-30-45.482735.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_01_23T15_20_33.725071
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-23T15-20-33.725071.parquet'
- split: 2024_01_23T18_30_45.482735
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-23T18-30-45.482735.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-23T18-30-45.482735.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_01_23T15_20_33.725071
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-23T15-20-33.725071.parquet'
- split: 2024_01_23T18_30_45.482735
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-23T18-30-45.482735.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-23T18-30-45.482735.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_01_23T15_20_33.725071
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-23T15-20-33.725071.parquet'
- split: 2024_01_23T18_30_45.482735
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-23T18-30-45.482735.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-23T18-30-45.482735.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_01_23T15_20_33.725071
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-23T15-20-33.725071.parquet'
- split: 2024_01_23T18_30_45.482735
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-23T18-30-45.482735.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-23T18-30-45.482735.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_01_23T15_20_33.725071
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-23T15-20-33.725071.parquet'
- split: 2024_01_23T18_30_45.482735
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-23T18-30-45.482735.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-23T18-30-45.482735.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_01_23T15_20_33.725071
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-23T15-20-33.725071.parquet'
- split: 2024_01_23T18_30_45.482735
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-23T18-30-45.482735.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-23T18-30-45.482735.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_01_23T15_20_33.725071
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-23T15-20-33.725071.parquet'
- split: 2024_01_23T18_30_45.482735
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-23T18-30-45.482735.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-23T18-30-45.482735.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_01_23T15_20_33.725071
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-23T15-20-33.725071.parquet'
- split: 2024_01_23T18_30_45.482735
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-23T18-30-45.482735.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-23T18-30-45.482735.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_01_23T15_20_33.725071
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-23T15-20-33.725071.parquet'
- split: 2024_01_23T18_30_45.482735
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-23T18-30-45.482735.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-23T18-30-45.482735.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_01_23T15_20_33.725071
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-23T15-20-33.725071.parquet'
- split: 2024_01_23T18_30_45.482735
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-23T18-30-45.482735.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-23T18-30-45.482735.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_01_23T15_20_33.725071
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-23T15-20-33.725071.parquet'
- split: 2024_01_23T18_30_45.482735
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-23T18-30-45.482735.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-23T18-30-45.482735.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_01_23T15_20_33.725071
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-23T15-20-33.725071.parquet'
- split: 2024_01_23T18_30_45.482735
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-23T18-30-45.482735.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-23T18-30-45.482735.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_01_23T15_20_33.725071
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-23T15-20-33.725071.parquet'
- split: 2024_01_23T18_30_45.482735
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-23T18-30-45.482735.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-23T18-30-45.482735.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_01_23T15_20_33.725071
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-23T15-20-33.725071.parquet'
- split: 2024_01_23T18_30_45.482735
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-23T18-30-45.482735.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-23T18-30-45.482735.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_01_23T15_20_33.725071
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-23T15-20-33.725071.parquet'
- split: 2024_01_23T18_30_45.482735
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-23T18-30-45.482735.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-23T18-30-45.482735.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_01_23T15_20_33.725071
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-01-23T15-20-33.725071.parquet'
- split: 2024_01_23T18_30_45.482735
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-01-23T18-30-45.482735.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-01-23T18-30-45.482735.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_01_23T15_20_33.725071
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-23T15-20-33.725071.parquet'
- split: 2024_01_23T18_30_45.482735
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-23T18-30-45.482735.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-23T18-30-45.482735.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_01_23T15_20_33.725071
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-23T15-20-33.725071.parquet'
- split: 2024_01_23T18_30_45.482735
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-23T18-30-45.482735.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-23T18-30-45.482735.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_01_23T15_20_33.725071
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-23T15-20-33.725071.parquet'
- split: 2024_01_23T18_30_45.482735
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-23T18-30-45.482735.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-23T18-30-45.482735.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_01_23T15_20_33.725071
path:
- '**/details_harness|hendrycksTest-management|5_2024-01-23T15-20-33.725071.parquet'
- split: 2024_01_23T18_30_45.482735
path:
- '**/details_harness|hendrycksTest-management|5_2024-01-23T18-30-45.482735.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-01-23T18-30-45.482735.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_01_23T15_20_33.725071
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-01-23T15-20-33.725071.parquet'
- split: 2024_01_23T18_30_45.482735
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-01-23T18-30-45.482735.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-01-23T18-30-45.482735.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_01_23T15_20_33.725071
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-23T15-20-33.725071.parquet'
- split: 2024_01_23T18_30_45.482735
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-23T18-30-45.482735.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-23T18-30-45.482735.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_01_23T15_20_33.725071
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-23T15-20-33.725071.parquet'
- split: 2024_01_23T18_30_45.482735
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-23T18-30-45.482735.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-23T18-30-45.482735.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_01_23T15_20_33.725071
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-23T15-20-33.725071.parquet'
- split: 2024_01_23T18_30_45.482735
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-23T18-30-45.482735.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-23T18-30-45.482735.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_01_23T15_20_33.725071
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-23T15-20-33.725071.parquet'
- split: 2024_01_23T18_30_45.482735
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-23T18-30-45.482735.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-23T18-30-45.482735.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_01_23T15_20_33.725071
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-23T15-20-33.725071.parquet'
- split: 2024_01_23T18_30_45.482735
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-23T18-30-45.482735.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-23T18-30-45.482735.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_01_23T15_20_33.725071
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-23T15-20-33.725071.parquet'
- split: 2024_01_23T18_30_45.482735
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-23T18-30-45.482735.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-23T18-30-45.482735.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_01_23T15_20_33.725071
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-23T15-20-33.725071.parquet'
- split: 2024_01_23T18_30_45.482735
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-23T18-30-45.482735.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-23T18-30-45.482735.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_01_23T15_20_33.725071
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-23T15-20-33.725071.parquet'
- split: 2024_01_23T18_30_45.482735
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-23T18-30-45.482735.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-23T18-30-45.482735.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_01_23T15_20_33.725071
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-23T15-20-33.725071.parquet'
- split: 2024_01_23T18_30_45.482735
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-23T18-30-45.482735.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-23T18-30-45.482735.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_01_23T15_20_33.725071
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-23T15-20-33.725071.parquet'
- split: 2024_01_23T18_30_45.482735
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-23T18-30-45.482735.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-23T18-30-45.482735.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_01_23T15_20_33.725071
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-23T15-20-33.725071.parquet'
- split: 2024_01_23T18_30_45.482735
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-23T18-30-45.482735.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-23T18-30-45.482735.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_01_23T15_20_33.725071
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-23T15-20-33.725071.parquet'
- split: 2024_01_23T18_30_45.482735
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-23T18-30-45.482735.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-23T18-30-45.482735.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_01_23T15_20_33.725071
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-23T15-20-33.725071.parquet'
- split: 2024_01_23T18_30_45.482735
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-23T18-30-45.482735.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-23T18-30-45.482735.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_01_23T15_20_33.725071
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-01-23T15-20-33.725071.parquet'
- split: 2024_01_23T18_30_45.482735
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-01-23T18-30-45.482735.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-01-23T18-30-45.482735.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_01_23T15_20_33.725071
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-23T15-20-33.725071.parquet'
- split: 2024_01_23T18_30_45.482735
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-23T18-30-45.482735.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-23T18-30-45.482735.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_01_23T15_20_33.725071
path:
- '**/details_harness|hendrycksTest-virology|5_2024-01-23T15-20-33.725071.parquet'
- split: 2024_01_23T18_30_45.482735
path:
- '**/details_harness|hendrycksTest-virology|5_2024-01-23T18-30-45.482735.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-01-23T18-30-45.482735.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_01_23T15_20_33.725071
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-23T15-20-33.725071.parquet'
- split: 2024_01_23T18_30_45.482735
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-23T18-30-45.482735.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-23T18-30-45.482735.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_01_23T15_20_33.725071
path:
- '**/details_harness|truthfulqa:mc|0_2024-01-23T15-20-33.725071.parquet'
- split: 2024_01_23T18_30_45.482735
path:
- '**/details_harness|truthfulqa:mc|0_2024-01-23T18-30-45.482735.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-01-23T18-30-45.482735.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_01_23T15_20_33.725071
path:
- '**/details_harness|winogrande|5_2024-01-23T15-20-33.725071.parquet'
- split: 2024_01_23T18_30_45.482735
path:
- '**/details_harness|winogrande|5_2024-01-23T18-30-45.482735.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-01-23T18-30-45.482735.parquet'
- config_name: results
data_files:
- split: 2024_01_23T15_20_33.725071
path:
- results_2024-01-23T15-20-33.725071.parquet
- split: 2024_01_23T18_30_45.482735
path:
- results_2024-01-23T18-30-45.482735.parquet
- split: latest
path:
- results_2024-01-23T18-30-45.482735.parquet
---
# Dataset Card for Evaluation run of abhinand/tamil-llama-7b-instruct-v0.2
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [abhinand/tamil-llama-7b-instruct-v0.2](https://huggingface.co/abhinand/tamil-llama-7b-instruct-v0.2) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_abhinand__tamil-llama-7b-instruct-v0.2",
"harness_winogrande_5",
split="train")
```
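Each configuration also exposes a "latest" split and one split per timestamped run (see the configuration list above), so a specific run can be pinned explicitly. A minimal sketch, reusing the config and split names from this card:
```python
from datasets import load_dataset

repo = "open-llm-leaderboard/details_abhinand__tamil-llama-7b-instruct-v0.2"

# "latest" always resolves to the most recent evaluation files
latest = load_dataset(repo, "harness_winogrande_5", split="latest")

# A specific run can be pinned by its timestamped split name
run = load_dataset(repo, "harness_winogrande_5", split="2024_01_23T18_30_45.482735")

print(latest)
```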
## Latest results
These are the [latest results from run 2024-01-23T18:30:45.482735](https://huggingface.co/datasets/open-llm-leaderboard/details_abhinand__tamil-llama-7b-instruct-v0.2/blob/main/results_2024-01-23T18-30-45.482735.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each one in the results and in the "latest" split for each eval):
```python
{
"all": {
"acc": 0.243075402543886,
"acc_stderr": 0.030069028919401566,
"acc_norm": 0.24181008544813296,
"acc_norm_stderr": 0.030751648835495787,
"mc1": 0.30599755201958384,
"mc1_stderr": 0.016132229728155055,
"mc2": 0.5003889364770407,
"mc2_stderr": 0.015377822106726793
},
"harness|arc:challenge|25": {
"acc": 0.3967576791808874,
"acc_stderr": 0.014296513020180646,
"acc_norm": 0.40187713310580203,
"acc_norm_stderr": 0.01432726861457827
},
"harness|hellaswag|10": {
"acc": 0.5311690898227445,
"acc_stderr": 0.00498007670739244,
"acc_norm": 0.6883091017725552,
"acc_norm_stderr": 0.004622376674166701
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.22,
"acc_stderr": 0.04163331998932268,
"acc_norm": 0.22,
"acc_norm_stderr": 0.04163331998932268
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.18518518518518517,
"acc_stderr": 0.03355677216313142,
"acc_norm": 0.18518518518518517,
"acc_norm_stderr": 0.03355677216313142
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.17763157894736842,
"acc_stderr": 0.031103182383123398,
"acc_norm": 0.17763157894736842,
"acc_norm_stderr": 0.031103182383123398
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.21509433962264152,
"acc_stderr": 0.02528839450289137,
"acc_norm": 0.21509433962264152,
"acc_norm_stderr": 0.02528839450289137
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.2569444444444444,
"acc_stderr": 0.03653946969442099,
"acc_norm": 0.2569444444444444,
"acc_norm_stderr": 0.03653946969442099
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.2,
"acc_stderr": 0.04020151261036845,
"acc_norm": 0.2,
"acc_norm_stderr": 0.04020151261036845
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.26,
"acc_stderr": 0.0440844002276808,
"acc_norm": 0.26,
"acc_norm_stderr": 0.0440844002276808
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.21,
"acc_stderr": 0.040936018074033256,
"acc_norm": 0.21,
"acc_norm_stderr": 0.040936018074033256
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.20809248554913296,
"acc_stderr": 0.030952890217749874,
"acc_norm": 0.20809248554913296,
"acc_norm_stderr": 0.030952890217749874
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.21568627450980393,
"acc_stderr": 0.04092563958237654,
"acc_norm": 0.21568627450980393,
"acc_norm_stderr": 0.04092563958237654
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.28,
"acc_stderr": 0.045126085985421276,
"acc_norm": 0.28,
"acc_norm_stderr": 0.045126085985421276
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.26382978723404255,
"acc_stderr": 0.028809989854102973,
"acc_norm": 0.26382978723404255,
"acc_norm_stderr": 0.028809989854102973
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.23684210526315788,
"acc_stderr": 0.039994238792813365,
"acc_norm": 0.23684210526315788,
"acc_norm_stderr": 0.039994238792813365
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.2413793103448276,
"acc_stderr": 0.03565998174135302,
"acc_norm": 0.2413793103448276,
"acc_norm_stderr": 0.03565998174135302
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.20899470899470898,
"acc_stderr": 0.02094048156533486,
"acc_norm": 0.20899470899470898,
"acc_norm_stderr": 0.02094048156533486
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.2857142857142857,
"acc_stderr": 0.04040610178208841,
"acc_norm": 0.2857142857142857,
"acc_norm_stderr": 0.04040610178208841
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.18,
"acc_stderr": 0.038612291966536934,
"acc_norm": 0.18,
"acc_norm_stderr": 0.038612291966536934
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.1774193548387097,
"acc_stderr": 0.02173254068932927,
"acc_norm": 0.1774193548387097,
"acc_norm_stderr": 0.02173254068932927
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.15270935960591134,
"acc_stderr": 0.02530890453938063,
"acc_norm": 0.15270935960591134,
"acc_norm_stderr": 0.02530890453938063
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.25,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.25,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.21818181818181817,
"acc_stderr": 0.03225078108306289,
"acc_norm": 0.21818181818181817,
"acc_norm_stderr": 0.03225078108306289
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.17676767676767677,
"acc_stderr": 0.027178752639044915,
"acc_norm": 0.17676767676767677,
"acc_norm_stderr": 0.027178752639044915
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.19689119170984457,
"acc_stderr": 0.028697873971860664,
"acc_norm": 0.19689119170984457,
"acc_norm_stderr": 0.028697873971860664
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.20256410256410257,
"acc_stderr": 0.020377660970371372,
"acc_norm": 0.20256410256410257,
"acc_norm_stderr": 0.020377660970371372
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.2111111111111111,
"acc_stderr": 0.024882116857655075,
"acc_norm": 0.2111111111111111,
"acc_norm_stderr": 0.024882116857655075
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.21008403361344538,
"acc_stderr": 0.026461398717471874,
"acc_norm": 0.21008403361344538,
"acc_norm_stderr": 0.026461398717471874
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.1986754966887417,
"acc_stderr": 0.03257847384436776,
"acc_norm": 0.1986754966887417,
"acc_norm_stderr": 0.03257847384436776
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.1926605504587156,
"acc_stderr": 0.016909276884936094,
"acc_norm": 0.1926605504587156,
"acc_norm_stderr": 0.016909276884936094
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.1527777777777778,
"acc_stderr": 0.024536326026134224,
"acc_norm": 0.1527777777777778,
"acc_norm_stderr": 0.024536326026134224
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.25,
"acc_stderr": 0.03039153369274154,
"acc_norm": 0.25,
"acc_norm_stderr": 0.03039153369274154
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.270042194092827,
"acc_stderr": 0.028900721906293426,
"acc_norm": 0.270042194092827,
"acc_norm_stderr": 0.028900721906293426
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.31390134529147984,
"acc_stderr": 0.031146796482972465,
"acc_norm": 0.31390134529147984,
"acc_norm_stderr": 0.031146796482972465
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.2595419847328244,
"acc_stderr": 0.03844876139785271,
"acc_norm": 0.2595419847328244,
"acc_norm_stderr": 0.03844876139785271
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.2396694214876033,
"acc_stderr": 0.03896878985070417,
"acc_norm": 0.2396694214876033,
"acc_norm_stderr": 0.03896878985070417
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.25925925925925924,
"acc_stderr": 0.042365112580946336,
"acc_norm": 0.25925925925925924,
"acc_norm_stderr": 0.042365112580946336
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.22085889570552147,
"acc_stderr": 0.032591773927421776,
"acc_norm": 0.22085889570552147,
"acc_norm_stderr": 0.032591773927421776
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.3125,
"acc_stderr": 0.043994650575715215,
"acc_norm": 0.3125,
"acc_norm_stderr": 0.043994650575715215
},
"harness|hendrycksTest-management|5": {
"acc": 0.17475728155339806,
"acc_stderr": 0.037601780060266224,
"acc_norm": 0.17475728155339806,
"acc_norm_stderr": 0.037601780060266224
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.2905982905982906,
"acc_stderr": 0.02974504857267404,
"acc_norm": 0.2905982905982906,
"acc_norm_stderr": 0.02974504857267404
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.23754789272030652,
"acc_stderr": 0.015218733046150193,
"acc_norm": 0.23754789272030652,
"acc_norm_stderr": 0.015218733046150193
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.24855491329479767,
"acc_stderr": 0.023267528432100174,
"acc_norm": 0.24855491329479767,
"acc_norm_stderr": 0.023267528432100174
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.23798882681564246,
"acc_stderr": 0.014242630070574915,
"acc_norm": 0.23798882681564246,
"acc_norm_stderr": 0.014242630070574915
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.22549019607843138,
"acc_stderr": 0.023929155517351284,
"acc_norm": 0.22549019607843138,
"acc_norm_stderr": 0.023929155517351284
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.1864951768488746,
"acc_stderr": 0.02212243977248077,
"acc_norm": 0.1864951768488746,
"acc_norm_stderr": 0.02212243977248077
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.21604938271604937,
"acc_stderr": 0.022899162918445806,
"acc_norm": 0.21604938271604937,
"acc_norm_stderr": 0.022899162918445806
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.23404255319148937,
"acc_stderr": 0.025257861359432417,
"acc_norm": 0.23404255319148937,
"acc_norm_stderr": 0.025257861359432417
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.2457627118644068,
"acc_stderr": 0.010996156635142692,
"acc_norm": 0.2457627118644068,
"acc_norm_stderr": 0.010996156635142692
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.18382352941176472,
"acc_stderr": 0.023529242185193106,
"acc_norm": 0.18382352941176472,
"acc_norm_stderr": 0.023529242185193106
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.25,
"acc_stderr": 0.01751781884501444,
"acc_norm": 0.25,
"acc_norm_stderr": 0.01751781884501444
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.21818181818181817,
"acc_stderr": 0.03955932861795833,
"acc_norm": 0.21818181818181817,
"acc_norm_stderr": 0.03955932861795833
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.18775510204081633,
"acc_stderr": 0.02500025603954621,
"acc_norm": 0.18775510204081633,
"acc_norm_stderr": 0.02500025603954621
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.24378109452736318,
"acc_stderr": 0.03036049015401465,
"acc_norm": 0.24378109452736318,
"acc_norm_stderr": 0.03036049015401465
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.28,
"acc_stderr": 0.04512608598542128,
"acc_norm": 0.28,
"acc_norm_stderr": 0.04512608598542128
},
"harness|hendrycksTest-virology|5": {
"acc": 0.28313253012048195,
"acc_stderr": 0.03507295431370518,
"acc_norm": 0.28313253012048195,
"acc_norm_stderr": 0.03507295431370518
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.3216374269005848,
"acc_stderr": 0.03582529442573122,
"acc_norm": 0.3216374269005848,
"acc_norm_stderr": 0.03582529442573122
},
"harness|truthfulqa:mc|0": {
"mc1": 0.30599755201958384,
"mc1_stderr": 0.016132229728155055,
"mc2": 0.5003889364770407,
"mc2_stderr": 0.015377822106726793
},
"harness|winogrande|5": {
"acc": 0.6677190213101816,
"acc_stderr": 0.013238316554236521
},
"harness|gsm8k|5": {
"acc": 0.05534495830174375,
"acc_stderr": 0.006298221796179564
}
}
```
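These aggregated numbers come from the results JSON linked above; a minimal sketch of fetching that file directly with `huggingface_hub` (the exact key layout inside the JSON is an assumption and may differ across harness versions):
```python
import json
from huggingface_hub import hf_hub_download

# repo_type="dataset" because the results live in a dataset repository
path = hf_hub_download(
    repo_id="open-llm-leaderboard/details_abhinand__tamil-llama-7b-instruct-v0.2",
    filename="results_2024-01-23T18-30-45.482735.json",
    repo_type="dataset",
)
with open(path) as f:
    data = json.load(f)

# The metrics shown above may sit at the top level or under a "results" key
metrics = data.get("results", data)
print(metrics["all"])
```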
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
open-llm-leaderboard/details_TehVenom__DiffMerge-DollyGPT-Pygmalion | ---
pretty_name: Evaluation run of TehVenom/DiffMerge-DollyGPT-Pygmalion
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [TehVenom/DiffMerge-DollyGPT-Pygmalion](https://huggingface.co/TehVenom/DiffMerge-DollyGPT-Pygmalion)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 64 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 2 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_TehVenom__DiffMerge-DollyGPT-Pygmalion\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-09-17T02:27:34.673978](https://huggingface.co/datasets/open-llm-leaderboard/details_TehVenom__DiffMerge-DollyGPT-Pygmalion/blob/main/results_2023-09-17T02-27-34.673978.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.03261325503355705,\n\
\ \"em_stderr\": 0.0018190171380944463,\n \"f1\": 0.06326342281879199,\n\
\ \"f1_stderr\": 0.0020903684000438045,\n \"acc\": 0.2691397000789266,\n\
\ \"acc_stderr\": 0.007005621297482058\n },\n \"harness|drop|3\": {\n\
\ \"em\": 0.03261325503355705,\n \"em_stderr\": 0.0018190171380944463,\n\
\ \"f1\": 0.06326342281879199,\n \"f1_stderr\": 0.0020903684000438045\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.0,\n \"acc_stderr\"\
: 0.0\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.5382794001578532,\n\
\ \"acc_stderr\": 0.014011242594964116\n }\n}\n```"
repo_url: https://huggingface.co/TehVenom/DiffMerge-DollyGPT-Pygmalion
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_07_19T19_29_25.524586
path:
- '**/details_harness|arc:challenge|25_2023-07-19T19:29:25.524586.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-07-19T19:29:25.524586.parquet'
- config_name: harness_drop_3
data_files:
- split: 2023_09_17T02_27_34.673978
path:
- '**/details_harness|drop|3_2023-09-17T02-27-34.673978.parquet'
- split: latest
path:
- '**/details_harness|drop|3_2023-09-17T02-27-34.673978.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_09_17T02_27_34.673978
path:
- '**/details_harness|gsm8k|5_2023-09-17T02-27-34.673978.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-09-17T02-27-34.673978.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_07_19T19_29_25.524586
path:
- '**/details_harness|hellaswag|10_2023-07-19T19:29:25.524586.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-07-19T19:29:25.524586.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_07_19T19_29_25.524586
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T19:29:25.524586.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-19T19:29:25.524586.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-19T19:29:25.524586.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T19:29:25.524586.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T19:29:25.524586.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-19T19:29:25.524586.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T19:29:25.524586.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T19:29:25.524586.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T19:29:25.524586.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T19:29:25.524586.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-19T19:29:25.524586.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-19T19:29:25.524586.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T19:29:25.524586.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-19T19:29:25.524586.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T19:29:25.524586.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T19:29:25.524586.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T19:29:25.524586.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-19T19:29:25.524586.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T19:29:25.524586.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T19:29:25.524586.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T19:29:25.524586.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T19:29:25.524586.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T19:29:25.524586.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T19:29:25.524586.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T19:29:25.524586.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T19:29:25.524586.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T19:29:25.524586.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T19:29:25.524586.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T19:29:25.524586.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T19:29:25.524586.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T19:29:25.524586.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T19:29:25.524586.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-19T19:29:25.524586.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T19:29:25.524586.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-07-19T19:29:25.524586.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T19:29:25.524586.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T19:29:25.524586.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T19:29:25.524586.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-07-19T19:29:25.524586.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-07-19T19:29:25.524586.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T19:29:25.524586.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T19:29:25.524586.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T19:29:25.524586.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T19:29:25.524586.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-19T19:29:25.524586.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-19T19:29:25.524586.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-19T19:29:25.524586.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T19:29:25.524586.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-19T19:29:25.524586.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T19:29:25.524586.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T19:29:25.524586.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-19T19:29:25.524586.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-19T19:29:25.524586.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-07-19T19:29:25.524586.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T19:29:25.524586.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-07-19T19:29:25.524586.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-19T19:29:25.524586.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T19:29:25.524586.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-19T19:29:25.524586.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-19T19:29:25.524586.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T19:29:25.524586.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T19:29:25.524586.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-19T19:29:25.524586.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T19:29:25.524586.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T19:29:25.524586.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T19:29:25.524586.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T19:29:25.524586.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-19T19:29:25.524586.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-19T19:29:25.524586.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T19:29:25.524586.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-19T19:29:25.524586.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T19:29:25.524586.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T19:29:25.524586.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T19:29:25.524586.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-19T19:29:25.524586.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T19:29:25.524586.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T19:29:25.524586.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T19:29:25.524586.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T19:29:25.524586.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T19:29:25.524586.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T19:29:25.524586.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T19:29:25.524586.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T19:29:25.524586.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T19:29:25.524586.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T19:29:25.524586.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T19:29:25.524586.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T19:29:25.524586.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T19:29:25.524586.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T19:29:25.524586.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-19T19:29:25.524586.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T19:29:25.524586.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-07-19T19:29:25.524586.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T19:29:25.524586.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T19:29:25.524586.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T19:29:25.524586.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-07-19T19:29:25.524586.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-07-19T19:29:25.524586.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T19:29:25.524586.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T19:29:25.524586.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T19:29:25.524586.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T19:29:25.524586.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-19T19:29:25.524586.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-19T19:29:25.524586.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-19T19:29:25.524586.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T19:29:25.524586.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-19T19:29:25.524586.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T19:29:25.524586.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T19:29:25.524586.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-19T19:29:25.524586.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-19T19:29:25.524586.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-07-19T19:29:25.524586.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T19:29:25.524586.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-07-19T19:29:25.524586.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-19T19:29:25.524586.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_07_19T19_29_25.524586
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T19:29:25.524586.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T19:29:25.524586.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_07_19T19_29_25.524586
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-19T19:29:25.524586.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-19T19:29:25.524586.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_07_19T19_29_25.524586
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-19T19:29:25.524586.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-19T19:29:25.524586.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_07_19T19_29_25.524586
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T19:29:25.524586.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T19:29:25.524586.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_07_19T19_29_25.524586
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T19:29:25.524586.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T19:29:25.524586.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_07_19T19_29_25.524586
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-19T19:29:25.524586.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-19T19:29:25.524586.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_07_19T19_29_25.524586
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T19:29:25.524586.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T19:29:25.524586.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_07_19T19_29_25.524586
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T19:29:25.524586.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T19:29:25.524586.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_07_19T19_29_25.524586
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T19:29:25.524586.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T19:29:25.524586.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_07_19T19_29_25.524586
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T19:29:25.524586.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T19:29:25.524586.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_07_19T19_29_25.524586
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-19T19:29:25.524586.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-19T19:29:25.524586.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_07_19T19_29_25.524586
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-19T19:29:25.524586.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-19T19:29:25.524586.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_07_19T19_29_25.524586
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T19:29:25.524586.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T19:29:25.524586.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_07_19T19_29_25.524586
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-19T19:29:25.524586.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-19T19:29:25.524586.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_07_19T19_29_25.524586
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T19:29:25.524586.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T19:29:25.524586.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_07_19T19_29_25.524586
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T19:29:25.524586.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T19:29:25.524586.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_07_19T19_29_25.524586
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T19:29:25.524586.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T19:29:25.524586.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_07_19T19_29_25.524586
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-19T19:29:25.524586.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-19T19:29:25.524586.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_07_19T19_29_25.524586
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T19:29:25.524586.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T19:29:25.524586.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_07_19T19_29_25.524586
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T19:29:25.524586.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T19:29:25.524586.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_07_19T19_29_25.524586
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T19:29:25.524586.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T19:29:25.524586.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_07_19T19_29_25.524586
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T19:29:25.524586.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T19:29:25.524586.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_07_19T19_29_25.524586
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T19:29:25.524586.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T19:29:25.524586.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_07_19T19_29_25.524586
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T19:29:25.524586.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T19:29:25.524586.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_07_19T19_29_25.524586
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T19:29:25.524586.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T19:29:25.524586.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_07_19T19_29_25.524586
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T19:29:25.524586.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T19:29:25.524586.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_07_19T19_29_25.524586
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T19:29:25.524586.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T19:29:25.524586.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_07_19T19_29_25.524586
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T19:29:25.524586.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T19:29:25.524586.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_07_19T19_29_25.524586
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T19:29:25.524586.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T19:29:25.524586.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_07_19T19_29_25.524586
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T19:29:25.524586.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T19:29:25.524586.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_07_19T19_29_25.524586
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T19:29:25.524586.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T19:29:25.524586.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_07_19T19_29_25.524586
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T19:29:25.524586.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T19:29:25.524586.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_07_19T19_29_25.524586
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-19T19:29:25.524586.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-19T19:29:25.524586.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_07_19T19_29_25.524586
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T19:29:25.524586.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T19:29:25.524586.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_07_19T19_29_25.524586
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-07-19T19:29:25.524586.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-07-19T19:29:25.524586.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_07_19T19_29_25.524586
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T19:29:25.524586.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T19:29:25.524586.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_07_19T19_29_25.524586
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T19:29:25.524586.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T19:29:25.524586.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_07_19T19_29_25.524586
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T19:29:25.524586.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T19:29:25.524586.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_07_19T19_29_25.524586
path:
- '**/details_harness|hendrycksTest-management|5_2023-07-19T19:29:25.524586.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-07-19T19:29:25.524586.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_07_19T19_29_25.524586
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-07-19T19:29:25.524586.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-07-19T19:29:25.524586.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_07_19T19_29_25.524586
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T19:29:25.524586.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T19:29:25.524586.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_07_19T19_29_25.524586
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T19:29:25.524586.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T19:29:25.524586.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_07_19T19_29_25.524586
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T19:29:25.524586.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T19:29:25.524586.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_07_19T19_29_25.524586
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T19:29:25.524586.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T19:29:25.524586.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_07_19T19_29_25.524586
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-19T19:29:25.524586.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-19T19:29:25.524586.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_07_19T19_29_25.524586
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-19T19:29:25.524586.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-19T19:29:25.524586.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_07_19T19_29_25.524586
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-19T19:29:25.524586.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-19T19:29:25.524586.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_07_19T19_29_25.524586
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T19:29:25.524586.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T19:29:25.524586.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_07_19T19_29_25.524586
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-19T19:29:25.524586.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-19T19:29:25.524586.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_07_19T19_29_25.524586
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T19:29:25.524586.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T19:29:25.524586.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_07_19T19_29_25.524586
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T19:29:25.524586.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T19:29:25.524586.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_07_19T19_29_25.524586
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-19T19:29:25.524586.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-19T19:29:25.524586.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_07_19T19_29_25.524586
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-19T19:29:25.524586.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-19T19:29:25.524586.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_07_19T19_29_25.524586
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-07-19T19:29:25.524586.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-07-19T19:29:25.524586.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_07_19T19_29_25.524586
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T19:29:25.524586.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T19:29:25.524586.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_07_19T19_29_25.524586
path:
- '**/details_harness|hendrycksTest-virology|5_2023-07-19T19:29:25.524586.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-07-19T19:29:25.524586.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_07_19T19_29_25.524586
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-19T19:29:25.524586.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-19T19:29:25.524586.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_07_19T19_29_25.524586
path:
- '**/details_harness|truthfulqa:mc|0_2023-07-19T19:29:25.524586.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-07-19T19:29:25.524586.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_09_17T02_27_34.673978
path:
- '**/details_harness|winogrande|5_2023-09-17T02-27-34.673978.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-09-17T02-27-34.673978.parquet'
- config_name: results
data_files:
- split: 2023_07_19T19_29_25.524586
path:
- results_2023-07-19T19:29:25.524586.parquet
- split: 2023_09_17T02_27_34.673978
path:
- results_2023-09-17T02-27-34.673978.parquet
- split: latest
path:
- results_2023-09-17T02-27-34.673978.parquet
---
# Dataset Card for Evaluation run of TehVenom/DiffMerge-DollyGPT-Pygmalion
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/TehVenom/DiffMerge-DollyGPT-Pygmalion
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [TehVenom/DiffMerge-DollyGPT-Pygmalion](https://huggingface.co/TehVenom/DiffMerge-DollyGPT-Pygmalion) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_TehVenom__DiffMerge-DollyGPT-Pygmalion",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2023-09-17T02:27:34.673978](https://huggingface.co/datasets/open-llm-leaderboard/details_TehVenom__DiffMerge-DollyGPT-Pygmalion/blob/main/results_2023-09-17T02-27-34.673978.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"em": 0.03261325503355705,
"em_stderr": 0.0018190171380944463,
"f1": 0.06326342281879199,
"f1_stderr": 0.0020903684000438045,
"acc": 0.2691397000789266,
"acc_stderr": 0.007005621297482058
},
"harness|drop|3": {
"em": 0.03261325503355705,
"em_stderr": 0.0018190171380944463,
"f1": 0.06326342281879199,
"f1_stderr": 0.0020903684000438045
},
"harness|gsm8k|5": {
"acc": 0.0,
"acc_stderr": 0.0
},
"harness|winogrande|5": {
"acc": 0.5382794001578532,
"acc_stderr": 0.014011242594964116
}
}
```
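The aggregated numbers above are also stored under the `results` config declared in the YAML header. A minimal sketch of reading them with the `datasets` library (same repository as in the loading example above; the column layout of the results parquet is not documented here, so it is inspected rather than assumed):
```python
from datasets import load_dataset

# Load the aggregated results config declared in the YAML header;
# the "latest" split points at the most recent results parquet.
results = load_dataset(
    "open-llm-leaderboard/details_TehVenom__DiffMerge-DollyGPT-Pygmalion",
    "results",
    split="latest",
)

# The exact columns are not documented here, so just inspect what is available.
print(results.column_names)
print(results[0])
```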
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
chenxwh/gen-xcopa | ---
license: cc-by-4.0
---
|
autoevaluate/autoeval-staging-eval-project-cfd9b2d6-f835-45b3-a940-6a4a4aec71b0-122118 | ---
type: predictions
tags:
- autotrain
- evaluation
datasets:
- autoevaluate/conll2003-sample
eval_info:
task: entity_extraction
model: autoevaluate/entity-extraction-not-evaluated
metrics: []
dataset_name: autoevaluate/conll2003-sample
dataset_config: autoevaluate--conll2003-sample
dataset_split: test
col_mapping:
tokens: tokens
tags: ner_tags
---
# Dataset Card for AutoTrain Evaluator
This repository contains model predictions generated by [AutoTrain](https://huggingface.co/autotrain) for the following task and dataset:
* Task: Token Classification
* Model: autoevaluate/entity-extraction-not-evaluated
* Dataset: autoevaluate/conll2003-sample
* Config: autoevaluate--conll2003-sample
* Split: test
To run new evaluation jobs, visit Hugging Face's [automatic model evaluator](https://huggingface.co/spaces/autoevaluate/model-evaluator).
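As a rough sketch of how the stored predictions could be inspected (this repository's config and split layout is not documented in the card, so the snippet lists the available configs instead of assuming them):
```python
from datasets import get_dataset_config_names, load_dataset

repo = "autoevaluate/autoeval-staging-eval-project-cfd9b2d6-f835-45b3-a940-6a4a4aec71b0-122118"

# The evaluator stores predictions under repo-specific config names,
# so list them first rather than guessing.
configs = get_dataset_config_names(repo)
print(configs)

# Load the first config; printing the DatasetDict shows its splits and sizes.
predictions = load_dataset(repo, configs[0])
print(predictions)
```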
## Contributions
Thanks to [@lewtun](https://huggingface.co/lewtun) for evaluating this model. |
mfee/nakedbibi | ---
license: openrail
---
|
zh-tw-llm-dv/zh-tw-pythia-ta8000-v1-e1-tr_sg-001 | ---
dataset_info:
dataset_size: 355884981.0
download_size: 134906094
features:
- name: input_ids
sequence: int32
- name: attention_mask
sequence: int8
- name: labels
sequence: int64
- name: preview
dtype: string
splits:
- name: train
num_bytes: 354002017.0
num_examples: 206319
- name: test
num_bytes: 1882964.0
num_examples: 300
---
# zh-tw-pythia-ta8000-v1-e1-tr_sg-001
This dataset is a part of the `zh-tw-llm` project.
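A minimal sketch of loading one tokenized row and comparing it against its `preview` field, given the feature schema and tokenizer named in the metadata below (the tokenizer repository path is an assumption; only the dataset id is taken from this card):
```python
from datasets import load_dataset
from transformers import AutoTokenizer

ds = load_dataset("zh-tw-llm-dv/zh-tw-pythia-ta8000-v1-e1-tr_sg-001", split="test")

# Assumed tokenizer path; the card only gives the tokenizer name,
# not the repository it is hosted under.
tokenizer = AutoTokenizer.from_pretrained("zh-tw-llm-dv/zh-tw-pythia-tokenizer-a8000-v1")

row = ds[0]
# If the usual Hugging Face convention is followed, -100 in `labels`
# marks positions excluded from the loss (e.g. prompt/input tokens).
supervised = sum(1 for t in row["labels"] if t != -100)
print(row["preview"])
print(f"{len(row['input_ids'])} tokens, {supervised} of them supervised")
print(tokenizer.decode(row["input_ids"][:64]))
```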
* Tokenizer: `zh-tw-pythia-tokenizer-a8000-v1`
* Built with: `translations`, `sharegpt`
* Rows: `train` `206319`, `test` `300`
* Max length: `2048`
* Full config:
```json
{"build_with": ["translations", "sharegpt"], "preview_length": 128, "translations_settings": {"source_dataset": "zetavg/coct-en-zh-tw-translations-twp-300k", "lang_1_key": "en", "lang_2_key": "ch", "templates": ["English: {lang_1}\nChinese: {lang_2}", "Chinese: {lang_2}\nEnglish: {lang_1}"], "rows_limit": 100000, "test_size": 100, "test_split_seed": 42}, "sharegpt_settings": {"source_dataset": "zetavg/ShareGPT-Processed", "train_on_inputs": false, "languages": [{"en": 0.4}, "zh_Hant"], "rows_limit": 8000, "test_size": 0.02, "test_split_seed": 42, "test_rows_limit": 100}}
``` |
zirui3/multi-turn-med-dialog | ---
license: mit
---
# data summary
Multi-turn medical dialogue in Chinese & English
# samples
zh: 1.5k
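A minimal sketch of walking a dialogue thread turn by turn, assuming each record follows the structure of the sample shown below (the file name is hypothetical; this card does not document the actual file layout):
```python
import json

# Hypothetical file name; adjust to wherever the dialogues are stored.
with open("med_dialog_zh.jsonl", encoding="utf-8") as f:
    for line in f:
        record = json.loads(line)
        # Each record carries a "thread" of turns alternating patient / doctor.
        for turn in sorted(record["thread"], key=lambda t: t["turn"]):
            print(f'[{turn["role"]} #{turn["turn"]}] {turn["text"]}')
        break  # only show the first dialogue
```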
```json
{
"dialog_id": 3,
"thread": [
{
"turn": 0,
"text": "眼睛白内障内膜薄怎么办(女,82岁)",
"role": "patient"
},
{
"turn": 1,
"text": "您好,您为什么要咨询内膜呢?",
"role": "doctor"
},
{
"turn": 2,
"text": "是怕晶状体囊膜太薄放不了人工晶体吗?",
"role": "doctor"
},
{
"turn": 3,
"text": "今天检验白内障,眼内膜只有500,医生说不能做手术。",
"role": "patient"
},
{
"turn": 4,
"text": "方便把报告单发来看一下吗?",
"role": "doctor"
},
{
"turn": 5,
"text": "医院还没有给结果给我,检查后给我们说了一下。",
"role": "patient"
}
]
}
``` |
A-Bar/ar-vi_non_top_cs_dev | ---
dataset_info:
features:
- name: query
dtype: string
- name: passage
dtype: string
- name: label
dtype: float64
splits:
- name: train
num_bytes: 40382015
num_examples: 100000
download_size: 16672829
dataset_size: 40382015
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "ar-vi_non_top_cs_dev"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
NobodyExistsOnTheInternet/sub4000ctx | ---
license: mit
---
This is a filtered ConvoEvol, keeping examples under 4000 tokens on the LLaMA-2 encoder.
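A rough sketch of the kind of token-length filter described above (the tokenizer repository, source dataset name, and text field below are assumptions for illustration, not the author's actual pipeline):
```python
from datasets import load_dataset
from transformers import AutoTokenizer

# Assumed names; the original filtering pipeline is not published in this card.
tokenizer = AutoTokenizer.from_pretrained("meta-llama/Llama-2-7b-hf")
source = load_dataset("NobodyExistsOnTheInternet/ConvoEvol", split="train")

def short_enough(example):
    # Keep conversations that fit in fewer than 4000 LLaMA-2 tokens.
    return len(tokenizer(example["text"]).input_ids) < 4000

filtered = source.filter(short_enough)
print(f"kept {len(filtered)} of {len(source)} examples")
```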
There are 4.7k turns of human/assistant conversations in this dataset, not yet filtered for refusals (future work). |
lvdthieu/solfile | ---
license: mit
---
|
tyzhu/p2d_raw | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
- split: validation
path: data/validation-*
dataset_info:
features:
- name: inputs
dtype: string
- name: targets
dtype: string
- name: text
dtype: string
splits:
- name: train
num_bytes: 251092
num_examples: 900
- name: test
num_bytes: 80304
num_examples: 300
- name: validation
num_bytes: 80304
num_examples: 300
download_size: 76658
dataset_size: 411700
---
# Dataset Card for "p2d_raw"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
ingaleakashtest/custom_data_1 | ---
dataset_info:
features:
- name: structurer
dtype: string
- name: unstructured
dtype: string
splits:
- name: train
num_bytes: 32
num_examples: 2
- name: test
num_bytes: 15
num_examples: 1
download_size: 2471
dataset_size: 47
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
---
|
dataunitylab/json-schema-store | ---
language:
- en
tags:
- json
pretty_name: JSON Schema Store
size_categories:
- n<1K
---
This contains a set of schemas obtained via the [JSON Schema Store catalog](https://github.com/SchemaStore/schemastore/blob/master/src/api/json/catalog.json). |