| datasetId | card |
|---|---|
davidberg/sentiment-reviews | ---
license: postgresql
---
|
fxmeng/llava-pretrain | ---
dataset_info:
features:
- name: id
dtype: string
- name: image
dtype: string
- name: input
dtype: string
- name: output
dtype: string
splits:
- name: train
num_bytes: 92854190
num_examples: 558128
download_size: 36868547
dataset_size: 92854190
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "llava-pretrain"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
open-llm-leaderboard/details_Azure99__blossom-v5-14b | ---
pretty_name: Evaluation run of Azure99/blossom-v5-14b
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [Azure99/blossom-v5-14b](https://huggingface.co/Azure99/blossom-v5-14b) on the\
\ [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of\
\ the evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can\
\ be found as a specific split in each configuration, the split being named using\
\ the timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can, for instance, do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Azure99__blossom-v5-14b\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-03-14T15:46:32.836324](https://huggingface.co/datasets/open-llm-leaderboard/details_Azure99__blossom-v5-14b/blob/main/results_2024-03-14T15-46-32.836324.json) (note\
\ that there might be results for other tasks in the repo if successive evals didn't\
\ cover the same tasks. You can find each one in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6816759115536367,\n\
\ \"acc_stderr\": 0.031716119196988427,\n \"acc_norm\": 0.6848635524592945,\n\
\ \"acc_norm_stderr\": 0.03234818732415741,\n \"mc1\": 0.379436964504284,\n\
\ \"mc1_stderr\": 0.01698703926614298,\n \"mc2\": 0.5488583912060274,\n\
\ \"mc2_stderr\": 0.01518635186533139\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.5341296928327645,\n \"acc_stderr\": 0.014577311315231099,\n\
\ \"acc_norm\": 0.5844709897610921,\n \"acc_norm_stderr\": 0.014401366641216386\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6036646086436964,\n\
\ \"acc_stderr\": 0.0048813595891490005,\n \"acc_norm\": 0.8072097191794463,\n\
\ \"acc_norm_stderr\": 0.003936835566749194\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.38,\n \"acc_stderr\": 0.04878317312145631,\n \
\ \"acc_norm\": 0.38,\n \"acc_norm_stderr\": 0.04878317312145631\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6296296296296297,\n\
\ \"acc_stderr\": 0.041716541613545426,\n \"acc_norm\": 0.6296296296296297,\n\
\ \"acc_norm_stderr\": 0.041716541613545426\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.75,\n \"acc_stderr\": 0.03523807393012047,\n \
\ \"acc_norm\": 0.75,\n \"acc_norm_stderr\": 0.03523807393012047\n \
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.77,\n\
\ \"acc_stderr\": 0.04229525846816506,\n \"acc_norm\": 0.77,\n \
\ \"acc_norm_stderr\": 0.04229525846816506\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.7169811320754716,\n \"acc_stderr\": 0.027724236492700918,\n\
\ \"acc_norm\": 0.7169811320754716,\n \"acc_norm_stderr\": 0.027724236492700918\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7708333333333334,\n\
\ \"acc_stderr\": 0.035146974678623884,\n \"acc_norm\": 0.7708333333333334,\n\
\ \"acc_norm_stderr\": 0.035146974678623884\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.45,\n \"acc_stderr\": 0.05,\n \"acc_norm\"\
: 0.45,\n \"acc_norm_stderr\": 0.05\n },\n \"harness|hendrycksTest-college_computer_science|5\"\
: {\n \"acc\": 0.58,\n \"acc_stderr\": 0.049604496374885836,\n \
\ \"acc_norm\": 0.58,\n \"acc_norm_stderr\": 0.049604496374885836\n \
\ },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\"\
: 0.5,\n \"acc_stderr\": 0.050251890762960605,\n \"acc_norm\": 0.5,\n\
\ \"acc_norm_stderr\": 0.050251890762960605\n },\n \"harness|hendrycksTest-college_medicine|5\"\
: {\n \"acc\": 0.6994219653179191,\n \"acc_stderr\": 0.0349610148119118,\n\
\ \"acc_norm\": 0.6994219653179191,\n \"acc_norm_stderr\": 0.0349610148119118\n\
\ },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.4411764705882353,\n\
\ \"acc_stderr\": 0.049406356306056595,\n \"acc_norm\": 0.4411764705882353,\n\
\ \"acc_norm_stderr\": 0.049406356306056595\n },\n \"harness|hendrycksTest-computer_security|5\"\
: {\n \"acc\": 0.85,\n \"acc_stderr\": 0.035887028128263714,\n \
\ \"acc_norm\": 0.85,\n \"acc_norm_stderr\": 0.035887028128263714\n \
\ },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\":\
\ 0.7021276595744681,\n \"acc_stderr\": 0.029896145682095455,\n \"\
acc_norm\": 0.7021276595744681,\n \"acc_norm_stderr\": 0.029896145682095455\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.5263157894736842,\n\
\ \"acc_stderr\": 0.046970851366478626,\n \"acc_norm\": 0.5263157894736842,\n\
\ \"acc_norm_stderr\": 0.046970851366478626\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.6827586206896552,\n \"acc_stderr\": 0.03878352372138622,\n\
\ \"acc_norm\": 0.6827586206896552,\n \"acc_norm_stderr\": 0.03878352372138622\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.5925925925925926,\n \"acc_stderr\": 0.02530590624159063,\n \"\
acc_norm\": 0.5925925925925926,\n \"acc_norm_stderr\": 0.02530590624159063\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.5238095238095238,\n\
\ \"acc_stderr\": 0.04467062628403273,\n \"acc_norm\": 0.5238095238095238,\n\
\ \"acc_norm_stderr\": 0.04467062628403273\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.51,\n \"acc_stderr\": 0.05024183937956911,\n \
\ \"acc_norm\": 0.51,\n \"acc_norm_stderr\": 0.05024183937956911\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.832258064516129,\n\
\ \"acc_stderr\": 0.02125546406537131,\n \"acc_norm\": 0.832258064516129,\n\
\ \"acc_norm_stderr\": 0.02125546406537131\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.5960591133004927,\n \"acc_stderr\": 0.03452453903822032,\n\
\ \"acc_norm\": 0.5960591133004927,\n \"acc_norm_stderr\": 0.03452453903822032\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.71,\n \"acc_stderr\": 0.045604802157206845,\n \"acc_norm\"\
: 0.71,\n \"acc_norm_stderr\": 0.045604802157206845\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.8242424242424242,\n \"acc_stderr\": 0.02972094300622445,\n\
\ \"acc_norm\": 0.8242424242424242,\n \"acc_norm_stderr\": 0.02972094300622445\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.8636363636363636,\n \"acc_stderr\": 0.024450155973189835,\n \"\
acc_norm\": 0.8636363636363636,\n \"acc_norm_stderr\": 0.024450155973189835\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.8756476683937824,\n \"acc_stderr\": 0.023814477086593556,\n\
\ \"acc_norm\": 0.8756476683937824,\n \"acc_norm_stderr\": 0.023814477086593556\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.717948717948718,\n \"acc_stderr\": 0.022815813098896614,\n \
\ \"acc_norm\": 0.717948717948718,\n \"acc_norm_stderr\": 0.022815813098896614\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.43333333333333335,\n \"acc_stderr\": 0.030213340289237927,\n \
\ \"acc_norm\": 0.43333333333333335,\n \"acc_norm_stderr\": 0.030213340289237927\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.7605042016806722,\n \"acc_stderr\": 0.027722065493361255,\n\
\ \"acc_norm\": 0.7605042016806722,\n \"acc_norm_stderr\": 0.027722065493361255\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.47019867549668876,\n \"acc_stderr\": 0.040752249922169775,\n \"\
acc_norm\": 0.47019867549668876,\n \"acc_norm_stderr\": 0.040752249922169775\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8623853211009175,\n \"acc_stderr\": 0.014770105878649407,\n \"\
acc_norm\": 0.8623853211009175,\n \"acc_norm_stderr\": 0.014770105878649407\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.6111111111111112,\n \"acc_stderr\": 0.033247089118091176,\n \"\
acc_norm\": 0.6111111111111112,\n \"acc_norm_stderr\": 0.033247089118091176\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.7990196078431373,\n \"acc_stderr\": 0.028125972265654386,\n \"\
acc_norm\": 0.7990196078431373,\n \"acc_norm_stderr\": 0.028125972265654386\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.8312236286919831,\n \"acc_stderr\": 0.02438140683258623,\n \
\ \"acc_norm\": 0.8312236286919831,\n \"acc_norm_stderr\": 0.02438140683258623\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.7219730941704036,\n\
\ \"acc_stderr\": 0.030069584874494033,\n \"acc_norm\": 0.7219730941704036,\n\
\ \"acc_norm_stderr\": 0.030069584874494033\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.7404580152671756,\n \"acc_stderr\": 0.03844876139785271,\n\
\ \"acc_norm\": 0.7404580152671756,\n \"acc_norm_stderr\": 0.03844876139785271\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.7851239669421488,\n \"acc_stderr\": 0.037494924487096966,\n \"\
acc_norm\": 0.7851239669421488,\n \"acc_norm_stderr\": 0.037494924487096966\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7222222222222222,\n\
\ \"acc_stderr\": 0.043300437496507416,\n \"acc_norm\": 0.7222222222222222,\n\
\ \"acc_norm_stderr\": 0.043300437496507416\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7607361963190185,\n \"acc_stderr\": 0.0335195387952127,\n\
\ \"acc_norm\": 0.7607361963190185,\n \"acc_norm_stderr\": 0.0335195387952127\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.5089285714285714,\n\
\ \"acc_stderr\": 0.04745033255489122,\n \"acc_norm\": 0.5089285714285714,\n\
\ \"acc_norm_stderr\": 0.04745033255489122\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.8155339805825242,\n \"acc_stderr\": 0.03840423627288276,\n\
\ \"acc_norm\": 0.8155339805825242,\n \"acc_norm_stderr\": 0.03840423627288276\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8803418803418803,\n\
\ \"acc_stderr\": 0.02126271940040697,\n \"acc_norm\": 0.8803418803418803,\n\
\ \"acc_norm_stderr\": 0.02126271940040697\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.78,\n \"acc_stderr\": 0.041633319989322626,\n \
\ \"acc_norm\": 0.78,\n \"acc_norm_stderr\": 0.041633319989322626\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8301404853128991,\n\
\ \"acc_stderr\": 0.013428186370608313,\n \"acc_norm\": 0.8301404853128991,\n\
\ \"acc_norm_stderr\": 0.013428186370608313\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.7601156069364162,\n \"acc_stderr\": 0.022989592543123567,\n\
\ \"acc_norm\": 0.7601156069364162,\n \"acc_norm_stderr\": 0.022989592543123567\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.4569832402234637,\n\
\ \"acc_stderr\": 0.01666049858050917,\n \"acc_norm\": 0.4569832402234637,\n\
\ \"acc_norm_stderr\": 0.01666049858050917\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.7483660130718954,\n \"acc_stderr\": 0.024848018263875195,\n\
\ \"acc_norm\": 0.7483660130718954,\n \"acc_norm_stderr\": 0.024848018263875195\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7138263665594855,\n\
\ \"acc_stderr\": 0.025670259242188936,\n \"acc_norm\": 0.7138263665594855,\n\
\ \"acc_norm_stderr\": 0.025670259242188936\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.7160493827160493,\n \"acc_stderr\": 0.02508947852376513,\n\
\ \"acc_norm\": 0.7160493827160493,\n \"acc_norm_stderr\": 0.02508947852376513\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.48936170212765956,\n \"acc_stderr\": 0.02982074719142248,\n \
\ \"acc_norm\": 0.48936170212765956,\n \"acc_norm_stderr\": 0.02982074719142248\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4810951760104302,\n\
\ \"acc_stderr\": 0.012761104871472658,\n \"acc_norm\": 0.4810951760104302,\n\
\ \"acc_norm_stderr\": 0.012761104871472658\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.7169117647058824,\n \"acc_stderr\": 0.02736586113151381,\n\
\ \"acc_norm\": 0.7169117647058824,\n \"acc_norm_stderr\": 0.02736586113151381\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6928104575163399,\n \"acc_stderr\": 0.01866335967146367,\n \
\ \"acc_norm\": 0.6928104575163399,\n \"acc_norm_stderr\": 0.01866335967146367\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6727272727272727,\n\
\ \"acc_stderr\": 0.04494290866252089,\n \"acc_norm\": 0.6727272727272727,\n\
\ \"acc_norm_stderr\": 0.04494290866252089\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.7918367346938775,\n \"acc_stderr\": 0.025991117672813296,\n\
\ \"acc_norm\": 0.7918367346938775,\n \"acc_norm_stderr\": 0.025991117672813296\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8258706467661692,\n\
\ \"acc_stderr\": 0.026814951200421603,\n \"acc_norm\": 0.8258706467661692,\n\
\ \"acc_norm_stderr\": 0.026814951200421603\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.84,\n \"acc_stderr\": 0.03684529491774711,\n \
\ \"acc_norm\": 0.84,\n \"acc_norm_stderr\": 0.03684529491774711\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.4819277108433735,\n\
\ \"acc_stderr\": 0.03889951252827217,\n \"acc_norm\": 0.4819277108433735,\n\
\ \"acc_norm_stderr\": 0.03889951252827217\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8187134502923976,\n \"acc_stderr\": 0.029547741687640038,\n\
\ \"acc_norm\": 0.8187134502923976,\n \"acc_norm_stderr\": 0.029547741687640038\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.379436964504284,\n\
\ \"mc1_stderr\": 0.01698703926614298,\n \"mc2\": 0.5488583912060274,\n\
\ \"mc2_stderr\": 0.01518635186533139\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7513812154696132,\n \"acc_stderr\": 0.012147314713403107\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.6777862016679302,\n \
\ \"acc_stderr\": 0.012872435481188778\n }\n}\n```"
repo_url: https://huggingface.co/Azure99/blossom-v5-14b
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_03_14T15_46_32.836324
path:
- '**/details_harness|arc:challenge|25_2024-03-14T15-46-32.836324.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-03-14T15-46-32.836324.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_03_14T15_46_32.836324
path:
- '**/details_harness|gsm8k|5_2024-03-14T15-46-32.836324.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-03-14T15-46-32.836324.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_03_14T15_46_32.836324
path:
- '**/details_harness|hellaswag|10_2024-03-14T15-46-32.836324.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-03-14T15-46-32.836324.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_03_14T15_46_32.836324
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-14T15-46-32.836324.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-14T15-46-32.836324.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-14T15-46-32.836324.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-14T15-46-32.836324.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-14T15-46-32.836324.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-14T15-46-32.836324.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-14T15-46-32.836324.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-14T15-46-32.836324.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-14T15-46-32.836324.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-14T15-46-32.836324.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-14T15-46-32.836324.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-14T15-46-32.836324.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-14T15-46-32.836324.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-14T15-46-32.836324.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-14T15-46-32.836324.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-14T15-46-32.836324.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-14T15-46-32.836324.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-14T15-46-32.836324.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-14T15-46-32.836324.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-14T15-46-32.836324.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-14T15-46-32.836324.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-14T15-46-32.836324.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-14T15-46-32.836324.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-14T15-46-32.836324.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-14T15-46-32.836324.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-14T15-46-32.836324.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-14T15-46-32.836324.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-14T15-46-32.836324.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-14T15-46-32.836324.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-14T15-46-32.836324.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-14T15-46-32.836324.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-14T15-46-32.836324.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-14T15-46-32.836324.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-14T15-46-32.836324.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-14T15-46-32.836324.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-14T15-46-32.836324.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-14T15-46-32.836324.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-14T15-46-32.836324.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-14T15-46-32.836324.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-14T15-46-32.836324.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-14T15-46-32.836324.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-14T15-46-32.836324.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-14T15-46-32.836324.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-14T15-46-32.836324.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-14T15-46-32.836324.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-14T15-46-32.836324.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-14T15-46-32.836324.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-14T15-46-32.836324.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-14T15-46-32.836324.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-14T15-46-32.836324.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-14T15-46-32.836324.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-14T15-46-32.836324.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-14T15-46-32.836324.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-14T15-46-32.836324.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-14T15-46-32.836324.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-14T15-46-32.836324.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-14T15-46-32.836324.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-14T15-46-32.836324.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-14T15-46-32.836324.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-14T15-46-32.836324.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-14T15-46-32.836324.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-14T15-46-32.836324.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-14T15-46-32.836324.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-14T15-46-32.836324.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-14T15-46-32.836324.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-14T15-46-32.836324.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-14T15-46-32.836324.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-14T15-46-32.836324.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-14T15-46-32.836324.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-14T15-46-32.836324.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-14T15-46-32.836324.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-14T15-46-32.836324.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-14T15-46-32.836324.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-14T15-46-32.836324.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-14T15-46-32.836324.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-14T15-46-32.836324.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-14T15-46-32.836324.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-14T15-46-32.836324.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-14T15-46-32.836324.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-14T15-46-32.836324.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-14T15-46-32.836324.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-14T15-46-32.836324.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-14T15-46-32.836324.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-14T15-46-32.836324.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-14T15-46-32.836324.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-14T15-46-32.836324.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-14T15-46-32.836324.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-14T15-46-32.836324.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-14T15-46-32.836324.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-14T15-46-32.836324.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-14T15-46-32.836324.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-14T15-46-32.836324.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-14T15-46-32.836324.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-14T15-46-32.836324.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-14T15-46-32.836324.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-14T15-46-32.836324.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-14T15-46-32.836324.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-14T15-46-32.836324.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-14T15-46-32.836324.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-14T15-46-32.836324.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-14T15-46-32.836324.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-14T15-46-32.836324.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-14T15-46-32.836324.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-14T15-46-32.836324.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-14T15-46-32.836324.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-14T15-46-32.836324.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-14T15-46-32.836324.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-14T15-46-32.836324.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-14T15-46-32.836324.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-14T15-46-32.836324.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-14T15-46-32.836324.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-14T15-46-32.836324.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-14T15-46-32.836324.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-14T15-46-32.836324.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_03_14T15_46_32.836324
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-14T15-46-32.836324.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-14T15-46-32.836324.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_03_14T15_46_32.836324
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-14T15-46-32.836324.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-14T15-46-32.836324.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_03_14T15_46_32.836324
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-14T15-46-32.836324.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-14T15-46-32.836324.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_03_14T15_46_32.836324
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-14T15-46-32.836324.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-14T15-46-32.836324.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_03_14T15_46_32.836324
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-14T15-46-32.836324.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-14T15-46-32.836324.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_03_14T15_46_32.836324
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-14T15-46-32.836324.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-14T15-46-32.836324.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_03_14T15_46_32.836324
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-14T15-46-32.836324.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-14T15-46-32.836324.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_03_14T15_46_32.836324
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-14T15-46-32.836324.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-14T15-46-32.836324.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_03_14T15_46_32.836324
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-14T15-46-32.836324.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-14T15-46-32.836324.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_03_14T15_46_32.836324
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-14T15-46-32.836324.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-14T15-46-32.836324.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_03_14T15_46_32.836324
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-14T15-46-32.836324.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-14T15-46-32.836324.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_03_14T15_46_32.836324
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-14T15-46-32.836324.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-14T15-46-32.836324.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_03_14T15_46_32.836324
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-14T15-46-32.836324.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-14T15-46-32.836324.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_03_14T15_46_32.836324
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-14T15-46-32.836324.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-14T15-46-32.836324.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_03_14T15_46_32.836324
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-14T15-46-32.836324.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-14T15-46-32.836324.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_03_14T15_46_32.836324
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-14T15-46-32.836324.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-14T15-46-32.836324.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_03_14T15_46_32.836324
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-14T15-46-32.836324.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-14T15-46-32.836324.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_03_14T15_46_32.836324
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-14T15-46-32.836324.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-14T15-46-32.836324.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_03_14T15_46_32.836324
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-14T15-46-32.836324.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-14T15-46-32.836324.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_03_14T15_46_32.836324
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-14T15-46-32.836324.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-14T15-46-32.836324.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_03_14T15_46_32.836324
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-14T15-46-32.836324.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-14T15-46-32.836324.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_03_14T15_46_32.836324
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-14T15-46-32.836324.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-14T15-46-32.836324.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_03_14T15_46_32.836324
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-14T15-46-32.836324.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-14T15-46-32.836324.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_03_14T15_46_32.836324
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-14T15-46-32.836324.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-14T15-46-32.836324.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_03_14T15_46_32.836324
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-14T15-46-32.836324.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-14T15-46-32.836324.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_03_14T15_46_32.836324
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-14T15-46-32.836324.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-14T15-46-32.836324.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_03_14T15_46_32.836324
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-14T15-46-32.836324.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-14T15-46-32.836324.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_03_14T15_46_32.836324
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-14T15-46-32.836324.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-14T15-46-32.836324.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_03_14T15_46_32.836324
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-14T15-46-32.836324.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-14T15-46-32.836324.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_03_14T15_46_32.836324
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-14T15-46-32.836324.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-14T15-46-32.836324.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_03_14T15_46_32.836324
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-14T15-46-32.836324.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-14T15-46-32.836324.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_03_14T15_46_32.836324
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-14T15-46-32.836324.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-14T15-46-32.836324.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_03_14T15_46_32.836324
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-14T15-46-32.836324.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-14T15-46-32.836324.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_03_14T15_46_32.836324
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-14T15-46-32.836324.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-14T15-46-32.836324.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_03_14T15_46_32.836324
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-14T15-46-32.836324.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-14T15-46-32.836324.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_03_14T15_46_32.836324
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-14T15-46-32.836324.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-14T15-46-32.836324.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_03_14T15_46_32.836324
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-14T15-46-32.836324.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-14T15-46-32.836324.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_03_14T15_46_32.836324
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-14T15-46-32.836324.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-14T15-46-32.836324.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_03_14T15_46_32.836324
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-14T15-46-32.836324.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-14T15-46-32.836324.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_03_14T15_46_32.836324
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-14T15-46-32.836324.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-14T15-46-32.836324.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_03_14T15_46_32.836324
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-14T15-46-32.836324.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-14T15-46-32.836324.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_03_14T15_46_32.836324
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-14T15-46-32.836324.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-14T15-46-32.836324.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_03_14T15_46_32.836324
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-14T15-46-32.836324.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-14T15-46-32.836324.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_03_14T15_46_32.836324
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-14T15-46-32.836324.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-14T15-46-32.836324.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_03_14T15_46_32.836324
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-14T15-46-32.836324.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-14T15-46-32.836324.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_03_14T15_46_32.836324
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-14T15-46-32.836324.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-14T15-46-32.836324.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_03_14T15_46_32.836324
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-14T15-46-32.836324.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-14T15-46-32.836324.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_03_14T15_46_32.836324
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-14T15-46-32.836324.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-14T15-46-32.836324.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_03_14T15_46_32.836324
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-14T15-46-32.836324.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-14T15-46-32.836324.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_03_14T15_46_32.836324
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-14T15-46-32.836324.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-14T15-46-32.836324.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_03_14T15_46_32.836324
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-14T15-46-32.836324.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-14T15-46-32.836324.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_03_14T15_46_32.836324
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-14T15-46-32.836324.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-14T15-46-32.836324.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_03_14T15_46_32.836324
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-14T15-46-32.836324.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-14T15-46-32.836324.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_03_14T15_46_32.836324
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-14T15-46-32.836324.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-14T15-46-32.836324.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_03_14T15_46_32.836324
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-14T15-46-32.836324.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-14T15-46-32.836324.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_03_14T15_46_32.836324
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-14T15-46-32.836324.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-14T15-46-32.836324.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_03_14T15_46_32.836324
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-14T15-46-32.836324.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-14T15-46-32.836324.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_03_14T15_46_32.836324
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-14T15-46-32.836324.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-14T15-46-32.836324.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_03_14T15_46_32.836324
path:
- '**/details_harness|winogrande|5_2024-03-14T15-46-32.836324.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-03-14T15-46-32.836324.parquet'
- config_name: results
data_files:
- split: 2024_03_14T15_46_32.836324
path:
- results_2024-03-14T15-46-32.836324.parquet
- split: latest
path:
- results_2024-03-14T15-46-32.836324.parquet
---
# Dataset Card for Evaluation run of Azure99/blossom-v5-14b
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [Azure99/blossom-v5-14b](https://huggingface.co/Azure99/blossom-v5-14b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_Azure99__blossom-v5-14b",
"harness_winogrande_5",
split="train")
```
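The timestamped split names above are derived from the run's ISO timestamp by replacing the dashes and colons with underscores. A minimal sketch of that convention, inferred from the split names listed in this card (the exact rule used by the leaderboard tooling may differ):

```python
def run_timestamp_to_split(timestamp: str) -> str:
    """Convert an ISO run timestamp into the split-name form used in this card."""
    # Dashes and colons become underscores; the fractional seconds are kept as-is.
    return timestamp.replace("-", "_").replace(":", "_")

print(run_timestamp_to_split("2024-03-14T15:46:32.836324"))
# 2024_03_14T15_46_32.836324
```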
## Latest results
These are the [latest results from run 2024-03-14T15:46:32.836324](https://huggingface.co/datasets/open-llm-leaderboard/details_Azure99__blossom-v5-14b/blob/main/results_2024-03-14T15-46-32.836324.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find them in the "results" configuration and in the "latest" split of each eval):
```python
{
"all": {
"acc": 0.6816759115536367,
"acc_stderr": 0.031716119196988427,
"acc_norm": 0.6848635524592945,
"acc_norm_stderr": 0.03234818732415741,
"mc1": 0.379436964504284,
"mc1_stderr": 0.01698703926614298,
"mc2": 0.5488583912060274,
"mc2_stderr": 0.01518635186533139
},
"harness|arc:challenge|25": {
"acc": 0.5341296928327645,
"acc_stderr": 0.014577311315231099,
"acc_norm": 0.5844709897610921,
"acc_norm_stderr": 0.014401366641216386
},
"harness|hellaswag|10": {
"acc": 0.6036646086436964,
"acc_stderr": 0.0048813595891490005,
"acc_norm": 0.8072097191794463,
"acc_norm_stderr": 0.003936835566749194
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.38,
"acc_stderr": 0.04878317312145631,
"acc_norm": 0.38,
"acc_norm_stderr": 0.04878317312145631
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6296296296296297,
"acc_stderr": 0.041716541613545426,
"acc_norm": 0.6296296296296297,
"acc_norm_stderr": 0.041716541613545426
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.75,
"acc_stderr": 0.03523807393012047,
"acc_norm": 0.75,
"acc_norm_stderr": 0.03523807393012047
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.77,
"acc_stderr": 0.04229525846816506,
"acc_norm": 0.77,
"acc_norm_stderr": 0.04229525846816506
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.7169811320754716,
"acc_stderr": 0.027724236492700918,
"acc_norm": 0.7169811320754716,
"acc_norm_stderr": 0.027724236492700918
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7708333333333334,
"acc_stderr": 0.035146974678623884,
"acc_norm": 0.7708333333333334,
"acc_norm_stderr": 0.035146974678623884
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.45,
"acc_stderr": 0.05,
"acc_norm": 0.45,
"acc_norm_stderr": 0.05
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.58,
"acc_stderr": 0.049604496374885836,
"acc_norm": 0.58,
"acc_norm_stderr": 0.049604496374885836
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.5,
"acc_stderr": 0.050251890762960605,
"acc_norm": 0.5,
"acc_norm_stderr": 0.050251890762960605
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6994219653179191,
"acc_stderr": 0.0349610148119118,
"acc_norm": 0.6994219653179191,
"acc_norm_stderr": 0.0349610148119118
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.4411764705882353,
"acc_stderr": 0.049406356306056595,
"acc_norm": 0.4411764705882353,
"acc_norm_stderr": 0.049406356306056595
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.85,
"acc_stderr": 0.035887028128263714,
"acc_norm": 0.85,
"acc_norm_stderr": 0.035887028128263714
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.7021276595744681,
"acc_stderr": 0.029896145682095455,
"acc_norm": 0.7021276595744681,
"acc_norm_stderr": 0.029896145682095455
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.5263157894736842,
"acc_stderr": 0.046970851366478626,
"acc_norm": 0.5263157894736842,
"acc_norm_stderr": 0.046970851366478626
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.6827586206896552,
"acc_stderr": 0.03878352372138622,
"acc_norm": 0.6827586206896552,
"acc_norm_stderr": 0.03878352372138622
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.5925925925925926,
"acc_stderr": 0.02530590624159063,
"acc_norm": 0.5925925925925926,
"acc_norm_stderr": 0.02530590624159063
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.5238095238095238,
"acc_stderr": 0.04467062628403273,
"acc_norm": 0.5238095238095238,
"acc_norm_stderr": 0.04467062628403273
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.51,
"acc_stderr": 0.05024183937956911,
"acc_norm": 0.51,
"acc_norm_stderr": 0.05024183937956911
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.832258064516129,
"acc_stderr": 0.02125546406537131,
"acc_norm": 0.832258064516129,
"acc_norm_stderr": 0.02125546406537131
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5960591133004927,
"acc_stderr": 0.03452453903822032,
"acc_norm": 0.5960591133004927,
"acc_norm_stderr": 0.03452453903822032
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.71,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.71,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.8242424242424242,
"acc_stderr": 0.02972094300622445,
"acc_norm": 0.8242424242424242,
"acc_norm_stderr": 0.02972094300622445
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.8636363636363636,
"acc_stderr": 0.024450155973189835,
"acc_norm": 0.8636363636363636,
"acc_norm_stderr": 0.024450155973189835
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8756476683937824,
"acc_stderr": 0.023814477086593556,
"acc_norm": 0.8756476683937824,
"acc_norm_stderr": 0.023814477086593556
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.717948717948718,
"acc_stderr": 0.022815813098896614,
"acc_norm": 0.717948717948718,
"acc_norm_stderr": 0.022815813098896614
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.43333333333333335,
"acc_stderr": 0.030213340289237927,
"acc_norm": 0.43333333333333335,
"acc_norm_stderr": 0.030213340289237927
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.7605042016806722,
"acc_stderr": 0.027722065493361255,
"acc_norm": 0.7605042016806722,
"acc_norm_stderr": 0.027722065493361255
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.47019867549668876,
"acc_stderr": 0.040752249922169775,
"acc_norm": 0.47019867549668876,
"acc_norm_stderr": 0.040752249922169775
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8623853211009175,
"acc_stderr": 0.014770105878649407,
"acc_norm": 0.8623853211009175,
"acc_norm_stderr": 0.014770105878649407
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.6111111111111112,
"acc_stderr": 0.033247089118091176,
"acc_norm": 0.6111111111111112,
"acc_norm_stderr": 0.033247089118091176
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7990196078431373,
"acc_stderr": 0.028125972265654386,
"acc_norm": 0.7990196078431373,
"acc_norm_stderr": 0.028125972265654386
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.8312236286919831,
"acc_stderr": 0.02438140683258623,
"acc_norm": 0.8312236286919831,
"acc_norm_stderr": 0.02438140683258623
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.7219730941704036,
"acc_stderr": 0.030069584874494033,
"acc_norm": 0.7219730941704036,
"acc_norm_stderr": 0.030069584874494033
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7404580152671756,
"acc_stderr": 0.03844876139785271,
"acc_norm": 0.7404580152671756,
"acc_norm_stderr": 0.03844876139785271
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7851239669421488,
"acc_stderr": 0.037494924487096966,
"acc_norm": 0.7851239669421488,
"acc_norm_stderr": 0.037494924487096966
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7222222222222222,
"acc_stderr": 0.043300437496507416,
"acc_norm": 0.7222222222222222,
"acc_norm_stderr": 0.043300437496507416
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7607361963190185,
"acc_stderr": 0.0335195387952127,
"acc_norm": 0.7607361963190185,
"acc_norm_stderr": 0.0335195387952127
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.5089285714285714,
"acc_stderr": 0.04745033255489122,
"acc_norm": 0.5089285714285714,
"acc_norm_stderr": 0.04745033255489122
},
"harness|hendrycksTest-management|5": {
"acc": 0.8155339805825242,
"acc_stderr": 0.03840423627288276,
"acc_norm": 0.8155339805825242,
"acc_norm_stderr": 0.03840423627288276
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8803418803418803,
"acc_stderr": 0.02126271940040697,
"acc_norm": 0.8803418803418803,
"acc_norm_stderr": 0.02126271940040697
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.78,
"acc_stderr": 0.041633319989322626,
"acc_norm": 0.78,
"acc_norm_stderr": 0.041633319989322626
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8301404853128991,
"acc_stderr": 0.013428186370608313,
"acc_norm": 0.8301404853128991,
"acc_norm_stderr": 0.013428186370608313
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7601156069364162,
"acc_stderr": 0.022989592543123567,
"acc_norm": 0.7601156069364162,
"acc_norm_stderr": 0.022989592543123567
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.4569832402234637,
"acc_stderr": 0.01666049858050917,
"acc_norm": 0.4569832402234637,
"acc_norm_stderr": 0.01666049858050917
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7483660130718954,
"acc_stderr": 0.024848018263875195,
"acc_norm": 0.7483660130718954,
"acc_norm_stderr": 0.024848018263875195
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7138263665594855,
"acc_stderr": 0.025670259242188936,
"acc_norm": 0.7138263665594855,
"acc_norm_stderr": 0.025670259242188936
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7160493827160493,
"acc_stderr": 0.02508947852376513,
"acc_norm": 0.7160493827160493,
"acc_norm_stderr": 0.02508947852376513
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.48936170212765956,
"acc_stderr": 0.02982074719142248,
"acc_norm": 0.48936170212765956,
"acc_norm_stderr": 0.02982074719142248
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4810951760104302,
"acc_stderr": 0.012761104871472658,
"acc_norm": 0.4810951760104302,
"acc_norm_stderr": 0.012761104871472658
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.7169117647058824,
"acc_stderr": 0.02736586113151381,
"acc_norm": 0.7169117647058824,
"acc_norm_stderr": 0.02736586113151381
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6928104575163399,
"acc_stderr": 0.01866335967146367,
"acc_norm": 0.6928104575163399,
"acc_norm_stderr": 0.01866335967146367
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6727272727272727,
"acc_stderr": 0.04494290866252089,
"acc_norm": 0.6727272727272727,
"acc_norm_stderr": 0.04494290866252089
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7918367346938775,
"acc_stderr": 0.025991117672813296,
"acc_norm": 0.7918367346938775,
"acc_norm_stderr": 0.025991117672813296
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8258706467661692,
"acc_stderr": 0.026814951200421603,
"acc_norm": 0.8258706467661692,
"acc_norm_stderr": 0.026814951200421603
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.84,
"acc_stderr": 0.03684529491774711,
"acc_norm": 0.84,
"acc_norm_stderr": 0.03684529491774711
},
"harness|hendrycksTest-virology|5": {
"acc": 0.4819277108433735,
"acc_stderr": 0.03889951252827217,
"acc_norm": 0.4819277108433735,
"acc_norm_stderr": 0.03889951252827217
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8187134502923976,
"acc_stderr": 0.029547741687640038,
"acc_norm": 0.8187134502923976,
"acc_norm_stderr": 0.029547741687640038
},
"harness|truthfulqa:mc|0": {
"mc1": 0.379436964504284,
"mc1_stderr": 0.01698703926614298,
"mc2": 0.5488583912060274,
"mc2_stderr": 0.01518635186533139
},
"harness|winogrande|5": {
"acc": 0.7513812154696132,
"acc_stderr": 0.012147314713403107
},
"harness|gsm8k|5": {
"acc": 0.6777862016679302,
"acc_stderr": 0.012872435481188778
}
}
```
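Each per-task key in the results above encodes the harness name, task, and few-shot count separated by pipes (e.g. `harness|hellaswag|10`). A small sketch of parsing these keys and averaging a metric from such a dict, using the field names shown above with illustrative values (not an official leaderboard utility):

```python
# Toy subset mirroring the structure of the results JSON above.
results = {
    "harness|arc:challenge|25": {"acc_norm": 0.5844709897610921},
    "harness|hellaswag|10": {"acc_norm": 0.8072097191794463},
}

# Key format: "<harness>|<task>|<num_fewshot>"
parsed = []
for key, metrics in results.items():
    _, task, fewshot = key.split("|")
    parsed.append((task, int(fewshot), metrics["acc_norm"]))

# Unweighted mean of acc_norm across the tasks in this subset.
mean_acc_norm = sum(m["acc_norm"] for m in results.values()) / len(results)
```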
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
NghiemAbe/sts15 | ---
dataset_info:
features:
- name: sentence1
dtype: string
- name: sentence2
dtype: string
- name: score
dtype: float64
splits:
- name: test
num_bytes: 539971
num_examples: 3000
download_size: 247502
dataset_size: 539971
task_categories:
- sentence-similarity
language:
- vi
---
# Dataset Card for "sts15"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
roa7n/patched_test_p_10_f_ATCaseOTCase_v4 | ---
dataset_info:
features:
- name: id
dtype: string
- name: sequence_str
dtype: string
- name: label
dtype: int64
splits:
- name: train
num_bytes: 52568328
num_examples: 143667
download_size: 5044378
dataset_size: 52568328
---
# Dataset Card for "patched_test_p_10_f_ATCaseOTCase_v4"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
sheik21/musica-leo | ---
license: openrail
---
|
vargr/yt_thumbnail_dataset | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
dataset_info:
features:
- name: image
dtype: image
- name: label
dtype: int64
- name: title
dtype: string
- name: videoId
dtype: string
- name: channelId
dtype: string
- name: subscribers
dtype: float64
- name: isVerified
dtype: bool
- name: keywords
dtype: string
- name: country
dtype: string
- name: description
dtype: string
- name: views
dtype: int64
- name: published
dtype: timestamp[us]
- name: length
dtype: string
- name: url
dtype: string
splits:
- name: train
num_bytes: 3917528866.3737946
num_examples: 28276
- name: test
num_bytes: 1010554492.3202056
num_examples: 7070
download_size: 5006700814
dataset_size: 4928083358.694
---
# Dataset Card for "yt_thumbnail_dataset"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
wu981526092/LL144 | ---
license: mit
---
|
lvdthieu/codegen-v1 | ---
license: mit
---
|
sunilSabnis/pixelart | ---
license: mit
dataset_info:
features:
- name: image
dtype: image
- name: text
dtype: string
splits:
- name: train
num_bytes: 831007653.0
num_examples: 2000
download_size: 831037182
dataset_size: 831007653.0
---
# pixel giffusion
Dataset of pixel-style art generated with a Stable Diffusion model. |
facebook/emu_edit_test_set | ---
configs:
- config_name: default
data_files:
- split: validation
path: data/validation-*
- split: test
path: data/test-*
dataset_info:
features:
- name: instruction
dtype: string
- name: image
dtype: image
- name: task
dtype: string
- name: split
dtype: string
- name: idx
dtype: int64
- name: hash
dtype: string
- name: input_caption
dtype: string
- name: output_caption
dtype: string
splits:
- name: validation
num_bytes: 766327032.29
num_examples: 2022
- name: test
num_bytes: 1353530752.0
num_examples: 3589
download_size: 1904598290
dataset_size: 2119857784.29
---
# Dataset Card for the Emu Edit Test Set
## Table of Contents
- [Table of Contents](#table-of-contents)
- [Dataset Description](#dataset-description)
- [Dataset Summary](#dataset-summary)
- [Additional Information](#additional-information)
- [Licensing Information](#licensing-information)
- [Citation Information](#citation-information)
## Dataset Description
- **Homepage: https://emu-edit.metademolab.com/**
- **Paper: https://emu-edit.metademolab.com/assets/emu_edit.pdf**
### Dataset Summary
To create a benchmark for image editing we first define seven different categories of potential image editing operations: background alteration (background), comprehensive image changes (global), style alteration (style), object removal (remove), object addition (add), localized modifications (local), and color/texture alterations (texture).
Then, we utilize the diverse set of input images from the [MagicBrush benchmark](https://huggingface.co/datasets/osunlp/MagicBrush), and for each editing operation, we task crowd workers to devise relevant, creative, and challenging instructions.
Moreover, to increase the quality of the collected examples, we apply a post-verification stage, in which crowd workers filter examples with irrelevant instructions.
Finally, to support evaluation for methods that require input and output captions (e.g. prompt2prompt and pnp), we additionally collect an input caption and output caption for each example.
When doing so, we ask annotators to ensure that the captions capture both important elements in the image, and elements that should change based on the instruction.
Additionally, to support proper comparison with Emu Edit, we publicly release the model generations on the test set [here](https://huggingface.co/datasets/facebook/emu_edit_test_set_generations).
For more details please see our [paper](https://emu-edit.metademolab.com/assets/emu_edit.pdf) and [project page](https://emu-edit.metademolab.com/).
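The seven editing-operation categories above are recorded per example in the dataset's `task` field, so results can be broken down by category. A minimal sketch of that grouping is below; the sample rows are placeholders with the card's schema, whereas in practice the rows come from `load_dataset("facebook/emu_edit_test_set", split="test")`.

```python
from collections import defaultdict

# Placeholder rows mirroring the card's schema; real rows come from
# load_dataset("facebook/emu_edit_test_set", split="test").
rows = [
    {"instruction": "remove the hat", "task": "remove"},
    {"instruction": "make it a watercolor painting", "task": "style"},
    {"instruction": "add a red balloon", "task": "add"},
]

# Group examples by editing-operation category (add, remove, style, ...).
by_task = defaultdict(list)
for ex in rows:
    by_task[ex["task"]].append(ex["instruction"])

print(sorted(by_task))  # → ['add', 'remove', 'style']
```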
### Licensing Information
Licensed with CC-BY-NC 4.0 License available [here](https://creativecommons.org/licenses/by-nc/4.0/legalcode?fbclid=IwAR2SYZjLRywwUMblkWg0LyAxHVVTloIFlvC-ju3BthIYtOM2jpQHgbeXOsM).
### Citation Information
```
@inproceedings{Sheynin2023EmuEP,
title={Emu Edit: Precise Image Editing via Recognition and Generation Tasks},
author={Shelly Sheynin and Adam Polyak and Uriel Singer and Yuval Kirstain and Amit Zohar and Oron Ashual and Devi Parikh and Yaniv Taigman},
year={2023},
url={https://api.semanticscholar.org/CorpusID:265221391}
}
``` |
datahrvoje/twitter_dataset_1713063360 | ---
dataset_info:
features:
- name: id
dtype: string
- name: tweet_content
dtype: string
- name: user_name
dtype: string
- name: user_id
dtype: string
- name: created_at
dtype: string
- name: url
dtype: string
- name: favourite_count
dtype: int64
- name: scraped_at
dtype: string
- name: image_urls
dtype: string
splits:
- name: train
num_bytes: 24415
num_examples: 56
download_size: 12658
dataset_size: 24415
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
Kiwihead15/github-issues | ---
dataset_info:
features:
- name: url
dtype: string
- name: repository_url
dtype: string
- name: labels_url
dtype: string
- name: comments_url
dtype: string
- name: events_url
dtype: string
- name: html_url
dtype: string
- name: id
dtype: int64
- name: node_id
dtype: string
- name: number
dtype: int64
- name: title
dtype: string
- name: user
struct:
- name: avatar_url
dtype: string
- name: events_url
dtype: string
- name: followers_url
dtype: string
- name: following_url
dtype: string
- name: gists_url
dtype: string
- name: gravatar_id
dtype: string
- name: html_url
dtype: string
- name: id
dtype: int64
- name: login
dtype: string
- name: node_id
dtype: string
- name: organizations_url
dtype: string
- name: received_events_url
dtype: string
- name: repos_url
dtype: string
- name: site_admin
dtype: bool
- name: starred_url
dtype: string
- name: subscriptions_url
dtype: string
- name: type
dtype: string
- name: url
dtype: string
- name: labels
list:
- name: color
dtype: string
- name: default
dtype: bool
- name: description
dtype: string
- name: id
dtype: int64
- name: name
dtype: string
- name: node_id
dtype: string
- name: url
dtype: string
- name: state
dtype: string
- name: locked
dtype: bool
- name: assignee
struct:
- name: avatar_url
dtype: string
- name: events_url
dtype: string
- name: followers_url
dtype: string
- name: following_url
dtype: string
- name: gists_url
dtype: string
- name: gravatar_id
dtype: string
- name: html_url
dtype: string
- name: id
dtype: int64
- name: login
dtype: string
- name: node_id
dtype: string
- name: organizations_url
dtype: string
- name: received_events_url
dtype: string
- name: repos_url
dtype: string
- name: site_admin
dtype: bool
- name: starred_url
dtype: string
- name: subscriptions_url
dtype: string
- name: type
dtype: string
- name: url
dtype: string
- name: assignees
list:
- name: avatar_url
dtype: string
- name: events_url
dtype: string
- name: followers_url
dtype: string
- name: following_url
dtype: string
- name: gists_url
dtype: string
- name: gravatar_id
dtype: string
- name: html_url
dtype: string
- name: id
dtype: int64
- name: login
dtype: string
- name: node_id
dtype: string
- name: organizations_url
dtype: string
- name: received_events_url
dtype: string
- name: repos_url
dtype: string
- name: site_admin
dtype: bool
- name: starred_url
dtype: string
- name: subscriptions_url
dtype: string
- name: type
dtype: string
- name: url
dtype: string
- name: milestone
struct:
- name: closed_at
dtype: string
- name: closed_issues
dtype: int64
- name: created_at
dtype: string
- name: creator
struct:
- name: avatar_url
dtype: string
- name: events_url
dtype: string
- name: followers_url
dtype: string
- name: following_url
dtype: string
- name: gists_url
dtype: string
- name: gravatar_id
dtype: string
- name: html_url
dtype: string
- name: id
dtype: int64
- name: login
dtype: string
- name: node_id
dtype: string
- name: organizations_url
dtype: string
- name: received_events_url
dtype: string
- name: repos_url
dtype: string
- name: site_admin
dtype: bool
- name: starred_url
dtype: string
- name: subscriptions_url
dtype: string
- name: type
dtype: string
- name: url
dtype: string
- name: description
dtype: string
- name: due_on
dtype: string
- name: html_url
dtype: string
- name: id
dtype: int64
- name: labels_url
dtype: string
- name: node_id
dtype: string
- name: number
dtype: int64
- name: open_issues
dtype: int64
- name: state
dtype: string
- name: title
dtype: string
- name: updated_at
dtype: string
- name: url
dtype: string
- name: comments
sequence: string
- name: created_at
dtype: timestamp[ns, tz=UTC]
- name: updated_at
dtype: timestamp[ns, tz=UTC]
- name: closed_at
dtype: timestamp[ns, tz=UTC]
- name: author_association
dtype: string
- name: active_lock_reason
dtype: float64
- name: body
dtype: string
- name: reactions
struct:
- name: '+1'
dtype: int64
- name: '-1'
dtype: int64
- name: confused
dtype: int64
- name: eyes
dtype: int64
- name: heart
dtype: int64
- name: hooray
dtype: int64
- name: laugh
dtype: int64
- name: rocket
dtype: int64
- name: total_count
dtype: int64
- name: url
dtype: string
- name: timeline_url
dtype: string
- name: performed_via_github_app
dtype: float64
- name: state_reason
dtype: string
- name: draft
dtype: float64
- name: pull_request
struct:
- name: diff_url
dtype: string
- name: html_url
dtype: string
- name: merged_at
dtype: string
- name: patch_url
dtype: string
- name: url
dtype: string
- name: is_pull_request
dtype: bool
splits:
- name: train
num_bytes: 25627570
num_examples: 4500
download_size: 7330125
dataset_size: 25627570
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "github-issues"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
mrm8488/en_es_sample_good | ---
dataset_info:
features:
- name: source
dtype: string
- name: target
dtype: string
splits:
- name: train
num_bytes: 1403
num_examples: 20
download_size: 2642
dataset_size: 1403
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
makram93/accepted_pairs_st | ---
dataset_info:
features:
- name: url
dtype: string
- name: doc_id
dtype: string
- name: original_title
sequence: string
- name: right
dtype: string
- name: left
dtype: string
splits:
- name: train
num_bytes: 88447.0623234648
num_examples: 100
download_size: 87877
dataset_size: 88447.0623234648
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "accepted_pairs_st"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
USC/USC-Course-Catalog | ---
license: apache-2.0
---
## USC Course Catalog - 2024, Spring Term
This dataset consists of all classes USC is offering in the 2024 Spring term (as of December 1, 2023).
While it is a small dataset, it could be used for fine-tuning, generation, or RAG applications. One example is this Space -> https://huggingface.co/spaces/USC/USC-GPT
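As a toy illustration of the RAG-style lookup mentioned above, the sketch below does naive keyword retrieval over course rows. The field names (`code`, `title`) and the sample courses are hypothetical; the card does not document the dataset's schema.

```python
# Hypothetical course rows; the card does not document the dataset's schema,
# so these field names and entries are illustrative only.
courses = [
    {"code": "CSCI 544", "title": "Applied Natural Language Processing"},
    {"code": "CSCI 567", "title": "Machine Learning"},
    {"code": "HIST 100", "title": "The American Experience"},
]

def search(query, rows):
    """Naive keyword retrieval, the kind of lookup a RAG app might start from."""
    q = query.lower()
    return [r["code"] for r in rows if q in r["title"].lower()]

print(search("machine", courses))  # → ['CSCI 567']
```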
**I will also be scraping the 2024 fall term classes when they are released by USC!**
If you want the web-scraping script I used for this, feel free to send me an email at brandonhulston1@gmail.com :) |
johnny9210/instruction_011 | ---
task_categories:
- question-answering
license: apache-2.0
--- |
VictorHProtogen/Amigos | ---
license: openrail
---
|
AnanthZeke/oscar_tamil_2201 | ---
dataset_info:
features:
- name: text
dtype: string
- name: sent_token
sequence: string
splits:
- name: train
num_bytes: 18576297122
num_examples: 556772
download_size: 6242500521
dataset_size: 18576297122
---
# Dataset Card for "oscar_tamil_2201"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
open-llm-leaderboard/details_Corianas__1.3b | ---
pretty_name: Evaluation run of Corianas/1.3b
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [Corianas/1.3b](https://huggingface.co/Corianas/1.3b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 64 configurations, each one corresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Corianas__1.3b\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-10-15T03:40:11.445495](https://huggingface.co/datasets/open-llm-leaderboard/details_Corianas__1.3b/blob/main/results_2023-10-15T03-40-11.445495.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.0019924496644295304,\n\
\ \"em_stderr\": 0.00045666764626669994,\n \"f1\": 0.045740352348993464,\n\
\ \"f1_stderr\": 0.001213536763017523,\n \"acc\": 0.2659515202794684,\n\
\ \"acc_stderr\": 0.007549145093989003\n },\n \"harness|drop|3\": {\n\
\ \"em\": 0.0019924496644295304,\n \"em_stderr\": 0.00045666764626669994,\n\
\ \"f1\": 0.045740352348993464,\n \"f1_stderr\": 0.001213536763017523\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.001516300227445034,\n \
\ \"acc_stderr\": 0.0010717793485492606\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.5303867403314917,\n \"acc_stderr\": 0.014026510839428746\n\
\ }\n}\n```"
repo_url: https://huggingface.co/Corianas/1.3b
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_08_18T07_03_11.668296
path:
- '**/details_harness|arc:challenge|25_2023-08-18T07:03:11.668296.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-08-18T07:03:11.668296.parquet'
- config_name: harness_drop_3
data_files:
- split: 2023_10_15T03_40_11.445495
path:
- '**/details_harness|drop|3_2023-10-15T03-40-11.445495.parquet'
- split: latest
path:
- '**/details_harness|drop|3_2023-10-15T03-40-11.445495.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_10_15T03_40_11.445495
path:
- '**/details_harness|gsm8k|5_2023-10-15T03-40-11.445495.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-10-15T03-40-11.445495.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_08_18T07_03_11.668296
path:
- '**/details_harness|hellaswag|10_2023-08-18T07:03:11.668296.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-08-18T07:03:11.668296.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_08_18T07_03_11.668296
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-18T07:03:11.668296.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-18T07:03:11.668296.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-18T07:03:11.668296.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-18T07:03:11.668296.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-18T07:03:11.668296.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-18T07:03:11.668296.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-18T07:03:11.668296.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-18T07:03:11.668296.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-18T07:03:11.668296.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-18T07:03:11.668296.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-18T07:03:11.668296.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-18T07:03:11.668296.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-18T07:03:11.668296.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-18T07:03:11.668296.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-18T07:03:11.668296.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-18T07:03:11.668296.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-18T07:03:11.668296.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-18T07:03:11.668296.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-18T07:03:11.668296.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-18T07:03:11.668296.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-18T07:03:11.668296.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-18T07:03:11.668296.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-18T07:03:11.668296.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-18T07:03:11.668296.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-18T07:03:11.668296.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-18T07:03:11.668296.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-18T07:03:11.668296.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-18T07:03:11.668296.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-18T07:03:11.668296.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-18T07:03:11.668296.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-18T07:03:11.668296.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-18T07:03:11.668296.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-18T07:03:11.668296.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-18T07:03:11.668296.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-18T07:03:11.668296.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-18T07:03:11.668296.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-18T07:03:11.668296.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-18T07:03:11.668296.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-18T07:03:11.668296.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-18T07:03:11.668296.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-18T07:03:11.668296.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-18T07:03:11.668296.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-18T07:03:11.668296.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-18T07:03:11.668296.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-18T07:03:11.668296.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-18T07:03:11.668296.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-18T07:03:11.668296.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-18T07:03:11.668296.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-18T07:03:11.668296.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-18T07:03:11.668296.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-18T07:03:11.668296.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-18T07:03:11.668296.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-18T07:03:11.668296.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-18T07:03:11.668296.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-18T07:03:11.668296.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-18T07:03:11.668296.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-18T07:03:11.668296.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-18T07:03:11.668296.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-18T07:03:11.668296.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-18T07:03:11.668296.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-18T07:03:11.668296.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-18T07:03:11.668296.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-18T07:03:11.668296.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-18T07:03:11.668296.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-18T07:03:11.668296.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-18T07:03:11.668296.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-18T07:03:11.668296.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-18T07:03:11.668296.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-18T07:03:11.668296.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-18T07:03:11.668296.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-18T07:03:11.668296.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-18T07:03:11.668296.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-18T07:03:11.668296.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-18T07:03:11.668296.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-18T07:03:11.668296.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-18T07:03:11.668296.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-18T07:03:11.668296.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-18T07:03:11.668296.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-18T07:03:11.668296.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-18T07:03:11.668296.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-18T07:03:11.668296.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-18T07:03:11.668296.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-18T07:03:11.668296.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-18T07:03:11.668296.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-18T07:03:11.668296.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-18T07:03:11.668296.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-18T07:03:11.668296.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-18T07:03:11.668296.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-18T07:03:11.668296.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-18T07:03:11.668296.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-18T07:03:11.668296.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-18T07:03:11.668296.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-18T07:03:11.668296.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-18T07:03:11.668296.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-18T07:03:11.668296.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-18T07:03:11.668296.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-18T07:03:11.668296.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-18T07:03:11.668296.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-18T07:03:11.668296.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-18T07:03:11.668296.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-18T07:03:11.668296.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-18T07:03:11.668296.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-18T07:03:11.668296.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-18T07:03:11.668296.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-18T07:03:11.668296.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-18T07:03:11.668296.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-18T07:03:11.668296.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-18T07:03:11.668296.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-18T07:03:11.668296.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-18T07:03:11.668296.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-18T07:03:11.668296.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-18T07:03:11.668296.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-18T07:03:11.668296.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-18T07:03:11.668296.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_08_18T07_03_11.668296
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-18T07:03:11.668296.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-18T07:03:11.668296.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_08_18T07_03_11.668296
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-18T07:03:11.668296.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-18T07:03:11.668296.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_08_18T07_03_11.668296
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-18T07:03:11.668296.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-18T07:03:11.668296.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_08_18T07_03_11.668296
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-18T07:03:11.668296.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-18T07:03:11.668296.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_08_18T07_03_11.668296
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-18T07:03:11.668296.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-18T07:03:11.668296.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_08_18T07_03_11.668296
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-18T07:03:11.668296.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-18T07:03:11.668296.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_08_18T07_03_11.668296
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-18T07:03:11.668296.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-18T07:03:11.668296.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_08_18T07_03_11.668296
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-18T07:03:11.668296.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-18T07:03:11.668296.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_08_18T07_03_11.668296
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-18T07:03:11.668296.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-18T07:03:11.668296.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_08_18T07_03_11.668296
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-18T07:03:11.668296.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-18T07:03:11.668296.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_08_18T07_03_11.668296
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-18T07:03:11.668296.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-18T07:03:11.668296.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_08_18T07_03_11.668296
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-18T07:03:11.668296.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-18T07:03:11.668296.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_08_18T07_03_11.668296
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-18T07:03:11.668296.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-18T07:03:11.668296.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_08_18T07_03_11.668296
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-18T07:03:11.668296.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-18T07:03:11.668296.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_08_18T07_03_11.668296
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-18T07:03:11.668296.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-18T07:03:11.668296.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_08_18T07_03_11.668296
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-18T07:03:11.668296.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-18T07:03:11.668296.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_08_18T07_03_11.668296
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-18T07:03:11.668296.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-18T07:03:11.668296.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_08_18T07_03_11.668296
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-18T07:03:11.668296.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-18T07:03:11.668296.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_08_18T07_03_11.668296
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-18T07:03:11.668296.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-18T07:03:11.668296.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_08_18T07_03_11.668296
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-18T07:03:11.668296.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-18T07:03:11.668296.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_08_18T07_03_11.668296
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-18T07:03:11.668296.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-18T07:03:11.668296.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_08_18T07_03_11.668296
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-18T07:03:11.668296.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-18T07:03:11.668296.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_08_18T07_03_11.668296
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-18T07:03:11.668296.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-18T07:03:11.668296.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_08_18T07_03_11.668296
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-18T07:03:11.668296.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-18T07:03:11.668296.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_08_18T07_03_11.668296
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-18T07:03:11.668296.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-18T07:03:11.668296.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_08_18T07_03_11.668296
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-18T07:03:11.668296.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-18T07:03:11.668296.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_08_18T07_03_11.668296
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-18T07:03:11.668296.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-18T07:03:11.668296.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_08_18T07_03_11.668296
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-18T07:03:11.668296.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-18T07:03:11.668296.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_08_18T07_03_11.668296
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-18T07:03:11.668296.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-18T07:03:11.668296.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_08_18T07_03_11.668296
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-18T07:03:11.668296.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-18T07:03:11.668296.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_08_18T07_03_11.668296
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-18T07:03:11.668296.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-18T07:03:11.668296.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_08_18T07_03_11.668296
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-18T07:03:11.668296.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-18T07:03:11.668296.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_08_18T07_03_11.668296
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-18T07:03:11.668296.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-18T07:03:11.668296.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_08_18T07_03_11.668296
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-18T07:03:11.668296.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-18T07:03:11.668296.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_08_18T07_03_11.668296
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-18T07:03:11.668296.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-18T07:03:11.668296.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_08_18T07_03_11.668296
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-18T07:03:11.668296.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-18T07:03:11.668296.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_08_18T07_03_11.668296
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-18T07:03:11.668296.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-18T07:03:11.668296.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_08_18T07_03_11.668296
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-18T07:03:11.668296.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-18T07:03:11.668296.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_08_18T07_03_11.668296
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-18T07:03:11.668296.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-18T07:03:11.668296.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_08_18T07_03_11.668296
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-18T07:03:11.668296.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-18T07:03:11.668296.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_08_18T07_03_11.668296
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-18T07:03:11.668296.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-18T07:03:11.668296.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_08_18T07_03_11.668296
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-18T07:03:11.668296.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-18T07:03:11.668296.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_08_18T07_03_11.668296
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-18T07:03:11.668296.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-18T07:03:11.668296.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_08_18T07_03_11.668296
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-18T07:03:11.668296.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-18T07:03:11.668296.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_08_18T07_03_11.668296
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-18T07:03:11.668296.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-18T07:03:11.668296.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_08_18T07_03_11.668296
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-18T07:03:11.668296.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-18T07:03:11.668296.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_08_18T07_03_11.668296
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-18T07:03:11.668296.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-18T07:03:11.668296.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_08_18T07_03_11.668296
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-18T07:03:11.668296.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-18T07:03:11.668296.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_08_18T07_03_11.668296
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-18T07:03:11.668296.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-18T07:03:11.668296.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_08_18T07_03_11.668296
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-18T07:03:11.668296.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-18T07:03:11.668296.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_08_18T07_03_11.668296
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-18T07:03:11.668296.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-18T07:03:11.668296.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_08_18T07_03_11.668296
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-18T07:03:11.668296.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-18T07:03:11.668296.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_08_18T07_03_11.668296
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-18T07:03:11.668296.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-18T07:03:11.668296.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_08_18T07_03_11.668296
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-18T07:03:11.668296.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-18T07:03:11.668296.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_08_18T07_03_11.668296
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-18T07:03:11.668296.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-18T07:03:11.668296.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_08_18T07_03_11.668296
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-18T07:03:11.668296.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-18T07:03:11.668296.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_08_18T07_03_11.668296
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-18T07:03:11.668296.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-18T07:03:11.668296.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_08_18T07_03_11.668296
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-18T07:03:11.668296.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-18T07:03:11.668296.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_10_15T03_40_11.445495
path:
- '**/details_harness|winogrande|5_2023-10-15T03-40-11.445495.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-10-15T03-40-11.445495.parquet'
- config_name: results
data_files:
- split: 2023_08_18T07_03_11.668296
path:
- results_2023-08-18T07:03:11.668296.parquet
- split: 2023_10_15T03_40_11.445495
path:
- results_2023-10-15T03-40-11.445495.parquet
- split: latest
path:
- results_2023-10-15T03-40-11.445495.parquet
---
# Dataset Card for Evaluation run of Corianas/1.3b
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/Corianas/1.3b
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [Corianas/1.3b](https://huggingface.co/Corianas/1.3b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_Corianas__1.3b",
"harness_winogrande_5",
split="train")
```
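The split names in the configurations above are derived from the run timestamps, with the `-` and `:` characters of the ISO timestamp replaced by underscores. A minimal sketch of that mapping (an observation from this card's own split names, not a documented API):

```python
def timestamp_to_split_name(timestamp: str) -> str:
    # Split names replace '-' and ':' from the ISO timestamp with '_';
    # the fractional-second '.' is kept as-is.
    return timestamp.replace("-", "_").replace(":", "_")

# The run timestamp 2023-10-15T03:40:11.445495 maps to the split name below
print(timestamp_to_split_name("2023-10-15T03:40:11.445495"))
# → 2023_10_15T03_40_11.445495
```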
## Latest results
These are the [latest results from run 2023-10-15T03:40:11.445495](https://huggingface.co/datasets/open-llm-leaderboard/details_Corianas__1.3b/blob/main/results_2023-10-15T03-40-11.445495.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each one in the results and the "latest" split for each eval):
```python
{
"all": {
"em": 0.0019924496644295304,
"em_stderr": 0.00045666764626669994,
"f1": 0.045740352348993464,
"f1_stderr": 0.001213536763017523,
"acc": 0.2659515202794684,
"acc_stderr": 0.007549145093989003
},
"harness|drop|3": {
"em": 0.0019924496644295304,
"em_stderr": 0.00045666764626669994,
"f1": 0.045740352348993464,
"f1_stderr": 0.001213536763017523
},
"harness|gsm8k|5": {
"acc": 0.001516300227445034,
"acc_stderr": 0.0010717793485492606
},
"harness|winogrande|5": {
"acc": 0.5303867403314917,
"acc_stderr": 0.014026510839428746
}
}
```
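In the snippet above, the top-level "all" block aggregates the per-task metrics: its "acc", for instance, matches the unweighted mean of the gsm8k and winogrande accuracies. A small sketch of that aggregation (an observation from these numbers, not the harness's documented formula):

```python
# Per-task accuracies copied from the results JSON above
task_acc = {
    "harness|gsm8k|5": 0.001516300227445034,
    "harness|winogrande|5": 0.5303867403314917,
}

# "all" accuracy appears to be the unweighted mean over tasks reporting `acc`
all_acc = sum(task_acc.values()) / len(task_acc)
print(all_acc)  # close to the 0.2659515202794684 reported above
```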
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
vincenttttt/department_college_ForFineTune | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
dataset_info:
features:
- name: question
dtype: string
- name: answer
dtype: string
- name: text
dtype: string
splits:
- name: train
num_bytes: 1719829
num_examples: 3673
download_size: 312305
dataset_size: 1719829
---
# Dataset Card for "department_college_ForFineTune"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
roszcz/ecg-segmentation-ltafdb | ---
dataset_info:
features:
- name: record_id
dtype: string
- name: signal
dtype:
array2_d:
shape:
- 2
- 1000
dtype: float32
- name: mask
dtype:
array2_d:
shape:
- 1
- 1000
dtype: int8
splits:
- name: train
num_bytes: 6591714200
num_examples: 730278
- name: validation
num_bytes: 755744025
num_examples: 83724
- name: test
num_bytes: 807009592
num_examples: 89407
download_size: 2229542434
dataset_size: 8154467817
---
# Dataset Card for "ecg-segmentation-ltafdb"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
soarescmsa/capes | ---
annotations_creators:
- found
language_creators:
- found
language:
- en
- pt
license:
- unknown
multilinguality:
- multilingual
size_categories:
- 1M<n<10M
source_datasets:
- original
task_categories:
- translation
task_ids: []
paperswithcode_id: capes
pretty_name: CAPES
tags:
- dissertation-abstracts-translation
- theses-translation
dataset_info:
config_name: en-pt
features:
- name: translation
dtype:
translation:
languages:
- en
- pt
splits:
- name: train
num_bytes: 472483436
num_examples: 1157610
download_size: 285468020
dataset_size: 472483436
configs:
- config_name: en-pt
data_files:
- split: train
path: en-pt/train-*
default: true
---
# Dataset Card for CAPES
## Table of Contents
- [Dataset Description](#dataset-description)
- [Dataset Summary](#dataset-summary)
- [Supported Tasks and Leaderboards](#supported-tasks-and-leaderboards)
- [Languages](#languages)
- [Dataset Structure](#dataset-structure)
- [Data Instances](#data-instances)
- [Data Fields](#data-fields)
- [Data Splits](#data-splits)
- [Dataset Creation](#dataset-creation)
- [Curation Rationale](#curation-rationale)
- [Source Data](#source-data)
- [Annotations](#annotations)
- [Personal and Sensitive Information](#personal-and-sensitive-information)
- [Considerations for Using the Data](#considerations-for-using-the-data)
- [Social Impact of Dataset](#social-impact-of-dataset)
- [Discussion of Biases](#discussion-of-biases)
- [Other Known Limitations](#other-known-limitations)
- [Additional Information](#additional-information)
- [Dataset Curators](#dataset-curators)
- [Licensing Information](#licensing-information)
- [Citation Information](#citation-information)
- [Contributions](#contributions)
## Dataset Description
- **Homepage:** [Parallel corpus of theses and dissertation abstracts in Portuguese and English from CAPES](https://sites.google.com/view/felipe-soares/datasets)
- **Repository:**
- **Paper:** [A Parallel Corpus of Theses and Dissertations Abstracts](https://arxiv.org/abs/1905.01715)
- **Leaderboard:**
- **Point of Contact:**
### Dataset Summary
A parallel corpus of thesis and dissertation abstracts in English and Portuguese was collected from the
website of CAPES (Coordenação de Aperfeiçoamento de Pessoal de Nível Superior), Brazil.
The corpus is sentence-aligned for all language pairs. Approximately 240,000 documents were
collected and aligned using the Hunalign algorithm.
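Each record stores one aligned sentence pair in a `translation` feature keyed by language code, as declared in the YAML header above. A minimal sketch of that layout (the sentence text here is hypothetical, for illustration only):

```python
# Hypothetical record showing the shape of the en-pt `translation` feature
example = {
    "translation": {
        "en": "This thesis investigates the automatic translation of academic abstracts.",
        "pt": "Esta tese investiga a tradução automática de resumos acadêmicos.",
    }
}

# A pair unpacks into source and target sentences for translation training
source, target = example["translation"]["en"], example["translation"]["pt"]
print(source)
print(target)
```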
### Supported Tasks and Leaderboards
The underlying task is machine translation.
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
```
@inproceedings{soares2018parallel,
title={A Parallel Corpus of Theses and Dissertations Abstracts},
author={Soares, Felipe and Yamashita, Gabrielli Harumi and Anzanello, Michel Jose},
booktitle={International Conference on Computational Processing of the Portuguese Language},
pages={345--352},
year={2018},
organization={Springer}
}
```
### Contributions
Thanks to [@patil-suraj](https://github.com/patil-suraj) for adding this dataset. |
mccoole/flowers-demo | ---
dataset_info:
features:
- name: image
dtype: image
splits:
- name: train
num_bytes: 347100141.78
num_examples: 8189
download_size: 346653261
dataset_size: 347100141.78
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
arieg/cluster04_large_10 | ---
dataset_info:
features:
- name: image
dtype: image
- name: label
dtype:
class_label:
names:
'0': '000667'
'1': 001039
'2': 001083
'3': '001663'
'4': 001930
'5': '003766'
'6': 003840
'7': 009511
'8': '010676'
'9': '011334'
'10': 012349
'11': '012513'
'12': 013596
'13': '013666'
'14': 024368
'15': '025324'
'16': '026674'
'17': 028070
'18': 028072
'19': '032327'
'20': '037727'
'21': 039484
'22': 041095
'23': '042761'
'24': 043796
'25': 043886
'26': 044918
'27': '046024'
'28': 047895
'29': 048439
'30': '052631'
'31': 053592
'32': 058333
'33': 061493
'34': '062337'
'35': '062445'
'36': 062458
'37': '063043'
'38': '063045'
'39': '063117'
'40': 064659
'41': '067163'
'42': 069202
'43': '072456'
'44': '073342'
'45': '073343'
'46': '073371'
'47': 073486
'48': 073921
'49': 074669
'50': 080516
'51': 080517
'52': 085787
'53': 085791
'54': 086037
'55': 088870
'56': 090639
'57': 091083
'58': 091158
'59': 091159
'60': 093867
'61': 094348
'62': 096408
'63': 099419
'64': '105722'
'65': '106953'
'66': '107188'
'67': '107391'
'68': '107616'
'69': '110637'
'70': '110983'
'71': '111335'
'72': '111376'
'73': '111391'
'74': '111397'
'75': '112734'
'76': '112767'
'77': '114415'
'78': '119027'
'79': '120296'
'80': '120467'
'81': '122081'
'82': '122087'
'83': '122088'
'84': '122472'
'85': '122630'
'86': '125774'
'87': '126224'
'88': '126608'
'89': '129088'
'90': '129094'
'91': '129095'
'92': '129096'
'93': '129097'
'94': '131448'
'95': '131451'
'96': '131452'
'97': '131453'
'98': '131552'
'99': '133023'
'100': '133025'
'101': '133027'
'102': '133275'
'103': '139772'
'104': '140576'
'105': '141594'
'106': '142402'
'107': '143098'
'108': '143989'
'109': '143995'
'110': '145761'
'111': '148536'
splits:
- name: train
num_bytes: 57038864.96
num_examples: 1120
download_size: 57043791
dataset_size: 57038864.96
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
bz-arc13/evol_instruct_zh_gpt4 | ---
dataset_info:
features:
- name: conversation
list:
- name: content
dtype: string
- name: role
dtype: string
splits:
- name: train
num_bytes: 115159436.75138572
num_examples: 68937
download_size: 67843518
dataset_size: 115159436.75138572
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
open-llm-leaderboard/details_0-hero__Matter-0.1-7B-DPO-preview | ---
pretty_name: Evaluation run of 0-hero/Matter-0.1-7B-DPO-preview
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [0-hero/Matter-0.1-7B-DPO-preview](https://huggingface.co/0-hero/Matter-0.1-7B-DPO-preview)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_0-hero__Matter-0.1-7B-DPO-preview\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-03-23T05:48:47.699955](https://huggingface.co/datasets/open-llm-leaderboard/details_0-hero__Matter-0.1-7B-DPO-preview/blob/main/results_2024-03-23T05-48-47.699955.json)(note\
\ that there might be results for other tasks in the repo if successive evals didn't\
\ cover the same tasks. You can find each one in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6279706308113411,\n\
\ \"acc_stderr\": 0.03270077035026703,\n \"acc_norm\": 0.630444370224374,\n\
\ \"acc_norm_stderr\": 0.033360535148631236,\n \"mc1\": 0.3157894736842105,\n\
\ \"mc1_stderr\": 0.01627228795791692,\n \"mc2\": 0.4579010104878946,\n\
\ \"mc2_stderr\": 0.01473730415975875\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.5853242320819113,\n \"acc_stderr\": 0.014397070564409174,\n\
\ \"acc_norm\": 0.6271331058020477,\n \"acc_norm_stderr\": 0.014131176760131165\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6314479187412866,\n\
\ \"acc_stderr\": 0.004814261966376849,\n \"acc_norm\": 0.8299143596893049,\n\
\ \"acc_norm_stderr\": 0.003749401775087307\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.26,\n \"acc_stderr\": 0.0440844002276808,\n \
\ \"acc_norm\": 0.26,\n \"acc_norm_stderr\": 0.0440844002276808\n },\n\
\ \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.5925925925925926,\n\
\ \"acc_stderr\": 0.042446332383532265,\n \"acc_norm\": 0.5925925925925926,\n\
\ \"acc_norm_stderr\": 0.042446332383532265\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.6973684210526315,\n \"acc_stderr\": 0.037385206761196686,\n\
\ \"acc_norm\": 0.6973684210526315,\n \"acc_norm_stderr\": 0.037385206761196686\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.54,\n\
\ \"acc_stderr\": 0.05009082659620332,\n \"acc_norm\": 0.54,\n \
\ \"acc_norm_stderr\": 0.05009082659620332\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.6867924528301886,\n \"acc_stderr\": 0.028544793319055326,\n\
\ \"acc_norm\": 0.6867924528301886,\n \"acc_norm_stderr\": 0.028544793319055326\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7083333333333334,\n\
\ \"acc_stderr\": 0.038009680605548594,\n \"acc_norm\": 0.7083333333333334,\n\
\ \"acc_norm_stderr\": 0.038009680605548594\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.43,\n \"acc_stderr\": 0.049756985195624284,\n \
\ \"acc_norm\": 0.43,\n \"acc_norm_stderr\": 0.049756985195624284\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"\
acc\": 0.5,\n \"acc_stderr\": 0.050251890762960605,\n \"acc_norm\"\
: 0.5,\n \"acc_norm_stderr\": 0.050251890762960605\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.35,\n \"acc_stderr\": 0.047937248544110196,\n \
\ \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.047937248544110196\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.5953757225433526,\n\
\ \"acc_stderr\": 0.03742461193887248,\n \"acc_norm\": 0.5953757225433526,\n\
\ \"acc_norm_stderr\": 0.03742461193887248\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.3627450980392157,\n \"acc_stderr\": 0.047840607041056527,\n\
\ \"acc_norm\": 0.3627450980392157,\n \"acc_norm_stderr\": 0.047840607041056527\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.76,\n \"acc_stderr\": 0.04292346959909283,\n \"acc_norm\": 0.76,\n\
\ \"acc_norm_stderr\": 0.04292346959909283\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.5276595744680851,\n \"acc_stderr\": 0.03263597118409769,\n\
\ \"acc_norm\": 0.5276595744680851,\n \"acc_norm_stderr\": 0.03263597118409769\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.5087719298245614,\n\
\ \"acc_stderr\": 0.04702880432049615,\n \"acc_norm\": 0.5087719298245614,\n\
\ \"acc_norm_stderr\": 0.04702880432049615\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.6,\n \"acc_stderr\": 0.040824829046386284,\n \
\ \"acc_norm\": 0.6,\n \"acc_norm_stderr\": 0.040824829046386284\n \
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.3862433862433862,\n \"acc_stderr\": 0.02507598176760168,\n \"\
acc_norm\": 0.3862433862433862,\n \"acc_norm_stderr\": 0.02507598176760168\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.38095238095238093,\n\
\ \"acc_stderr\": 0.043435254289490965,\n \"acc_norm\": 0.38095238095238093,\n\
\ \"acc_norm_stderr\": 0.043435254289490965\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.45,\n \"acc_stderr\": 0.049999999999999996,\n \
\ \"acc_norm\": 0.45,\n \"acc_norm_stderr\": 0.049999999999999996\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\"\
: 0.7483870967741936,\n \"acc_stderr\": 0.02468597928623995,\n \"\
acc_norm\": 0.7483870967741936,\n \"acc_norm_stderr\": 0.02468597928623995\n\
\ },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\"\
: 0.5123152709359606,\n \"acc_stderr\": 0.035169204442208966,\n \"\
acc_norm\": 0.5123152709359606,\n \"acc_norm_stderr\": 0.035169204442208966\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.69,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\"\
: 0.69,\n \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7575757575757576,\n \"acc_stderr\": 0.03346409881055953,\n\
\ \"acc_norm\": 0.7575757575757576,\n \"acc_norm_stderr\": 0.03346409881055953\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.803030303030303,\n \"acc_stderr\": 0.028335609732463362,\n \"\
acc_norm\": 0.803030303030303,\n \"acc_norm_stderr\": 0.028335609732463362\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.8549222797927462,\n \"acc_stderr\": 0.025416343096306433,\n\
\ \"acc_norm\": 0.8549222797927462,\n \"acc_norm_stderr\": 0.025416343096306433\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.6230769230769231,\n \"acc_stderr\": 0.024570975364225995,\n\
\ \"acc_norm\": 0.6230769230769231,\n \"acc_norm_stderr\": 0.024570975364225995\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.3333333333333333,\n \"acc_stderr\": 0.0287420409039485,\n \
\ \"acc_norm\": 0.3333333333333333,\n \"acc_norm_stderr\": 0.0287420409039485\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.634453781512605,\n \"acc_stderr\": 0.031282177063684614,\n \
\ \"acc_norm\": 0.634453781512605,\n \"acc_norm_stderr\": 0.031282177063684614\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.423841059602649,\n \"acc_stderr\": 0.04034846678603396,\n \"acc_norm\"\
: 0.423841059602649,\n \"acc_norm_stderr\": 0.04034846678603396\n },\n\
\ \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.8293577981651377,\n\
\ \"acc_stderr\": 0.016129271025099853,\n \"acc_norm\": 0.8293577981651377,\n\
\ \"acc_norm_stderr\": 0.016129271025099853\n },\n \"harness|hendrycksTest-high_school_statistics|5\"\
: {\n \"acc\": 0.49074074074074076,\n \"acc_stderr\": 0.034093869469927006,\n\
\ \"acc_norm\": 0.49074074074074076,\n \"acc_norm_stderr\": 0.034093869469927006\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.7892156862745098,\n \"acc_stderr\": 0.028626547912437406,\n \"\
acc_norm\": 0.7892156862745098,\n \"acc_norm_stderr\": 0.028626547912437406\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.7763713080168776,\n \"acc_stderr\": 0.027123298205229966,\n \
\ \"acc_norm\": 0.7763713080168776,\n \"acc_norm_stderr\": 0.027123298205229966\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.672645739910314,\n\
\ \"acc_stderr\": 0.03149384670994131,\n \"acc_norm\": 0.672645739910314,\n\
\ \"acc_norm_stderr\": 0.03149384670994131\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.7862595419847328,\n \"acc_stderr\": 0.0359546161177469,\n\
\ \"acc_norm\": 0.7862595419847328,\n \"acc_norm_stderr\": 0.0359546161177469\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.768595041322314,\n \"acc_stderr\": 0.03849856098794088,\n \"acc_norm\"\
: 0.768595041322314,\n \"acc_norm_stderr\": 0.03849856098794088\n },\n\
\ \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7685185185185185,\n\
\ \"acc_stderr\": 0.04077494709252627,\n \"acc_norm\": 0.7685185185185185,\n\
\ \"acc_norm_stderr\": 0.04077494709252627\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7484662576687117,\n \"acc_stderr\": 0.034089978868575295,\n\
\ \"acc_norm\": 0.7484662576687117,\n \"acc_norm_stderr\": 0.034089978868575295\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.45535714285714285,\n\
\ \"acc_stderr\": 0.047268355537191,\n \"acc_norm\": 0.45535714285714285,\n\
\ \"acc_norm_stderr\": 0.047268355537191\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7961165048543689,\n \"acc_stderr\": 0.039891398595317706,\n\
\ \"acc_norm\": 0.7961165048543689,\n \"acc_norm_stderr\": 0.039891398595317706\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8504273504273504,\n\
\ \"acc_stderr\": 0.023365051491753715,\n \"acc_norm\": 0.8504273504273504,\n\
\ \"acc_norm_stderr\": 0.023365051491753715\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.72,\n \"acc_stderr\": 0.045126085985421276,\n \
\ \"acc_norm\": 0.72,\n \"acc_norm_stderr\": 0.045126085985421276\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8148148148148148,\n\
\ \"acc_stderr\": 0.013890862162876163,\n \"acc_norm\": 0.8148148148148148,\n\
\ \"acc_norm_stderr\": 0.013890862162876163\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.6676300578034682,\n \"acc_stderr\": 0.02536116874968822,\n\
\ \"acc_norm\": 0.6676300578034682,\n \"acc_norm_stderr\": 0.02536116874968822\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.4145251396648045,\n\
\ \"acc_stderr\": 0.016476342210253996,\n \"acc_norm\": 0.4145251396648045,\n\
\ \"acc_norm_stderr\": 0.016476342210253996\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.7124183006535948,\n \"acc_stderr\": 0.025917806117147158,\n\
\ \"acc_norm\": 0.7124183006535948,\n \"acc_norm_stderr\": 0.025917806117147158\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6945337620578779,\n\
\ \"acc_stderr\": 0.02616058445014045,\n \"acc_norm\": 0.6945337620578779,\n\
\ \"acc_norm_stderr\": 0.02616058445014045\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.7345679012345679,\n \"acc_stderr\": 0.024569223600460845,\n\
\ \"acc_norm\": 0.7345679012345679,\n \"acc_norm_stderr\": 0.024569223600460845\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.48936170212765956,\n \"acc_stderr\": 0.029820747191422473,\n \
\ \"acc_norm\": 0.48936170212765956,\n \"acc_norm_stderr\": 0.029820747191422473\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4530638852672751,\n\
\ \"acc_stderr\": 0.012713845972358981,\n \"acc_norm\": 0.4530638852672751,\n\
\ \"acc_norm_stderr\": 0.012713845972358981\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.6397058823529411,\n \"acc_stderr\": 0.029163128570670733,\n\
\ \"acc_norm\": 0.6397058823529411,\n \"acc_norm_stderr\": 0.029163128570670733\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6421568627450981,\n \"acc_stderr\": 0.019393058402355435,\n \
\ \"acc_norm\": 0.6421568627450981,\n \"acc_norm_stderr\": 0.019393058402355435\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6363636363636364,\n\
\ \"acc_stderr\": 0.04607582090719976,\n \"acc_norm\": 0.6363636363636364,\n\
\ \"acc_norm_stderr\": 0.04607582090719976\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.689795918367347,\n \"acc_stderr\": 0.029613459872484378,\n\
\ \"acc_norm\": 0.689795918367347,\n \"acc_norm_stderr\": 0.029613459872484378\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8407960199004975,\n\
\ \"acc_stderr\": 0.025870646766169136,\n \"acc_norm\": 0.8407960199004975,\n\
\ \"acc_norm_stderr\": 0.025870646766169136\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.82,\n \"acc_stderr\": 0.038612291966536955,\n \
\ \"acc_norm\": 0.82,\n \"acc_norm_stderr\": 0.038612291966536955\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5301204819277109,\n\
\ \"acc_stderr\": 0.03885425420866767,\n \"acc_norm\": 0.5301204819277109,\n\
\ \"acc_norm_stderr\": 0.03885425420866767\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.7894736842105263,\n \"acc_stderr\": 0.031267817146631786,\n\
\ \"acc_norm\": 0.7894736842105263,\n \"acc_norm_stderr\": 0.031267817146631786\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.3157894736842105,\n\
\ \"mc1_stderr\": 0.01627228795791692,\n \"mc2\": 0.4579010104878946,\n\
\ \"mc2_stderr\": 0.01473730415975875\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7884767166535123,\n \"acc_stderr\": 0.01147774768422319\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.5617892342683851,\n \
\ \"acc_stderr\": 0.013666915917255067\n }\n}\n```"
repo_url: https://huggingface.co/Azure99/blossom-v5-14b
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_03_23T05_48_47.699955
path:
- '**/details_harness|arc:challenge|25_2024-03-23T05-48-47.699955.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-03-23T05-48-47.699955.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_03_23T05_48_47.699955
path:
- '**/details_harness|gsm8k|5_2024-03-23T05-48-47.699955.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-03-23T05-48-47.699955.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_03_23T05_48_47.699955
path:
- '**/details_harness|hellaswag|10_2024-03-23T05-48-47.699955.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-03-23T05-48-47.699955.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_03_23T05_48_47.699955
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-23T05-48-47.699955.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-23T05-48-47.699955.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-23T05-48-47.699955.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-23T05-48-47.699955.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-23T05-48-47.699955.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-23T05-48-47.699955.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-23T05-48-47.699955.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-23T05-48-47.699955.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-23T05-48-47.699955.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-23T05-48-47.699955.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-23T05-48-47.699955.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-23T05-48-47.699955.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-23T05-48-47.699955.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-23T05-48-47.699955.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-23T05-48-47.699955.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-23T05-48-47.699955.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-23T05-48-47.699955.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-23T05-48-47.699955.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-23T05-48-47.699955.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-23T05-48-47.699955.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-23T05-48-47.699955.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-23T05-48-47.699955.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-23T05-48-47.699955.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-23T05-48-47.699955.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-23T05-48-47.699955.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-23T05-48-47.699955.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-23T05-48-47.699955.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-23T05-48-47.699955.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-23T05-48-47.699955.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-23T05-48-47.699955.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-23T05-48-47.699955.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-23T05-48-47.699955.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-23T05-48-47.699955.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-23T05-48-47.699955.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-23T05-48-47.699955.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-23T05-48-47.699955.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-23T05-48-47.699955.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-23T05-48-47.699955.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-23T05-48-47.699955.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-23T05-48-47.699955.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-23T05-48-47.699955.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-23T05-48-47.699955.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-23T05-48-47.699955.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-23T05-48-47.699955.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-23T05-48-47.699955.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-23T05-48-47.699955.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-23T05-48-47.699955.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-23T05-48-47.699955.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-23T05-48-47.699955.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-23T05-48-47.699955.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-23T05-48-47.699955.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-23T05-48-47.699955.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-23T05-48-47.699955.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-23T05-48-47.699955.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-23T05-48-47.699955.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-23T05-48-47.699955.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-23T05-48-47.699955.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-23T05-48-47.699955.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-23T05-48-47.699955.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-23T05-48-47.699955.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-23T05-48-47.699955.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-23T05-48-47.699955.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-23T05-48-47.699955.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-23T05-48-47.699955.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-23T05-48-47.699955.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-23T05-48-47.699955.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-23T05-48-47.699955.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-23T05-48-47.699955.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-23T05-48-47.699955.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-23T05-48-47.699955.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-23T05-48-47.699955.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-23T05-48-47.699955.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-23T05-48-47.699955.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-23T05-48-47.699955.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-23T05-48-47.699955.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-23T05-48-47.699955.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-23T05-48-47.699955.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-23T05-48-47.699955.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-23T05-48-47.699955.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-23T05-48-47.699955.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-23T05-48-47.699955.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-23T05-48-47.699955.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-23T05-48-47.699955.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-23T05-48-47.699955.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-23T05-48-47.699955.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-23T05-48-47.699955.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-23T05-48-47.699955.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-23T05-48-47.699955.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-23T05-48-47.699955.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-23T05-48-47.699955.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-23T05-48-47.699955.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-23T05-48-47.699955.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-23T05-48-47.699955.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-23T05-48-47.699955.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-23T05-48-47.699955.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-23T05-48-47.699955.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-23T05-48-47.699955.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-23T05-48-47.699955.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-23T05-48-47.699955.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-23T05-48-47.699955.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-23T05-48-47.699955.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-23T05-48-47.699955.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-23T05-48-47.699955.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-23T05-48-47.699955.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-23T05-48-47.699955.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-23T05-48-47.699955.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-23T05-48-47.699955.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-23T05-48-47.699955.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-23T05-48-47.699955.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-23T05-48-47.699955.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-23T05-48-47.699955.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-23T05-48-47.699955.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-23T05-48-47.699955.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-23T05-48-47.699955.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_03_23T05_48_47.699955
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-23T05-48-47.699955.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-23T05-48-47.699955.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_03_23T05_48_47.699955
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-23T05-48-47.699955.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-23T05-48-47.699955.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_03_23T05_48_47.699955
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-23T05-48-47.699955.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-23T05-48-47.699955.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_03_23T05_48_47.699955
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-23T05-48-47.699955.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-23T05-48-47.699955.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_03_23T05_48_47.699955
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-23T05-48-47.699955.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-23T05-48-47.699955.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_03_23T05_48_47.699955
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-23T05-48-47.699955.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-23T05-48-47.699955.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_03_23T05_48_47.699955
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-23T05-48-47.699955.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-23T05-48-47.699955.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_03_23T05_48_47.699955
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-23T05-48-47.699955.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-23T05-48-47.699955.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_03_23T05_48_47.699955
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-23T05-48-47.699955.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-23T05-48-47.699955.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_03_23T05_48_47.699955
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-23T05-48-47.699955.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-23T05-48-47.699955.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_03_23T05_48_47.699955
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-23T05-48-47.699955.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-23T05-48-47.699955.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_03_23T05_48_47.699955
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-23T05-48-47.699955.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-23T05-48-47.699955.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_03_23T05_48_47.699955
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-23T05-48-47.699955.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-23T05-48-47.699955.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_03_23T05_48_47.699955
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-23T05-48-47.699955.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-23T05-48-47.699955.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_03_23T05_48_47.699955
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-23T05-48-47.699955.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-23T05-48-47.699955.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_03_23T05_48_47.699955
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-23T05-48-47.699955.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-23T05-48-47.699955.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_03_23T05_48_47.699955
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-23T05-48-47.699955.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-23T05-48-47.699955.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_03_23T05_48_47.699955
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-23T05-48-47.699955.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-23T05-48-47.699955.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_03_23T05_48_47.699955
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-23T05-48-47.699955.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-23T05-48-47.699955.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_03_23T05_48_47.699955
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-23T05-48-47.699955.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-23T05-48-47.699955.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_03_23T05_48_47.699955
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-23T05-48-47.699955.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-23T05-48-47.699955.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_03_23T05_48_47.699955
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-23T05-48-47.699955.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-23T05-48-47.699955.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_03_23T05_48_47.699955
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-23T05-48-47.699955.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-23T05-48-47.699955.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_03_23T05_48_47.699955
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-23T05-48-47.699955.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-23T05-48-47.699955.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_03_23T05_48_47.699955
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-23T05-48-47.699955.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-23T05-48-47.699955.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_03_23T05_48_47.699955
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-23T05-48-47.699955.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-23T05-48-47.699955.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_03_23T05_48_47.699955
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-23T05-48-47.699955.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-23T05-48-47.699955.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_03_23T05_48_47.699955
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-23T05-48-47.699955.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-23T05-48-47.699955.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_03_23T05_48_47.699955
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-23T05-48-47.699955.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-23T05-48-47.699955.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_03_23T05_48_47.699955
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-23T05-48-47.699955.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-23T05-48-47.699955.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_03_23T05_48_47.699955
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-23T05-48-47.699955.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-23T05-48-47.699955.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_03_23T05_48_47.699955
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-23T05-48-47.699955.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-23T05-48-47.699955.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_03_23T05_48_47.699955
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-23T05-48-47.699955.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-23T05-48-47.699955.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_03_23T05_48_47.699955
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-23T05-48-47.699955.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-23T05-48-47.699955.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_03_23T05_48_47.699955
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-23T05-48-47.699955.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-23T05-48-47.699955.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_03_23T05_48_47.699955
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-23T05-48-47.699955.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-23T05-48-47.699955.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_03_23T05_48_47.699955
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-23T05-48-47.699955.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-23T05-48-47.699955.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_03_23T05_48_47.699955
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-23T05-48-47.699955.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-23T05-48-47.699955.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_03_23T05_48_47.699955
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-23T05-48-47.699955.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-23T05-48-47.699955.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_03_23T05_48_47.699955
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-23T05-48-47.699955.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-23T05-48-47.699955.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_03_23T05_48_47.699955
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-23T05-48-47.699955.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-23T05-48-47.699955.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_03_23T05_48_47.699955
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-23T05-48-47.699955.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-23T05-48-47.699955.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_03_23T05_48_47.699955
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-23T05-48-47.699955.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-23T05-48-47.699955.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_03_23T05_48_47.699955
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-23T05-48-47.699955.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-23T05-48-47.699955.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_03_23T05_48_47.699955
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-23T05-48-47.699955.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-23T05-48-47.699955.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_03_23T05_48_47.699955
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-23T05-48-47.699955.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-23T05-48-47.699955.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_03_23T05_48_47.699955
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-23T05-48-47.699955.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-23T05-48-47.699955.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_03_23T05_48_47.699955
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-23T05-48-47.699955.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-23T05-48-47.699955.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_03_23T05_48_47.699955
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-23T05-48-47.699955.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-23T05-48-47.699955.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_03_23T05_48_47.699955
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-23T05-48-47.699955.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-23T05-48-47.699955.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_03_23T05_48_47.699955
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-23T05-48-47.699955.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-23T05-48-47.699955.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_03_23T05_48_47.699955
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-23T05-48-47.699955.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-23T05-48-47.699955.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_03_23T05_48_47.699955
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-23T05-48-47.699955.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-23T05-48-47.699955.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_03_23T05_48_47.699955
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-23T05-48-47.699955.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-23T05-48-47.699955.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_03_23T05_48_47.699955
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-23T05-48-47.699955.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-23T05-48-47.699955.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_03_23T05_48_47.699955
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-23T05-48-47.699955.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-23T05-48-47.699955.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_03_23T05_48_47.699955
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-23T05-48-47.699955.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-23T05-48-47.699955.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_03_23T05_48_47.699955
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-23T05-48-47.699955.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-23T05-48-47.699955.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_03_23T05_48_47.699955
path:
- '**/details_harness|winogrande|5_2024-03-23T05-48-47.699955.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-03-23T05-48-47.699955.parquet'
- config_name: results
data_files:
- split: 2024_03_23T05_48_47.699955
path:
- results_2024-03-23T05-48-47.699955.parquet
- split: latest
path:
- results_2024-03-23T05-48-47.699955.parquet
---
# Dataset Card for Evaluation run of 0-hero/Matter-0.1-7B-DPO-preview
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [0-hero/Matter-0.1-7B-DPO-preview](https://huggingface.co/0-hero/Matter-0.1-7B-DPO-preview) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_0-hero__Matter-0.1-7B-DPO-preview",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2024-03-23T05:48:47.699955](https://huggingface.co/datasets/open-llm-leaderboard/details_0-hero__Matter-0.1-7B-DPO-preview/blob/main/results_2024-03-23T05-48-47.699955.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each of them in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6279706308113411,
"acc_stderr": 0.03270077035026703,
"acc_norm": 0.630444370224374,
"acc_norm_stderr": 0.033360535148631236,
"mc1": 0.3157894736842105,
"mc1_stderr": 0.01627228795791692,
"mc2": 0.4579010104878946,
"mc2_stderr": 0.01473730415975875
},
"harness|arc:challenge|25": {
"acc": 0.5853242320819113,
"acc_stderr": 0.014397070564409174,
"acc_norm": 0.6271331058020477,
"acc_norm_stderr": 0.014131176760131165
},
"harness|hellaswag|10": {
"acc": 0.6314479187412866,
"acc_stderr": 0.004814261966376849,
"acc_norm": 0.8299143596893049,
"acc_norm_stderr": 0.003749401775087307
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.26,
"acc_stderr": 0.0440844002276808,
"acc_norm": 0.26,
"acc_norm_stderr": 0.0440844002276808
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.5925925925925926,
"acc_stderr": 0.042446332383532265,
"acc_norm": 0.5925925925925926,
"acc_norm_stderr": 0.042446332383532265
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6973684210526315,
"acc_stderr": 0.037385206761196686,
"acc_norm": 0.6973684210526315,
"acc_norm_stderr": 0.037385206761196686
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.54,
"acc_stderr": 0.05009082659620332,
"acc_norm": 0.54,
"acc_norm_stderr": 0.05009082659620332
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6867924528301886,
"acc_stderr": 0.028544793319055326,
"acc_norm": 0.6867924528301886,
"acc_norm_stderr": 0.028544793319055326
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7083333333333334,
"acc_stderr": 0.038009680605548594,
"acc_norm": 0.7083333333333334,
"acc_norm_stderr": 0.038009680605548594
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.43,
"acc_stderr": 0.049756985195624284,
"acc_norm": 0.43,
"acc_norm_stderr": 0.049756985195624284
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.5,
"acc_stderr": 0.050251890762960605,
"acc_norm": 0.5,
"acc_norm_stderr": 0.050251890762960605
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.35,
"acc_stderr": 0.047937248544110196,
"acc_norm": 0.35,
"acc_norm_stderr": 0.047937248544110196
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.5953757225433526,
"acc_stderr": 0.03742461193887248,
"acc_norm": 0.5953757225433526,
"acc_norm_stderr": 0.03742461193887248
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.3627450980392157,
"acc_stderr": 0.047840607041056527,
"acc_norm": 0.3627450980392157,
"acc_norm_stderr": 0.047840607041056527
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.76,
"acc_stderr": 0.04292346959909283,
"acc_norm": 0.76,
"acc_norm_stderr": 0.04292346959909283
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5276595744680851,
"acc_stderr": 0.03263597118409769,
"acc_norm": 0.5276595744680851,
"acc_norm_stderr": 0.03263597118409769
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.5087719298245614,
"acc_stderr": 0.04702880432049615,
"acc_norm": 0.5087719298245614,
"acc_norm_stderr": 0.04702880432049615
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.6,
"acc_stderr": 0.040824829046386284,
"acc_norm": 0.6,
"acc_norm_stderr": 0.040824829046386284
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.3862433862433862,
"acc_stderr": 0.02507598176760168,
"acc_norm": 0.3862433862433862,
"acc_norm_stderr": 0.02507598176760168
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.38095238095238093,
"acc_stderr": 0.043435254289490965,
"acc_norm": 0.38095238095238093,
"acc_norm_stderr": 0.043435254289490965
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.45,
"acc_stderr": 0.049999999999999996,
"acc_norm": 0.45,
"acc_norm_stderr": 0.049999999999999996
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7483870967741936,
"acc_stderr": 0.02468597928623995,
"acc_norm": 0.7483870967741936,
"acc_norm_stderr": 0.02468597928623995
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5123152709359606,
"acc_stderr": 0.035169204442208966,
"acc_norm": 0.5123152709359606,
"acc_norm_stderr": 0.035169204442208966
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.69,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.69,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7575757575757576,
"acc_stderr": 0.03346409881055953,
"acc_norm": 0.7575757575757576,
"acc_norm_stderr": 0.03346409881055953
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.803030303030303,
"acc_stderr": 0.028335609732463362,
"acc_norm": 0.803030303030303,
"acc_norm_stderr": 0.028335609732463362
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8549222797927462,
"acc_stderr": 0.025416343096306433,
"acc_norm": 0.8549222797927462,
"acc_norm_stderr": 0.025416343096306433
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6230769230769231,
"acc_stderr": 0.024570975364225995,
"acc_norm": 0.6230769230769231,
"acc_norm_stderr": 0.024570975364225995
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.3333333333333333,
"acc_stderr": 0.0287420409039485,
"acc_norm": 0.3333333333333333,
"acc_norm_stderr": 0.0287420409039485
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.634453781512605,
"acc_stderr": 0.031282177063684614,
"acc_norm": 0.634453781512605,
"acc_norm_stderr": 0.031282177063684614
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.423841059602649,
"acc_stderr": 0.04034846678603396,
"acc_norm": 0.423841059602649,
"acc_norm_stderr": 0.04034846678603396
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8293577981651377,
"acc_stderr": 0.016129271025099853,
"acc_norm": 0.8293577981651377,
"acc_norm_stderr": 0.016129271025099853
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.49074074074074076,
"acc_stderr": 0.034093869469927006,
"acc_norm": 0.49074074074074076,
"acc_norm_stderr": 0.034093869469927006
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7892156862745098,
"acc_stderr": 0.028626547912437406,
"acc_norm": 0.7892156862745098,
"acc_norm_stderr": 0.028626547912437406
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7763713080168776,
"acc_stderr": 0.027123298205229966,
"acc_norm": 0.7763713080168776,
"acc_norm_stderr": 0.027123298205229966
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.672645739910314,
"acc_stderr": 0.03149384670994131,
"acc_norm": 0.672645739910314,
"acc_norm_stderr": 0.03149384670994131
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7862595419847328,
"acc_stderr": 0.0359546161177469,
"acc_norm": 0.7862595419847328,
"acc_norm_stderr": 0.0359546161177469
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.768595041322314,
"acc_stderr": 0.03849856098794088,
"acc_norm": 0.768595041322314,
"acc_norm_stderr": 0.03849856098794088
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7685185185185185,
"acc_stderr": 0.04077494709252627,
"acc_norm": 0.7685185185185185,
"acc_norm_stderr": 0.04077494709252627
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7484662576687117,
"acc_stderr": 0.034089978868575295,
"acc_norm": 0.7484662576687117,
"acc_norm_stderr": 0.034089978868575295
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.45535714285714285,
"acc_stderr": 0.047268355537191,
"acc_norm": 0.45535714285714285,
"acc_norm_stderr": 0.047268355537191
},
"harness|hendrycksTest-management|5": {
"acc": 0.7961165048543689,
"acc_stderr": 0.039891398595317706,
"acc_norm": 0.7961165048543689,
"acc_norm_stderr": 0.039891398595317706
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8504273504273504,
"acc_stderr": 0.023365051491753715,
"acc_norm": 0.8504273504273504,
"acc_norm_stderr": 0.023365051491753715
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.72,
"acc_stderr": 0.045126085985421276,
"acc_norm": 0.72,
"acc_norm_stderr": 0.045126085985421276
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8148148148148148,
"acc_stderr": 0.013890862162876163,
"acc_norm": 0.8148148148148148,
"acc_norm_stderr": 0.013890862162876163
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.6676300578034682,
"acc_stderr": 0.02536116874968822,
"acc_norm": 0.6676300578034682,
"acc_norm_stderr": 0.02536116874968822
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.4145251396648045,
"acc_stderr": 0.016476342210253996,
"acc_norm": 0.4145251396648045,
"acc_norm_stderr": 0.016476342210253996
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7124183006535948,
"acc_stderr": 0.025917806117147158,
"acc_norm": 0.7124183006535948,
"acc_norm_stderr": 0.025917806117147158
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6945337620578779,
"acc_stderr": 0.02616058445014045,
"acc_norm": 0.6945337620578779,
"acc_norm_stderr": 0.02616058445014045
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7345679012345679,
"acc_stderr": 0.024569223600460845,
"acc_norm": 0.7345679012345679,
"acc_norm_stderr": 0.024569223600460845
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.48936170212765956,
"acc_stderr": 0.029820747191422473,
"acc_norm": 0.48936170212765956,
"acc_norm_stderr": 0.029820747191422473
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4530638852672751,
"acc_stderr": 0.012713845972358981,
"acc_norm": 0.4530638852672751,
"acc_norm_stderr": 0.012713845972358981
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6397058823529411,
"acc_stderr": 0.029163128570670733,
"acc_norm": 0.6397058823529411,
"acc_norm_stderr": 0.029163128570670733
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6421568627450981,
"acc_stderr": 0.019393058402355435,
"acc_norm": 0.6421568627450981,
"acc_norm_stderr": 0.019393058402355435
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6363636363636364,
"acc_stderr": 0.04607582090719976,
"acc_norm": 0.6363636363636364,
"acc_norm_stderr": 0.04607582090719976
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.689795918367347,
"acc_stderr": 0.029613459872484378,
"acc_norm": 0.689795918367347,
"acc_norm_stderr": 0.029613459872484378
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8407960199004975,
"acc_stderr": 0.025870646766169136,
"acc_norm": 0.8407960199004975,
"acc_norm_stderr": 0.025870646766169136
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.82,
"acc_stderr": 0.038612291966536955,
"acc_norm": 0.82,
"acc_norm_stderr": 0.038612291966536955
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5301204819277109,
"acc_stderr": 0.03885425420866767,
"acc_norm": 0.5301204819277109,
"acc_norm_stderr": 0.03885425420866767
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.7894736842105263,
"acc_stderr": 0.031267817146631786,
"acc_norm": 0.7894736842105263,
"acc_norm_stderr": 0.031267817146631786
},
"harness|truthfulqa:mc|0": {
"mc1": 0.3157894736842105,
"mc1_stderr": 0.01627228795791692,
"mc2": 0.4579010104878946,
"mc2_stderr": 0.01473730415975875
},
"harness|winogrande|5": {
"acc": 0.7884767166535123,
"acc_stderr": 0.01147774768422319
},
"harness|gsm8k|5": {
"acc": 0.5617892342683851,
"acc_stderr": 0.013666915917255067
}
}
```
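The top-level `all` block aggregates the per-task scores (for `acc`, approximately the unweighted mean across tasks). As a rough illustration of how such an average is recomputed from the per-task entries, here is a minimal sketch over a toy two-task subset (the real results file contains all 63 tasks, so the value printed here will not match the `all` block above):

```python
import json

# Toy two-task subset of the per-task entries above; illustrative only.
results = json.loads("""
{
  "harness|hendrycksTest-medical_genetics|5": {"acc": 0.72},
  "harness|hendrycksTest-virology|5": {"acc": 0.5301204819277109}
}
""")

# Unweighted (macro) average of per-task accuracies.
accs = [task["acc"] for task in results.values()]
macro_avg = sum(accs) / len(accs)
print(round(macro_avg, 4))  # 0.6251
```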
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
triobaba/question_pair | ---
license: apache-2.0
---
|
Falah/2M_landscape_cities_SDXL_refiner_prompts | ---
dataset_info:
features:
- name: prompts
dtype: string
splits:
- name: train
num_bytes: 1212663735
num_examples: 2000000
download_size: 142133423
dataset_size: 1212663735
---
# Dataset Card for "2M_landscape_cities_SDXL_refiner_prompts"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
AdapterOcean/med_alpaca_standardized_cluster_75_alpaca | ---
dataset_info:
features:
- name: input
dtype: string
- name: output
dtype: string
splits:
- name: train
num_bytes: 4858013
num_examples: 2453
download_size: 2582264
dataset_size: 4858013
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "med_alpaca_standardized_cluster_75_alpaca"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Nexdata/205_People_Mandarin_Speech_Data_in_Noisy_Environment_by_Mobile_Phone_Guiding | ---
license: cc-by-nc-nd-4.0
---
## Description
205 People Mandarin Speech Data in Noisy Environment by Mobile Phone (Guiding). Speakers recorded their speech in Mandarin in various daily scenarios in noisy environments. The recordings cover categories like in-car scenes, smart home, and smart speech assistants. The data can be used for speech recognition, acoustic and language model training and algorithm research, machine translation corpus construction, and voiceprint recognition model training and algorithm research.
For more details, please refer to the link: https://www.nexdata.ai/dataset/192?source=Huggingface
## Format
16kHz, 16bit, uncompressed wav, mono channel
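The uncompressed format pins down the raw data rate exactly. A quick back-of-the-envelope sketch, derived from the stated spec rather than measured on the actual files:

```python
# Sanity check of the raw data rate implied by the stated format
# (16 kHz, 16-bit, mono, uncompressed WAV).

SAMPLE_RATE_HZ = 16_000
BYTES_PER_SAMPLE = 2  # 16-bit
CHANNELS = 1          # mono

bytes_per_second = SAMPLE_RATE_HZ * BYTES_PER_SAMPLE * CHANNELS
megabytes_per_hour = bytes_per_second * 3600 / 1_000_000

print(bytes_per_second)    # 32000
print(megabytes_per_hour)  # 115.2
```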
## Recording Environment
Noisy environment including subway, market, restaurant, street, airport etc.
## Recording Content
smart car; smart home; speech assistant
## Speaker
205 people, 58% of whom are male
## Device
Android mobile phone; iPhone
## Language
mandarin (without heavy local accent)
## Transcription content
text, noise symbols
## Accuracy rate
95% (the accuracy rate of noise symbols is not included)
## Application scenarios
speech recognition, voiceprint recognition
# Licensing Information
Commercial License
|
yaygomii/Tamil-Speech-Dialect-Corpus-Shuffled-Split | ---
dataset_info:
features:
- name: label
dtype: string
- name: audio
dtype: audio
- name: sentence
dtype: string
splits:
- name: train
num_bytes: 2441162650.9392853
num_examples: 8649
- name: test
num_bytes: 305392291.844953
num_examples: 1082
- name: valid
num_bytes: 305109988.75476176
num_examples: 1081
download_size: 2872546550
dataset_size: 3051664931.539
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
- split: valid
path: data/valid-*
---
|
zhangxuedong/lora_weitiao | ---
license: unknown
---
|
Yehor/ukrainian-tts-lada | ---
language:
- uk
license: apache-2.0
task_categories:
- text-to-speech
---
# 🇺🇦 Open Source Ukrainian Text-to-Speech dataset named LADA
Join the Ukrainian community - https://t.me/speech_synthesis_uk
More details about this dataset - https://github.com/egorsmkv/ukrainian-tts-datasets/tree/main/lada
# Voice LADA (female)
License: [Apache 2.0](https://github.com/egorsmkv/ukrainian-tts-datasets/blob/main/LICENSE)
**Samples are manually checked. Good samples are in the `accept` folder (10h37m), others are in `reject` (1h2m).**
Listen to [DEMO](https://huggingface.co/spaces/theodotus/ukrainian-voices) (choose "lada" in the Voice field)
## Features
- Quality: high
- Duration: 10h37m
- Audio formats: OPUS/WAV
- Text format: JSONL (a `metadata.jsonl` file)
- Frequency: 16000/22050/48000 Hz
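Since transcripts ship as a JSONL `metadata.jsonl`, each line is one standalone JSON object. A minimal parsing sketch follows; note the `file_name` and `transcription` keys below are illustrative assumptions, so check the actual file for the real field names:

```python
import json

# Illustrative metadata.jsonl content; the real file's keys may differ.
sample_jsonl = "\n".join([
    '{"file_name": "accept/0001.wav", "transcription": "Привіт, світе!"}',
    '{"file_name": "accept/0002.wav", "transcription": "Добрий день."}',
])

# One JSON object per non-empty line.
records = [json.loads(line) for line in sample_jsonl.splitlines() if line.strip()]
for rec in records:
    print(rec["file_name"], "->", rec["transcription"])
```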
## Original version
### In the `OPUS` format
- 48000 Hz: https://huggingface.co/datasets/Yehor/ukrainian-tts-lada/resolve/main/dataset_lada_ogg.zip
### In the `WAV` format
- 48000 Hz: https://huggingface.co/datasets/Yehor/ukrainian-tts-lada/resolve/main/dataset_lada_48khz.zip
- 22050 Hz: https://huggingface.co/datasets/Yehor/ukrainian-tts-lada/resolve/main/dataset_lada_22khz.zip
- 16000 Hz: https://huggingface.co/datasets/Yehor/ukrainian-tts-lada/resolve/main/dataset_lada_16khz.zip
## Trimmed version (removed silence)
Silence is removed by https://github.com/proger/uk#align-text-to-audio-and-trim-silence
### In the `WAV` format
- 48000 Hz: https://huggingface.co/datasets/Yehor/ukrainian-tts-lada/resolve/main/dataset_lada_trimmed_48khz.zip
- 22050 Hz: https://huggingface.co/datasets/Yehor/ukrainian-tts-lada/resolve/main/dataset_lada_trimmed_22khz.zip
- 16000 Hz: https://huggingface.co/datasets/Yehor/ukrainian-tts-lada/resolve/main/dataset_lada_trimmed_16khz.zip |
StringTheory69/ApiToForm | ---
license: openrail
---
|
praveensai266/orca-llama2-2k-samples | ---
dataset_info:
features:
- name: id
dtype: string
- name: system_prompt
dtype: string
- name: question
dtype: string
- name: response
dtype: string
splits:
- name: train
num_bytes: 1120566.8850139694
num_examples: 657
download_size: 800199
dataset_size: 1120566.8850139694
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
Nexdata/Spanish_Speaking_English_Speech_Data_by_Mobile_Phone | ---
YAML tags:
- copy-paste the tags obtained with the tagging app: https://github.com/huggingface/datasets-tagging
---
# Dataset Card for Nexdata/Spanish_Speaking_English_Speech_Data_by_Mobile_Phone
## Table of Contents
- [Table of Contents](#table-of-contents)
- [Dataset Description](#dataset-description)
- [Dataset Summary](#dataset-summary)
- [Supported Tasks and Leaderboards](#supported-tasks-and-leaderboards)
- [Languages](#languages)
- [Dataset Structure](#dataset-structure)
- [Data Instances](#data-instances)
- [Data Fields](#data-fields)
- [Data Splits](#data-splits)
- [Dataset Creation](#dataset-creation)
- [Curation Rationale](#curation-rationale)
- [Source Data](#source-data)
- [Annotations](#annotations)
- [Personal and Sensitive Information](#personal-and-sensitive-information)
- [Considerations for Using the Data](#considerations-for-using-the-data)
- [Social Impact of Dataset](#social-impact-of-dataset)
- [Discussion of Biases](#discussion-of-biases)
- [Other Known Limitations](#other-known-limitations)
- [Additional Information](#additional-information)
- [Dataset Curators](#dataset-curators)
- [Licensing Information](#licensing-information)
- [Citation Information](#citation-information)
- [Contributions](#contributions)
## Dataset Description
- **Homepage:** https://www.nexdata.ai/datasets/990?source=Huggingface
- **Repository:**
- **Paper:**
- **Leaderboard:**
- **Point of Contact:**
### Dataset Summary
891 native Spanish speakers participated in the recording, speaking with authentic accents. The recorded scripts were designed by linguists and cover a wide range of topics including generic, interactive, on-board, and home. The text was manually proofread with high accuracy. The recordings were made with mainstream Android and Apple phones. The dataset can be applied to automatic speech recognition and machine translation scenarios.
For more details, please refer to the link: https://www.nexdata.ai/datasets/990?source=Huggingface
### Supported Tasks and Leaderboards
automatic-speech-recognition, audio-speaker-identification: The dataset can be used to train a model for Automatic Speech Recognition (ASR).
### Languages
Spanish English
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
Commerical License: https://drive.google.com/file/d/1saDCPm74D4UWfBL17VbkTsZLGfpOQj1J/view?usp=sharing
### Citation Information
[More Information Needed]
### Contributions
|
pharaouk/mls-eng-10k-tags_tagged_10k_generated | ---
pretty_name: Annotations of 10K hours of English MLS
annotations_creators:
- expert-generated
language_creators:
- crowdsourced
- expert-generated
language:
- en
license:
- cc-by-4.0
multilinguality:
- multilingual
paperswithcode_id: multilingual-librispeech
size_categories:
- 100K<n<1M
source_datasets:
- original
task_categories:
- automatic-speech-recognition
- text-to-speech
- text-to-audio
dataset_info:
features:
- name: original_path
dtype: string
- name: begin_time
dtype: float64
- name: end_time
dtype: float64
- name: audio_duration
dtype: float64
- name: speaker_id
dtype: string
- name: book_id
dtype: string
- name: utterance_pitch_mean
dtype: float32
- name: utterance_pitch_std
dtype: float32
- name: snr
dtype: float64
- name: c50
dtype: float64
- name: speaking_rate
dtype: string
- name: phonemes
dtype: string
- name: gender
dtype: string
- name: pitch
dtype: string
- name: noise
dtype: string
- name: reverberation
dtype: string
- name: speech_monotony
dtype: string
- name: text_description
dtype: string
- name: original_text
dtype: string
- name: text
dtype: string
splits:
- name: dev
num_bytes: 4378721
num_examples: 3807
- name: test
num_bytes: 4360862
num_examples: 3769
- name: train
num_bytes: 2779317208
num_examples: 2420047
download_size: 1438356670
dataset_size: 2788056791
configs:
- config_name: default
data_files:
- split: dev
path: data/dev-*
- split: test
path: data/test-*
- split: train
path: data/train-*
---
# Dataset Card for Annotations of 10K hours of English MLS
This dataset consists of **annotations of a 10K-hour subset** of the **[English version of the Multilingual LibriSpeech (MLS) dataset](https://huggingface.co/datasets/parler-tts/mls_eng)**.
MLS dataset is a large multilingual corpus suitable for speech research. The dataset is derived from read audiobooks from LibriVox and consists of
8 languages - English, German, Dutch, Spanish, French, Italian, Portuguese, Polish. It includes about 44.5K hours of English and a total of about 6K hours for other languages.
This dataset includes annotations of [a 10K-hour subset](https://huggingface.co/datasets/parler-tts/mls_eng_10k) of English MLS. Refer to this [dataset card](https://huggingface.co/datasets/facebook/multilingual_librispeech) for the other languages.
The `text_description` column provides natural language annotations on the characteristics of speakers and utterances, that have been generated using [the Data-Speech repository](https://github.com/huggingface/dataspeech).
This dataset was used alongside its [original version](https://huggingface.co/datasets/parler-tts/mls_eng_10k) and [LibriTTS-R](https://huggingface.co/datasets/blabble-io/libritts_r) to train [Parler-TTS Mini v0.1](https://huggingface.co/parler-tts/parler_tts_mini_v0.1).
A training recipe is available in [the Parler-TTS library](https://github.com/huggingface/parler-tts).
## Usage
Here is an example of how to load only the `train` split:
```python
from datasets import load_dataset

dataset = load_dataset("parler-tts/mls-eng-10k-tags_tagged_10k_generated", split="train")
```
Streaming is also supported:
```python
from datasets import load_dataset

dataset = load_dataset("parler-tts/mls-eng-10k-tags_tagged_10k_generated", streaming=True)
```
**Note:** This dataset doesn't keep the audio column of the original version. You can merge it back into the original dataset using [this script](https://github.com/huggingface/dataspeech/blob/main/scripts/merge_audio_to_metadata.py) from Data-Speech or, even better, take inspiration from [the training script](https://github.com/ylacombe/parler-tts/blob/3c8822985fe6cec482ecf868b04e866428bcd7bc/training/run_parler_tts_training.py#L648) of Parler-TTS, which efficiently processes multiple annotated datasets.
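Conceptually, the merge is a join between the annotation rows in this dataset and the audio rows of the original dataset on a shared identifier. As a rough illustration of that logic (plain Python dicts stand in for the actual `datasets` objects; the `id`, `audio`, and example values below are assumptions, not the real schema):

```python
# Illustrative sketch only: for real use, prefer the Data-Speech merge script
# or the `datasets` library. Plain dicts stand in for dataset rows here.

# Hypothetical annotation rows (this dataset: metadata only, no audio).
annotations = [
    {"id": "utt_0", "text_description": "A female speaker with an expressive tone."},
    {"id": "utt_1", "text_description": "A male speaker delivering monotone speech."},
]

# Hypothetical rows from the original audio dataset.
audio_rows = [
    {"id": "utt_0", "audio": "<waveform 0>"},
    {"id": "utt_1", "audio": "<waveform 1>"},
]

# Join on the shared id to re-attach the audio to each annotation row.
by_id = {row["id"]: row for row in audio_rows}
merged = [{**ann, "audio": by_id[ann["id"]]["audio"]} for ann in annotations]

print(merged[0]["text_description"], "->", merged[0]["audio"])
```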
### Motivation
This dataset is a reproduction of work from the paper [Natural language guidance of high-fidelity text-to-speech with synthetic annotations](https://www.text-description-to-speech.com) by Dan Lyth and Simon King, from Stability AI and the University of Edinburgh respectively.
It was designed to train the [Parler-TTS Mini v0.1](https://huggingface.co/parler-tts/parler_tts_mini_v0.1) model.
Unlike other TTS models, Parler-TTS is a **fully open-source** release. All of the datasets, pre-processing, training code and weights are released publicly under a permissive license, enabling the community to build on our work and develop their own powerful TTS models.
Parler-TTS was released alongside:
* [The Parler-TTS repository](https://github.com/huggingface/parler-tts) - you can train and fine-tune your own version of the model.
* [The Data-Speech repository](https://github.com/huggingface/dataspeech) - a suite of utility scripts designed to annotate speech datasets.
* [The Parler-TTS organization](https://huggingface.co/parler-tts) - where you can find the annotated datasets as well as future checkpoints.
### License
Public Domain, Creative Commons Attribution 4.0 International Public License ([CC-BY-4.0](https://creativecommons.org/licenses/by/4.0/legalcode))
## Citation
```
@article{Pratap2020MLSAL,
title={MLS: A Large-Scale Multilingual Dataset for Speech Research},
author={Vineel Pratap and Qiantong Xu and Anuroop Sriram and Gabriel Synnaeve and Ronan Collobert},
journal={ArXiv},
year={2020},
volume={abs/2012.03411}
}
```
```
@misc{lacombe-etal-2024-dataspeech,
author = {Yoach Lacombe and Vaibhav Srivastav and Sanchit Gandhi},
title = {Data-Speech},
year = {2024},
publisher = {GitHub},
journal = {GitHub repository},
howpublished = {\url{https://github.com/ylacombe/dataspeech}}
}
```
```
@misc{lyth2024natural,
title={Natural language guidance of high-fidelity text-to-speech with synthetic annotations},
author={Dan Lyth and Simon King},
year={2024},
eprint={2402.01912},
archivePrefix={arXiv},
primaryClass={cs.SD}
}
``` |
JLB-JLB/seizure_detection_224x224_raw_frequency | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
- split: eval
path: data/eval-*
- split: test_bckg_events
path: data/test_bckg_events-*
dataset_info:
features:
- name: image
dtype: image
- name: epoch_index
dtype: int32
- name: label
dtype:
class_label:
names:
'0': bckg
'1': seiz
splits:
- name: train
num_bytes: 2654825157.304
num_examples: 93128
- name: test
num_bytes: 898252847.927854
num_examples: 31384
- name: eval
num_bytes: 598524001.8931462
num_examples: 20923
- name: test_bckg_events
num_bytes: 9520809814.634
num_examples: 338634
download_size: 13707064997
dataset_size: 13672411821.759
---
# Dataset Card for "seizure_detection_224x224_raw_frequency"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
pesc101/CodeAlpacpa-20k-llama-format | ---
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 6535066
num_examples: 20022
download_size: 3269704
dataset_size: 6535066
---
# Dataset Card for "CodeAlpacpa-20k-llama-format"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
towhid/aesir-test69 | ---
dataset_info:
features:
- name: instruction
dtype: string
- name: output
dtype: string
splits:
- name: train
num_bytes: 22114
num_examples: 10
download_size: 28277
dataset_size: 22114
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "aesir-test69"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
EALeon16/poems | ---
license: wtfpl
---
|
joyliao7777/LLaRA | ---
license: apache-2.0
---
|
pushkin05/ds081309 | ---
dataset_info:
features:
- name: code
dtype: string
- name: api_call_
dtype: string
- name: provider
dtype: string
- name: domain
dtype: string
- name: framework
dtype: string
- name: functionality
dtype: string
- name: api_name
dtype: string
- name: api_call
dtype: string
- name: api_arguments
dtype: string
- name: python_environment_requirements
dtype: string
- name: example_code
dtype: string
- name: performance
dtype: string
- name: description
dtype: string
- name: instruction
dtype: string
- name: emb_instruction_0
sequence: float32
- name: emb_instruction_1
sequence: float32
splits:
- name: train
num_bytes: 71806553
num_examples: 8191
download_size: 67945522
dataset_size: 71806553
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "ds081309"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
unanam/fleurstest | ---
dataset_info:
features:
- name: input_features
sequence:
sequence: float32
- name: labels
sequence: int64
splits:
- name: train
num_bytes: 2433420592
num_examples: 2533
- name: test
num_bytes: 366985472
num_examples: 382
download_size: 1079073759
dataset_size: 2800406064
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
---
|
CyberHarem/welrod_mkii_girlsfrontline | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of welrod_mkii/ウェルロッドMkII/维尔德MkⅡ (Girls' Frontline)
This is the dataset of welrod_mkii/ウェルロッドMkII/维尔德MkⅡ (Girls' Frontline), containing 334 images and their tags.
The core tags of this character are `blonde_hair, green_eyes, short_hair, breasts, twintails, bangs, braid, medium_breasts`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:----------------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 334 | 388.05 MiB | [Download](https://huggingface.co/datasets/CyberHarem/welrod_mkii_girlsfrontline/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 334 | 236.38 MiB | [Download](https://huggingface.co/datasets/CyberHarem/welrod_mkii_girlsfrontline/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 814 | 501.02 MiB | [Download](https://huggingface.co/datasets/CyberHarem/welrod_mkii_girlsfrontline/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 334 | 345.54 MiB | [Download](https://huggingface.co/datasets/CyberHarem/welrod_mkii_girlsfrontline/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 814 | 668.07 MiB | [Download](https://huggingface.co/datasets/CyberHarem/welrod_mkii_girlsfrontline/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for loading with [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html). If you need it, just run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/welrod_mkii_girlsfrontline',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
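Since each item exposes its tags via `item.meta['tags']`, the source can be filtered, for example, to keep only solo images. A minimal sketch of that filtering logic, using plain Python over hypothetical `(filename, tags)` records so it runs without waifuc installed:

```python
# Hypothetical (filename, tags) records standing in for waifuc items;
# the filenames and tag scores below are made up for illustration.
items = [
    ("img_001.png", {"1girl": 0.99, "solo": 0.97, "black_gloves": 0.80}),
    ("img_002.png", {"1girl": 0.98, "1boy": 0.95}),
]

# Keep only images tagged 'solo' (a single character in the frame).
solo_items = [name for name, tags in items if "solo" in tags]
print(solo_items)  # ['img_001.png']
```

With real waifuc items, the same membership test on `item.meta['tags']` can be applied inside the iteration loop above.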
## List of Clusters
List of tag clustering results; some outfits may be discoverable here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:-----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 21 |  |  |  |  |  | 1girl, solo, black_gloves, looking_at_viewer, blush, white_shirt, fang, open_mouth, smile, vampire_costume, long_sleeves, alternate_costume, cape, leotard, thigh_holster, wings, long_hair, bat_(animal), brown_footwear, handgun, holding_cup, o-ring, wine_glass, halloween_costume, high_heel_boots, low_ponytail, shiny, simple_background, thighs, wristband |
| 1 | 25 |  |  |  |  |  | 1girl, corset, holding_gun, pinstripe_pattern, skirt, solo, necktie, shirt, vest, looking_at_viewer, black_gloves, handgun, jacket_on_shoulders, simple_background, thigh_holster, white_background, socks, belt, dual_wielding |
| 2 | 5 |  |  |  |  |  | 1girl, black_gloves, formal, simple_background, solo, suit, black_jacket, holding_gun, pink_background, red_necktie, upper_body, white_shirt, looking_at_viewer, collared_shirt, handgun, open_clothes |
| 3 | 7 |  |  |  |  |  | 1girl, blush, looking_at_viewer, pom_pom_(clothes), solo, gift_box, heart-shaped_box, long_sleeves, open_cardigan, argyle, fur_trim, holding_gift, open_coat, ribbon, white_shirt, collared_shirt, hood, simple_background, striped_bow, striped_skirt, blue_skirt, buttons, jacket_on_shoulders, open_mouth, sidelocks, sleeves_past_wrists, thigh_holster, valentine, white_background |
| 4 | 9 |  |  |  |  |  | 1girl, looking_at_viewer, solo, navel, blue_sky, blush, day, outdoors, closed_mouth, cloud, collarbone, bare_shoulders, beach, cleavage, food, jewelry, ocean, standing, water, white_bikini |
| 5 | 5 |  |  |  |  |  | 1boy, 1girl, blush, hetero, open_mouth, solo_focus, nipples, penis, sex, vaginal, bar_censor, dark-skinned_male, navel, collarbone, completely_nude, cowgirl_position, cum, girl_on_top, heart, pussy, tongue |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | solo | black_gloves | looking_at_viewer | blush | white_shirt | fang | open_mouth | smile | vampire_costume | long_sleeves | alternate_costume | cape | leotard | thigh_holster | wings | long_hair | bat_(animal) | brown_footwear | handgun | holding_cup | o-ring | wine_glass | halloween_costume | high_heel_boots | low_ponytail | shiny | simple_background | thighs | wristband | corset | holding_gun | pinstripe_pattern | skirt | necktie | shirt | vest | jacket_on_shoulders | white_background | socks | belt | dual_wielding | formal | suit | black_jacket | pink_background | red_necktie | upper_body | collared_shirt | open_clothes | pom_pom_(clothes) | gift_box | heart-shaped_box | open_cardigan | argyle | fur_trim | holding_gift | open_coat | ribbon | hood | striped_bow | striped_skirt | blue_skirt | buttons | sidelocks | sleeves_past_wrists | valentine | navel | blue_sky | day | outdoors | closed_mouth | cloud | collarbone | bare_shoulders | beach | cleavage | food | jewelry | ocean | standing | water | white_bikini | 1boy | hetero | solo_focus | nipples | penis | sex | vaginal | bar_censor | dark-skinned_male | completely_nude | cowgirl_position | cum | girl_on_top | heart | pussy | tongue |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:-------|:---------------|:--------------------|:--------|:--------------|:-------|:-------------|:--------|:------------------|:---------------|:--------------------|:-------|:----------|:----------------|:--------|:------------|:---------------|:-----------------|:----------|:--------------|:---------|:-------------|:--------------------|:------------------|:---------------|:--------|:--------------------|:---------|:------------|:---------|:--------------|:--------------------|:--------|:----------|:--------|:-------|:----------------------|:-------------------|:--------|:-------|:----------------|:---------|:-------|:---------------|:------------------|:--------------|:-------------|:-----------------|:---------------|:--------------------|:-----------|:-------------------|:----------------|:---------|:-----------|:---------------|:------------|:---------|:-------|:--------------|:----------------|:-------------|:----------|:------------|:----------------------|:------------|:--------|:-----------|:------|:-----------|:---------------|:--------|:-------------|:-----------------|:--------|:-----------|:-------|:----------|:--------|:-----------|:--------|:---------------|:-------|:---------|:-------------|:----------|:--------|:------|:----------|:-------------|:--------------------|:------------------|:-------------------|:------|:--------------|:--------|:--------|:---------|
| 0 | 21 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 1 | 25 |  |  |  |  |  | X | X | X | X | | | | | | | | | | | X | | | | | X | | | | | | | | X | | | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 2 | 5 |  |  |  |  |  | X | X | X | X | | X | | | | | | | | | | | | | | X | | | | | | | | X | | | | X | | | | | | | | | | | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 3 | 7 |  |  |  |  |  | X | X | | X | X | X | | X | | | X | | | | X | | | | | | | | | | | | | X | | | | | | | | | | X | X | | | | | | | | | | X | | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 4 | 9 |  |  |  |  |  | X | X | | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | |
| 5 | 5 |  |  |  |  |  | X | | | | X | | | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | X | | | | | | X | | | | | | | | | | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X |
|
open-llm-leaderboard/details_leejaymin__etri-ones-solar | ---
pretty_name: Evaluation run of leejaymin/etri-ones-solar
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [leejaymin/etri-ones-solar](https://huggingface.co/leejaymin/etri-ones-solar)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_leejaymin__etri-ones-solar\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-04-03T04:21:29.072585](https://huggingface.co/datasets/open-llm-leaderboard/details_leejaymin__etri-ones-solar/blob/main/results_2024-04-03T04-21-29.072585.json)(note\
\ that their might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6289828180315297,\n\
\ \"acc_stderr\": 0.03199524578851419,\n \"acc_norm\": 0.6403658725397634,\n\
\ \"acc_norm_stderr\": 0.03287306734304755,\n \"mc1\": 0.36474908200734396,\n\
\ \"mc1_stderr\": 0.016850961061720113,\n \"mc2\": 0.5269096033669528,\n\
\ \"mc2_stderr\": 0.014940096900050902\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.5861774744027304,\n \"acc_stderr\": 0.014392730009221007,\n\
\ \"acc_norm\": 0.6262798634812287,\n \"acc_norm_stderr\": 0.014137708601759091\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6502688707428799,\n\
\ \"acc_stderr\": 0.004759103432380765,\n \"acc_norm\": 0.8430591515634336,\n\
\ \"acc_norm_stderr\": 0.0036300159898964056\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.37,\n \"acc_stderr\": 0.048523658709391,\n \
\ \"acc_norm\": 0.37,\n \"acc_norm_stderr\": 0.048523658709391\n },\n\
\ \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6,\n \"acc_stderr\"\
: 0.04232073695151589,\n \"acc_norm\": 0.6,\n \"acc_norm_stderr\"\
: 0.04232073695151589\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \
\ \"acc\": 0.7368421052631579,\n \"acc_stderr\": 0.035834961763610736,\n\
\ \"acc_norm\": 0.7368421052631579,\n \"acc_norm_stderr\": 0.035834961763610736\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.73,\n\
\ \"acc_stderr\": 0.044619604333847415,\n \"acc_norm\": 0.73,\n \
\ \"acc_norm_stderr\": 0.044619604333847415\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.6641509433962264,\n \"acc_stderr\": 0.029067220146644833,\n\
\ \"acc_norm\": 0.6641509433962264,\n \"acc_norm_stderr\": 0.029067220146644833\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7569444444444444,\n\
\ \"acc_stderr\": 0.0358687928008034,\n \"acc_norm\": 0.7569444444444444,\n\
\ \"acc_norm_stderr\": 0.0358687928008034\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.42,\n \"acc_stderr\": 0.049604496374885836,\n \
\ \"acc_norm\": 0.42,\n \"acc_norm_stderr\": 0.049604496374885836\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"\
acc\": 0.47,\n \"acc_stderr\": 0.050161355804659205,\n \"acc_norm\"\
: 0.47,\n \"acc_norm_stderr\": 0.050161355804659205\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695235,\n \
\ \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695235\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6473988439306358,\n\
\ \"acc_stderr\": 0.036430371689585475,\n \"acc_norm\": 0.6473988439306358,\n\
\ \"acc_norm_stderr\": 0.036430371689585475\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.4117647058823529,\n \"acc_stderr\": 0.048971049527263666,\n\
\ \"acc_norm\": 0.4117647058823529,\n \"acc_norm_stderr\": 0.048971049527263666\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.72,\n \"acc_stderr\": 0.04512608598542128,\n \"acc_norm\": 0.72,\n\
\ \"acc_norm_stderr\": 0.04512608598542128\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.548936170212766,\n \"acc_stderr\": 0.032529096196131965,\n\
\ \"acc_norm\": 0.548936170212766,\n \"acc_norm_stderr\": 0.032529096196131965\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.42105263157894735,\n\
\ \"acc_stderr\": 0.046446020912223177,\n \"acc_norm\": 0.42105263157894735,\n\
\ \"acc_norm_stderr\": 0.046446020912223177\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5310344827586206,\n \"acc_stderr\": 0.04158632762097828,\n\
\ \"acc_norm\": 0.5310344827586206,\n \"acc_norm_stderr\": 0.04158632762097828\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.4576719576719577,\n \"acc_stderr\": 0.025658868862058336,\n \"\
acc_norm\": 0.4576719576719577,\n \"acc_norm_stderr\": 0.025658868862058336\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.3968253968253968,\n\
\ \"acc_stderr\": 0.04375888492727061,\n \"acc_norm\": 0.3968253968253968,\n\
\ \"acc_norm_stderr\": 0.04375888492727061\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695235,\n \
\ \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695235\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7451612903225806,\n\
\ \"acc_stderr\": 0.02479011845933221,\n \"acc_norm\": 0.7451612903225806,\n\
\ \"acc_norm_stderr\": 0.02479011845933221\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.46798029556650245,\n \"acc_stderr\": 0.035107665979592154,\n\
\ \"acc_norm\": 0.46798029556650245,\n \"acc_norm_stderr\": 0.035107665979592154\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.68,\n \"acc_stderr\": 0.04688261722621505,\n \"acc_norm\"\
: 0.68,\n \"acc_norm_stderr\": 0.04688261722621505\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7818181818181819,\n \"acc_stderr\": 0.03225078108306289,\n\
\ \"acc_norm\": 0.7818181818181819,\n \"acc_norm_stderr\": 0.03225078108306289\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.8282828282828283,\n \"acc_stderr\": 0.026869716187429903,\n \"\
acc_norm\": 0.8282828282828283,\n \"acc_norm_stderr\": 0.026869716187429903\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.8808290155440415,\n \"acc_stderr\": 0.023381935348121437,\n\
\ \"acc_norm\": 0.8808290155440415,\n \"acc_norm_stderr\": 0.023381935348121437\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.6076923076923076,\n \"acc_stderr\": 0.02475600038213095,\n \
\ \"acc_norm\": 0.6076923076923076,\n \"acc_norm_stderr\": 0.02475600038213095\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.4,\n \"acc_stderr\": 0.02986960509531691,\n \"acc_norm\"\
: 0.4,\n \"acc_norm_stderr\": 0.02986960509531691\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\"\
: {\n \"acc\": 0.6554621848739496,\n \"acc_stderr\": 0.030868682604121626,\n\
\ \"acc_norm\": 0.6554621848739496,\n \"acc_norm_stderr\": 0.030868682604121626\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.3509933774834437,\n \"acc_stderr\": 0.03896981964257375,\n \"\
acc_norm\": 0.3509933774834437,\n \"acc_norm_stderr\": 0.03896981964257375\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8311926605504587,\n \"acc_stderr\": 0.016060056268530333,\n \"\
acc_norm\": 0.8311926605504587,\n \"acc_norm_stderr\": 0.016060056268530333\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.5231481481481481,\n \"acc_stderr\": 0.03406315360711507,\n \"\
acc_norm\": 0.5231481481481481,\n \"acc_norm_stderr\": 0.03406315360711507\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.8529411764705882,\n \"acc_stderr\": 0.024857478080250447,\n \"\
acc_norm\": 0.8529411764705882,\n \"acc_norm_stderr\": 0.024857478080250447\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.8438818565400844,\n \"acc_stderr\": 0.023627159460318674,\n \
\ \"acc_norm\": 0.8438818565400844,\n \"acc_norm_stderr\": 0.023627159460318674\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.7040358744394619,\n\
\ \"acc_stderr\": 0.0306365913486998,\n \"acc_norm\": 0.7040358744394619,\n\
\ \"acc_norm_stderr\": 0.0306365913486998\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.7099236641221374,\n \"acc_stderr\": 0.03980066246467765,\n\
\ \"acc_norm\": 0.7099236641221374,\n \"acc_norm_stderr\": 0.03980066246467765\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.8099173553719008,\n \"acc_stderr\": 0.03581796951709282,\n \"\
acc_norm\": 0.8099173553719008,\n \"acc_norm_stderr\": 0.03581796951709282\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7777777777777778,\n\
\ \"acc_stderr\": 0.0401910747255735,\n \"acc_norm\": 0.7777777777777778,\n\
\ \"acc_norm_stderr\": 0.0401910747255735\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7423312883435583,\n \"acc_stderr\": 0.03436150827846917,\n\
\ \"acc_norm\": 0.7423312883435583,\n \"acc_norm_stderr\": 0.03436150827846917\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.41964285714285715,\n\
\ \"acc_stderr\": 0.04684099321077106,\n \"acc_norm\": 0.41964285714285715,\n\
\ \"acc_norm_stderr\": 0.04684099321077106\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7961165048543689,\n \"acc_stderr\": 0.039891398595317706,\n\
\ \"acc_norm\": 0.7961165048543689,\n \"acc_norm_stderr\": 0.039891398595317706\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8888888888888888,\n\
\ \"acc_stderr\": 0.020588491316092368,\n \"acc_norm\": 0.8888888888888888,\n\
\ \"acc_norm_stderr\": 0.020588491316092368\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.73,\n \"acc_stderr\": 0.0446196043338474,\n \
\ \"acc_norm\": 0.73,\n \"acc_norm_stderr\": 0.0446196043338474\n },\n\
\ \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.80970625798212,\n\
\ \"acc_stderr\": 0.014036945850381394,\n \"acc_norm\": 0.80970625798212,\n\
\ \"acc_norm_stderr\": 0.014036945850381394\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.6965317919075145,\n \"acc_stderr\": 0.0247524119609172,\n\
\ \"acc_norm\": 0.6965317919075145,\n \"acc_norm_stderr\": 0.0247524119609172\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.3094972067039106,\n\
\ \"acc_stderr\": 0.015461169002371537,\n \"acc_norm\": 0.3094972067039106,\n\
\ \"acc_norm_stderr\": 0.015461169002371537\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.6993464052287581,\n \"acc_stderr\": 0.02625605383571896,\n\
\ \"acc_norm\": 0.6993464052287581,\n \"acc_norm_stderr\": 0.02625605383571896\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7202572347266881,\n\
\ \"acc_stderr\": 0.02549425935069491,\n \"acc_norm\": 0.7202572347266881,\n\
\ \"acc_norm_stderr\": 0.02549425935069491\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.7345679012345679,\n \"acc_stderr\": 0.02456922360046086,\n\
\ \"acc_norm\": 0.7345679012345679,\n \"acc_norm_stderr\": 0.02456922360046086\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.4574468085106383,\n \"acc_stderr\": 0.029719281272236844,\n \
\ \"acc_norm\": 0.4574468085106383,\n \"acc_norm_stderr\": 0.029719281272236844\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.49869621903520206,\n\
\ \"acc_stderr\": 0.012770192691057107,\n \"acc_norm\": 0.49869621903520206,\n\
\ \"acc_norm_stderr\": 0.012770192691057107\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.6727941176470589,\n \"acc_stderr\": 0.028501452860396556,\n\
\ \"acc_norm\": 0.6727941176470589,\n \"acc_norm_stderr\": 0.028501452860396556\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6748366013071896,\n \"acc_stderr\": 0.01895088677080631,\n \
\ \"acc_norm\": 0.6748366013071896,\n \"acc_norm_stderr\": 0.01895088677080631\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.7,\n\
\ \"acc_stderr\": 0.04389311454644287,\n \"acc_norm\": 0.7,\n \
\ \"acc_norm_stderr\": 0.04389311454644287\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.7591836734693878,\n \"acc_stderr\": 0.02737294220178816,\n\
\ \"acc_norm\": 0.7591836734693878,\n \"acc_norm_stderr\": 0.02737294220178816\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8308457711442786,\n\
\ \"acc_stderr\": 0.026508590656233268,\n \"acc_norm\": 0.8308457711442786,\n\
\ \"acc_norm_stderr\": 0.026508590656233268\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.85,\n \"acc_stderr\": 0.035887028128263686,\n \
\ \"acc_norm\": 0.85,\n \"acc_norm_stderr\": 0.035887028128263686\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5301204819277109,\n\
\ \"acc_stderr\": 0.03885425420866767,\n \"acc_norm\": 0.5301204819277109,\n\
\ \"acc_norm_stderr\": 0.03885425420866767\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.7777777777777778,\n \"acc_stderr\": 0.03188578017686398,\n\
\ \"acc_norm\": 0.7777777777777778,\n \"acc_norm_stderr\": 0.03188578017686398\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.36474908200734396,\n\
\ \"mc1_stderr\": 0.016850961061720113,\n \"mc2\": 0.5269096033669528,\n\
\ \"mc2_stderr\": 0.014940096900050902\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.819258089976322,\n \"acc_stderr\": 0.010814911009613981\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.0,\n \"acc_stderr\"\
: 0.0\n }\n}\n```"
repo_url: https://huggingface.co/leejaymin/etri-ones-solar
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_04_03T04_21_29.072585
path:
- '**/details_harness|arc:challenge|25_2024-04-03T04-21-29.072585.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-04-03T04-21-29.072585.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_04_03T04_21_29.072585
path:
- '**/details_harness|gsm8k|5_2024-04-03T04-21-29.072585.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-04-03T04-21-29.072585.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_04_03T04_21_29.072585
path:
- '**/details_harness|hellaswag|10_2024-04-03T04-21-29.072585.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-04-03T04-21-29.072585.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_04_03T04_21_29.072585
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-03T04-21-29.072585.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-04-03T04-21-29.072585.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-04-03T04-21-29.072585.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-04-03T04-21-29.072585.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-03T04-21-29.072585.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-04-03T04-21-29.072585.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-03T04-21-29.072585.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-03T04-21-29.072585.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-03T04-21-29.072585.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-04-03T04-21-29.072585.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-04-03T04-21-29.072585.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-04-03T04-21-29.072585.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-03T04-21-29.072585.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-04-03T04-21-29.072585.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-03T04-21-29.072585.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-03T04-21-29.072585.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-04-03T04-21-29.072585.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-04-03T04-21-29.072585.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-03T04-21-29.072585.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-03T04-21-29.072585.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-03T04-21-29.072585.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-03T04-21-29.072585.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-03T04-21-29.072585.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-03T04-21-29.072585.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-03T04-21-29.072585.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-03T04-21-29.072585.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-03T04-21-29.072585.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-03T04-21-29.072585.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-03T04-21-29.072585.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-03T04-21-29.072585.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-03T04-21-29.072585.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-03T04-21-29.072585.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-04-03T04-21-29.072585.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-03T04-21-29.072585.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-04-03T04-21-29.072585.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-03T04-21-29.072585.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-03T04-21-29.072585.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-04-03T04-21-29.072585.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-04-03T04-21-29.072585.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-04-03T04-21-29.072585.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-03T04-21-29.072585.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-03T04-21-29.072585.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-03T04-21-29.072585.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-03T04-21-29.072585.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-04-03T04-21-29.072585.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-04-03T04-21-29.072585.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-04-03T04-21-29.072585.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-03T04-21-29.072585.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-04-03T04-21-29.072585.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-03T04-21-29.072585.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-03T04-21-29.072585.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-04-03T04-21-29.072585.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-04-03T04-21-29.072585.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-04-03T04-21-29.072585.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-03T04-21-29.072585.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-04-03T04-21-29.072585.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-04-03T04-21-29.072585.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-03T04-21-29.072585.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-04-03T04-21-29.072585.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-04-03T04-21-29.072585.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-04-03T04-21-29.072585.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-03T04-21-29.072585.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-04-03T04-21-29.072585.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-03T04-21-29.072585.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-03T04-21-29.072585.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-03T04-21-29.072585.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-04-03T04-21-29.072585.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-04-03T04-21-29.072585.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-04-03T04-21-29.072585.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-03T04-21-29.072585.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-04-03T04-21-29.072585.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-03T04-21-29.072585.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-03T04-21-29.072585.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-04-03T04-21-29.072585.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-04-03T04-21-29.072585.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-03T04-21-29.072585.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-03T04-21-29.072585.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-03T04-21-29.072585.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-03T04-21-29.072585.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-03T04-21-29.072585.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-03T04-21-29.072585.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-03T04-21-29.072585.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-03T04-21-29.072585.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-03T04-21-29.072585.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-03T04-21-29.072585.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-03T04-21-29.072585.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-03T04-21-29.072585.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-03T04-21-29.072585.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-03T04-21-29.072585.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-04-03T04-21-29.072585.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-03T04-21-29.072585.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-04-03T04-21-29.072585.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-03T04-21-29.072585.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-03T04-21-29.072585.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-04-03T04-21-29.072585.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-04-03T04-21-29.072585.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-04-03T04-21-29.072585.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-03T04-21-29.072585.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-03T04-21-29.072585.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-03T04-21-29.072585.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-03T04-21-29.072585.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-04-03T04-21-29.072585.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-04-03T04-21-29.072585.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-04-03T04-21-29.072585.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-03T04-21-29.072585.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-04-03T04-21-29.072585.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-03T04-21-29.072585.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-03T04-21-29.072585.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-04-03T04-21-29.072585.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-04-03T04-21-29.072585.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-04-03T04-21-29.072585.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-03T04-21-29.072585.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-04-03T04-21-29.072585.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-04-03T04-21-29.072585.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_04_03T04_21_29.072585
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-03T04-21-29.072585.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-03T04-21-29.072585.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_04_03T04_21_29.072585
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-04-03T04-21-29.072585.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-04-03T04-21-29.072585.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_04_03T04_21_29.072585
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-04-03T04-21-29.072585.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-04-03T04-21-29.072585.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_04_03T04_21_29.072585
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-04-03T04-21-29.072585.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-04-03T04-21-29.072585.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_04_03T04_21_29.072585
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-03T04-21-29.072585.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-03T04-21-29.072585.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_04_03T04_21_29.072585
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-04-03T04-21-29.072585.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-04-03T04-21-29.072585.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_04_03T04_21_29.072585
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-03T04-21-29.072585.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-03T04-21-29.072585.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_04_03T04_21_29.072585
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-03T04-21-29.072585.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-03T04-21-29.072585.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_04_03T04_21_29.072585
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-03T04-21-29.072585.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-03T04-21-29.072585.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_04_03T04_21_29.072585
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-04-03T04-21-29.072585.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-04-03T04-21-29.072585.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_04_03T04_21_29.072585
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-04-03T04-21-29.072585.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-04-03T04-21-29.072585.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_04_03T04_21_29.072585
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-04-03T04-21-29.072585.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-04-03T04-21-29.072585.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_04_03T04_21_29.072585
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-03T04-21-29.072585.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-03T04-21-29.072585.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_04_03T04_21_29.072585
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-04-03T04-21-29.072585.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-04-03T04-21-29.072585.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_04_03T04_21_29.072585
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-03T04-21-29.072585.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-03T04-21-29.072585.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_04_03T04_21_29.072585
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-03T04-21-29.072585.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-03T04-21-29.072585.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_04_03T04_21_29.072585
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-04-03T04-21-29.072585.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-04-03T04-21-29.072585.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_04_03T04_21_29.072585
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-04-03T04-21-29.072585.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-04-03T04-21-29.072585.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_04_03T04_21_29.072585
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-03T04-21-29.072585.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-03T04-21-29.072585.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_04_03T04_21_29.072585
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-03T04-21-29.072585.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-03T04-21-29.072585.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_04_03T04_21_29.072585
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-03T04-21-29.072585.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-03T04-21-29.072585.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_04_03T04_21_29.072585
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-03T04-21-29.072585.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-03T04-21-29.072585.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_04_03T04_21_29.072585
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-03T04-21-29.072585.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-03T04-21-29.072585.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_04_03T04_21_29.072585
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-03T04-21-29.072585.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-03T04-21-29.072585.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_04_03T04_21_29.072585
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-03T04-21-29.072585.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-03T04-21-29.072585.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_04_03T04_21_29.072585
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-03T04-21-29.072585.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-03T04-21-29.072585.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_04_03T04_21_29.072585
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-03T04-21-29.072585.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-03T04-21-29.072585.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_04_03T04_21_29.072585
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-03T04-21-29.072585.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-03T04-21-29.072585.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_04_03T04_21_29.072585
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-03T04-21-29.072585.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-03T04-21-29.072585.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_04_03T04_21_29.072585
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-03T04-21-29.072585.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-03T04-21-29.072585.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_04_03T04_21_29.072585
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-03T04-21-29.072585.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-03T04-21-29.072585.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_04_03T04_21_29.072585
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-03T04-21-29.072585.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-03T04-21-29.072585.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_04_03T04_21_29.072585
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-04-03T04-21-29.072585.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-04-03T04-21-29.072585.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_04_03T04_21_29.072585
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-03T04-21-29.072585.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-03T04-21-29.072585.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_04_03T04_21_29.072585
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-04-03T04-21-29.072585.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-04-03T04-21-29.072585.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_04_03T04_21_29.072585
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-03T04-21-29.072585.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-03T04-21-29.072585.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_04_03T04_21_29.072585
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-03T04-21-29.072585.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-03T04-21-29.072585.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_04_03T04_21_29.072585
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-04-03T04-21-29.072585.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-04-03T04-21-29.072585.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_04_03T04_21_29.072585
path:
- '**/details_harness|hendrycksTest-management|5_2024-04-03T04-21-29.072585.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-04-03T04-21-29.072585.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_04_03T04_21_29.072585
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-04-03T04-21-29.072585.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-04-03T04-21-29.072585.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_04_03T04_21_29.072585
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-03T04-21-29.072585.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-03T04-21-29.072585.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_04_03T04_21_29.072585
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-03T04-21-29.072585.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-03T04-21-29.072585.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_04_03T04_21_29.072585
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-03T04-21-29.072585.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-03T04-21-29.072585.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_04_03T04_21_29.072585
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-03T04-21-29.072585.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-03T04-21-29.072585.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_04_03T04_21_29.072585
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-04-03T04-21-29.072585.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-04-03T04-21-29.072585.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_04_03T04_21_29.072585
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-04-03T04-21-29.072585.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-04-03T04-21-29.072585.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_04_03T04_21_29.072585
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-04-03T04-21-29.072585.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-04-03T04-21-29.072585.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_04_03T04_21_29.072585
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-03T04-21-29.072585.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-03T04-21-29.072585.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_04_03T04_21_29.072585
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-04-03T04-21-29.072585.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-04-03T04-21-29.072585.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_04_03T04_21_29.072585
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-03T04-21-29.072585.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-03T04-21-29.072585.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_04_03T04_21_29.072585
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-03T04-21-29.072585.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-03T04-21-29.072585.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_04_03T04_21_29.072585
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-04-03T04-21-29.072585.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-04-03T04-21-29.072585.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_04_03T04_21_29.072585
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-04-03T04-21-29.072585.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-04-03T04-21-29.072585.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_04_03T04_21_29.072585
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-04-03T04-21-29.072585.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-04-03T04-21-29.072585.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_04_03T04_21_29.072585
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-03T04-21-29.072585.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-03T04-21-29.072585.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_04_03T04_21_29.072585
path:
- '**/details_harness|hendrycksTest-virology|5_2024-04-03T04-21-29.072585.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-04-03T04-21-29.072585.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_04_03T04_21_29.072585
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-04-03T04-21-29.072585.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-04-03T04-21-29.072585.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_04_03T04_21_29.072585
path:
- '**/details_harness|truthfulqa:mc|0_2024-04-03T04-21-29.072585.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-04-03T04-21-29.072585.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_04_03T04_21_29.072585
path:
- '**/details_harness|winogrande|5_2024-04-03T04-21-29.072585.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-04-03T04-21-29.072585.parquet'
- config_name: results
data_files:
- split: 2024_04_03T04_21_29.072585
path:
- results_2024-04-03T04-21-29.072585.parquet
- split: latest
path:
- results_2024-04-03T04-21-29.072585.parquet
---
# Dataset Card for Evaluation run of leejaymin/etri-ones-solar
Dataset automatically created during the evaluation run of model [leejaymin/etri-ones-solar](https://huggingface.co/leejaymin/etri-ones-solar) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
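The timestamped split names listed under `configs` appear to follow a simple convention: the run timestamp with `-` and `:` replaced by `_`. A minimal sketch of that mapping (an observation from this card's config listing, not an official API; the helper name is hypothetical):

```python
def timestamp_to_split(ts: str) -> str:
    # Hypothetical helper: map a run timestamp such as
    # '2024-04-03T04:21:29.072585' to a split name such as
    # '2024_04_03T04_21_29.072585'.
    return ts.replace("-", "_").replace(":", "_")

print(timestamp_to_split("2024-04-03T04:21:29.072585"))
# 2024_04_03T04_21_29.072585
```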
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_leejaymin__etri-ones-solar",
"harness_winogrande_5",
split="train")
```
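Each per-task config above resolves to parquet files whose names encode the task, the few-shot count, and the run timestamp (with `:` replaced by `-` so the name is filesystem-safe). A sketch of how those glob patterns are assembled, mirroring the `data_files` entries in the YAML header (the helper is hypothetical, for illustration only):

```python
def details_glob(task: str, shots: int, ts: str) -> str:
    # Mirrors data_files entries such as
    # '**/details_harness|hendrycksTest-anatomy|5_2024-04-03T04-21-29.072585.parquet'
    safe_ts = ts.replace(":", "-")
    return f"**/details_harness|{task}|{shots}_{safe_ts}.parquet"

print(details_glob("hendrycksTest-anatomy", 5, "2024-04-03T04:21:29.072585"))
# **/details_harness|hendrycksTest-anatomy|5_2024-04-03T04-21-29.072585.parquet
```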
## Latest results
These are the [latest results from run 2024-04-03T04:21:29.072585](https://huggingface.co/datasets/open-llm-leaderboard/details_leejaymin__etri-ones-solar/blob/main/results_2024-04-03T04-21-29.072585.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find them in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6289828180315297,
"acc_stderr": 0.03199524578851419,
"acc_norm": 0.6403658725397634,
"acc_norm_stderr": 0.03287306734304755,
"mc1": 0.36474908200734396,
"mc1_stderr": 0.016850961061720113,
"mc2": 0.5269096033669528,
"mc2_stderr": 0.014940096900050902
},
"harness|arc:challenge|25": {
"acc": 0.5861774744027304,
"acc_stderr": 0.014392730009221007,
"acc_norm": 0.6262798634812287,
"acc_norm_stderr": 0.014137708601759091
},
"harness|hellaswag|10": {
"acc": 0.6502688707428799,
"acc_stderr": 0.004759103432380765,
"acc_norm": 0.8430591515634336,
"acc_norm_stderr": 0.0036300159898964056
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.37,
"acc_stderr": 0.048523658709391,
"acc_norm": 0.37,
"acc_norm_stderr": 0.048523658709391
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6,
"acc_stderr": 0.04232073695151589,
"acc_norm": 0.6,
"acc_norm_stderr": 0.04232073695151589
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.7368421052631579,
"acc_stderr": 0.035834961763610736,
"acc_norm": 0.7368421052631579,
"acc_norm_stderr": 0.035834961763610736
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.73,
"acc_stderr": 0.044619604333847415,
"acc_norm": 0.73,
"acc_norm_stderr": 0.044619604333847415
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6641509433962264,
"acc_stderr": 0.029067220146644833,
"acc_norm": 0.6641509433962264,
"acc_norm_stderr": 0.029067220146644833
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7569444444444444,
"acc_stderr": 0.0358687928008034,
"acc_norm": 0.7569444444444444,
"acc_norm_stderr": 0.0358687928008034
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.42,
"acc_stderr": 0.049604496374885836,
"acc_norm": 0.42,
"acc_norm_stderr": 0.049604496374885836
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.47,
"acc_stderr": 0.050161355804659205,
"acc_norm": 0.47,
"acc_norm_stderr": 0.050161355804659205
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695235,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695235
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6473988439306358,
"acc_stderr": 0.036430371689585475,
"acc_norm": 0.6473988439306358,
"acc_norm_stderr": 0.036430371689585475
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.4117647058823529,
"acc_stderr": 0.048971049527263666,
"acc_norm": 0.4117647058823529,
"acc_norm_stderr": 0.048971049527263666
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.72,
"acc_stderr": 0.04512608598542128,
"acc_norm": 0.72,
"acc_norm_stderr": 0.04512608598542128
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.548936170212766,
"acc_stderr": 0.032529096196131965,
"acc_norm": 0.548936170212766,
"acc_norm_stderr": 0.032529096196131965
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.42105263157894735,
"acc_stderr": 0.046446020912223177,
"acc_norm": 0.42105263157894735,
"acc_norm_stderr": 0.046446020912223177
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5310344827586206,
"acc_stderr": 0.04158632762097828,
"acc_norm": 0.5310344827586206,
"acc_norm_stderr": 0.04158632762097828
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.4576719576719577,
"acc_stderr": 0.025658868862058336,
"acc_norm": 0.4576719576719577,
"acc_norm_stderr": 0.025658868862058336
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.3968253968253968,
"acc_stderr": 0.04375888492727061,
"acc_norm": 0.3968253968253968,
"acc_norm_stderr": 0.04375888492727061
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695235,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695235
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7451612903225806,
"acc_stderr": 0.02479011845933221,
"acc_norm": 0.7451612903225806,
"acc_norm_stderr": 0.02479011845933221
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.46798029556650245,
"acc_stderr": 0.035107665979592154,
"acc_norm": 0.46798029556650245,
"acc_norm_stderr": 0.035107665979592154
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.68,
"acc_stderr": 0.04688261722621505,
"acc_norm": 0.68,
"acc_norm_stderr": 0.04688261722621505
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7818181818181819,
"acc_stderr": 0.03225078108306289,
"acc_norm": 0.7818181818181819,
"acc_norm_stderr": 0.03225078108306289
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.8282828282828283,
"acc_stderr": 0.026869716187429903,
"acc_norm": 0.8282828282828283,
"acc_norm_stderr": 0.026869716187429903
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8808290155440415,
"acc_stderr": 0.023381935348121437,
"acc_norm": 0.8808290155440415,
"acc_norm_stderr": 0.023381935348121437
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6076923076923076,
"acc_stderr": 0.02475600038213095,
"acc_norm": 0.6076923076923076,
"acc_norm_stderr": 0.02475600038213095
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.4,
"acc_stderr": 0.02986960509531691,
"acc_norm": 0.4,
"acc_norm_stderr": 0.02986960509531691
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6554621848739496,
"acc_stderr": 0.030868682604121626,
"acc_norm": 0.6554621848739496,
"acc_norm_stderr": 0.030868682604121626
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.3509933774834437,
"acc_stderr": 0.03896981964257375,
"acc_norm": 0.3509933774834437,
"acc_norm_stderr": 0.03896981964257375
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8311926605504587,
"acc_stderr": 0.016060056268530333,
"acc_norm": 0.8311926605504587,
"acc_norm_stderr": 0.016060056268530333
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5231481481481481,
"acc_stderr": 0.03406315360711507,
"acc_norm": 0.5231481481481481,
"acc_norm_stderr": 0.03406315360711507
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8529411764705882,
"acc_stderr": 0.024857478080250447,
"acc_norm": 0.8529411764705882,
"acc_norm_stderr": 0.024857478080250447
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.8438818565400844,
"acc_stderr": 0.023627159460318674,
"acc_norm": 0.8438818565400844,
"acc_norm_stderr": 0.023627159460318674
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.7040358744394619,
"acc_stderr": 0.0306365913486998,
"acc_norm": 0.7040358744394619,
"acc_norm_stderr": 0.0306365913486998
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7099236641221374,
"acc_stderr": 0.03980066246467765,
"acc_norm": 0.7099236641221374,
"acc_norm_stderr": 0.03980066246467765
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.8099173553719008,
"acc_stderr": 0.03581796951709282,
"acc_norm": 0.8099173553719008,
"acc_norm_stderr": 0.03581796951709282
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7777777777777778,
"acc_stderr": 0.0401910747255735,
"acc_norm": 0.7777777777777778,
"acc_norm_stderr": 0.0401910747255735
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7423312883435583,
"acc_stderr": 0.03436150827846917,
"acc_norm": 0.7423312883435583,
"acc_norm_stderr": 0.03436150827846917
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.41964285714285715,
"acc_stderr": 0.04684099321077106,
"acc_norm": 0.41964285714285715,
"acc_norm_stderr": 0.04684099321077106
},
"harness|hendrycksTest-management|5": {
"acc": 0.7961165048543689,
"acc_stderr": 0.039891398595317706,
"acc_norm": 0.7961165048543689,
"acc_norm_stderr": 0.039891398595317706
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8888888888888888,
"acc_stderr": 0.020588491316092368,
"acc_norm": 0.8888888888888888,
"acc_norm_stderr": 0.020588491316092368
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.73,
"acc_stderr": 0.0446196043338474,
"acc_norm": 0.73,
"acc_norm_stderr": 0.0446196043338474
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.80970625798212,
"acc_stderr": 0.014036945850381394,
"acc_norm": 0.80970625798212,
"acc_norm_stderr": 0.014036945850381394
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.6965317919075145,
"acc_stderr": 0.0247524119609172,
"acc_norm": 0.6965317919075145,
"acc_norm_stderr": 0.0247524119609172
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.3094972067039106,
"acc_stderr": 0.015461169002371537,
"acc_norm": 0.3094972067039106,
"acc_norm_stderr": 0.015461169002371537
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.6993464052287581,
"acc_stderr": 0.02625605383571896,
"acc_norm": 0.6993464052287581,
"acc_norm_stderr": 0.02625605383571896
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7202572347266881,
"acc_stderr": 0.02549425935069491,
"acc_norm": 0.7202572347266881,
"acc_norm_stderr": 0.02549425935069491
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7345679012345679,
"acc_stderr": 0.02456922360046086,
"acc_norm": 0.7345679012345679,
"acc_norm_stderr": 0.02456922360046086
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.4574468085106383,
"acc_stderr": 0.029719281272236844,
"acc_norm": 0.4574468085106383,
"acc_norm_stderr": 0.029719281272236844
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.49869621903520206,
"acc_stderr": 0.012770192691057107,
"acc_norm": 0.49869621903520206,
"acc_norm_stderr": 0.012770192691057107
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6727941176470589,
"acc_stderr": 0.028501452860396556,
"acc_norm": 0.6727941176470589,
"acc_norm_stderr": 0.028501452860396556
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6748366013071896,
"acc_stderr": 0.01895088677080631,
"acc_norm": 0.6748366013071896,
"acc_norm_stderr": 0.01895088677080631
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.7,
"acc_stderr": 0.04389311454644287,
"acc_norm": 0.7,
"acc_norm_stderr": 0.04389311454644287
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7591836734693878,
"acc_stderr": 0.02737294220178816,
"acc_norm": 0.7591836734693878,
"acc_norm_stderr": 0.02737294220178816
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8308457711442786,
"acc_stderr": 0.026508590656233268,
"acc_norm": 0.8308457711442786,
"acc_norm_stderr": 0.026508590656233268
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.85,
"acc_stderr": 0.035887028128263686,
"acc_norm": 0.85,
"acc_norm_stderr": 0.035887028128263686
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5301204819277109,
"acc_stderr": 0.03885425420866767,
"acc_norm": 0.5301204819277109,
"acc_norm_stderr": 0.03885425420866767
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.7777777777777778,
"acc_stderr": 0.03188578017686398,
"acc_norm": 0.7777777777777778,
"acc_norm_stderr": 0.03188578017686398
},
"harness|truthfulqa:mc|0": {
"mc1": 0.36474908200734396,
"mc1_stderr": 0.016850961061720113,
"mc2": 0.5269096033669528,
"mc2_stderr": 0.014940096900050902
},
"harness|winogrande|5": {
"acc": 0.819258089976322,
"acc_stderr": 0.010814911009613981
},
"harness|gsm8k|5": {
"acc": 0.0,
"acc_stderr": 0.0
}
}
```
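To work with this results block programmatically, the JSON can be parsed and per-task metrics pulled out — a minimal sketch over a small excerpt of the block above (the excerpt values are copied from it):

```python
import json

# Small excerpt of the results JSON above
results_text = """
{
  "harness|arc:challenge|25": {"acc": 0.5861774744027304, "acc_norm": 0.6262798634812287},
  "harness|hellaswag|10": {"acc": 0.6502688707428799, "acc_norm": 0.8430591515634336}
}
"""
results = json.loads(results_text)

# Map each harness task to its normalized accuracy
acc_norm = {task: metrics["acc_norm"] for task, metrics in results.items()}
```

The same pattern applies to the full results file linked above, with the aggregate `"all"` entry filtered out first if only per-task scores are wanted.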
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
JeremiahZ/mbxp_wasm_no_funcname | ---
dataset_info:
features:
- name: task_id
dtype: string
- name: language
dtype: string
- name: prompt
dtype: string
- name: description
dtype: string
- name: test
dtype: string
- name: entry_point
dtype: string
- name: canonical_solution
dtype: string
- name: wat
dtype: string
splits:
- name: train
num_bytes: 3916582
num_examples: 773
download_size: 956941
dataset_size: 3916582
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "mbxp_wasm_no_funcname"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
MohammedNasri/Denoised_data_jason3 | ---
dataset_info:
features:
- name: data
struct:
- name: audio
struct:
- name: array
sequence: float64
- name: path
dtype: 'null'
- name: sampling_rate
dtype: int64
- name: sentence
dtype: string
splits:
- name: train
num_bytes: 1113923195
num_examples: 2000
download_size: 275899919
dataset_size: 1113923195
---
# Dataset Card for "Denoised_data_jason3"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Samvardhan777/kde4-German-to-English | ---
dataset_info:
features:
- name: formatted_text
dtype: string
splits:
- name: train
num_bytes: 33041898
num_examples: 224035
download_size: 13362991
dataset_size: 33041898
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
adowu/polish_sentences | ---
license: mit
language:
- pl
task_categories:
- conversational
size_categories:
- 100K<n<1M
--- |
Danielfu17/chunked_guanzhangtone | ---
license: unknown
---
|
mirfan899/hindi-ner | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
- split: test
path: data/test-*
dataset_info:
features:
- name: id
dtype: string
- name: tokens
sequence: string
- name: ner_tags
sequence:
class_label:
names:
'0': LOCATION
'1': BRAND
'2': TITLE_OBJECT
'3': PERSON
'4': DESIGNATION
'5': ORGANIZATION
'6': ABBREVIATION
'7': TIME
'8': NUMBER
'9': MEASURE
'10': TERMS
'11': O
splits:
- name: train
num_bytes: 22988092
num_examples: 18376
- name: validation
num_bytes: 9784310
num_examples: 7876
- name: test
num_bytes: 9784310
num_examples: 7876
download_size: 6072695
dataset_size: 42556712
---
# Dataset Card for "hindi-ner"
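The integer `ner_tags` values can be mapped back to label names using the `class_label` names listed in the metadata above — a minimal sketch (the example tag ids are hypothetical):

```python
# Label names in index order, from the `ner_tags` class_label above
NER_LABELS = [
    "LOCATION", "BRAND", "TITLE_OBJECT", "PERSON", "DESIGNATION",
    "ORGANIZATION", "ABBREVIATION", "TIME", "NUMBER", "MEASURE",
    "TERMS", "O",
]

def decode_tags(tag_ids):
    """Map integer ner_tags back to their string label names."""
    return [NER_LABELS[i] for i in tag_ids]

# Hypothetical example row of tag ids
decoded = decode_tags([3, 11, 0])  # ["PERSON", "O", "LOCATION"]
```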
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
NorGLM/NO-CNN-DailyMail | ---
license: cc-by-nc-sa-4.0
language:
- 'no'
---
# Dataset Card
## Dataset Summary
NO-CNN-DailyMail is a Norwegian news summarization dataset partially machine-translated from the English version of the [CNN Dailymail Dataset](https://huggingface.co/datasets/cnn_dailymail?row=34). The summaries were written by journalists at CNN and the Daily Mail. The dataset can be used for machine reading comprehension and abstractive summarization tasks.
## Data Instances
For each instance, there is an *article* string and a *positive_sample* string representing news article and abstractive summary to this article.
## Data Split
The dataset is split into train and test sets.
| Split | #samples |
|-------|----------|
| train | 61181 |
| test | 15287 |
For the English version of this dataset, please refer to [this link](https://paperswithcode.com/dataset/cnn-daily-mail-1).
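Given the split sizes in the table above, the train/test proportions work out to roughly 80/20 — a quick arithmetic check:

```python
# Split sizes from the table above
splits = {"train": 61181, "test": 15287}

total = sum(splits.values())          # 76468 samples overall
train_frac = splits["train"] / total  # ~0.80
test_frac = splits["test"] / total    # ~0.20
```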
## Citation Information
Please cite the original CNN/Daily Mail dataset:
```
@article{nallapati2016abstractive,
title={Abstractive text summarization using sequence-to-sequence rnns and beyond},
author={Nallapati, Ramesh and Zhou, Bowen and Gulcehre, Caglar and Xiang, Bing and others},
journal={arXiv preprint arXiv:1602.06023},
year={2016}
}
```
|
Nexdata/300_Person_Mandarin_Chinese_and_English_Bilingual_Spontaneous_Monologue_smartphone | ---
license: cc-by-nc-nd-4.0
---
## Description
Mandarin Chinese and English Bilingual Spontaneous Monologue Smartphone speech dataset, collected from dialogues based on given topics, covering the generic domain. Our dataset was collected from an extensive and diverse pool of speakers (300 people in total, ages 18 to 65) across geographies, enhancing model performance in real and complex tasks. Quality tested by various AI companies. We strictly adhere to data protection regulations and privacy standards, ensuring that user privacy and legal rights are maintained throughout the data collection, storage, and usage processes; our datasets are GDPR, CCPA, and PIPL compliant.
For more details, please refer to the link: https://www.nexdata.ai/dataset/1358?source=Huggingface
## Format
16kHz, 16 bit, wav, mono channel
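The stated format can be reproduced or checked with Python's standard `wave` module — a minimal sketch that writes one second of silence in the same format and reads back the header (the file name is hypothetical; this is not actual dataset audio):

```python
import struct
import wave

# Write one second of 16 kHz, 16-bit, mono silence (hypothetical file name)
with wave.open("sample.wav", "wb") as f:
    f.setnchannels(1)       # mono channel
    f.setsampwidth(2)       # 16 bit -> 2 bytes per sample
    f.setframerate(16000)   # 16 kHz
    f.writeframes(struct.pack("<h", 0) * 16000)

# Read back the header to confirm it matches the stated format
with wave.open("sample.wav", "rb") as f:
    p = f.getparams()       # (nchannels, sampwidth, framerate, nframes, ...)
```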
## Content category
Individuals speaking naturally, with no specific content limitations. Each speaker records 20 audio clips in each language (40 recordings per person), each lasting about 10-20 seconds.
## Recording condition
Quiet indoor environment, without echoes, background voices, or obvious noise
## Recording device
Android phone
## Speaker
300 contributors in total: 40% male and 60% female. 83% of contributors are aged 18-37, 15% are aged 38-45, and 2% are aged 46-65.
## Country
China (CHN);
## Language
Mandarin Chinese, English;
# Licensing Information
Commercial License
|
open-llm-leaderboard/details_Sao10K__Stheno-1.2-L2-13B | ---
pretty_name: Evaluation run of Sao10K/Stheno-1.2-L2-13B
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [Sao10K/Stheno-1.2-L2-13B](https://huggingface.co/Sao10K/Stheno-1.2-L2-13B) on\
\ the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 64 configurations, each one corresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Sao10K__Stheno-1.2-L2-13B\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-10-29T06:49:47.166294](https://huggingface.co/datasets/open-llm-leaderboard/details_Sao10K__Stheno-1.2-L2-13B/blob/main/results_2023-10-29T06-49-47.166294.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.397126677852349,\n\
\ \"em_stderr\": 0.005010917075875424,\n \"f1\": 0.4671539429530222,\n\
\ \"f1_stderr\": 0.0047944933216487965,\n \"acc\": 0.42948814994019174,\n\
\ \"acc_stderr\": 0.01038154947148015\n },\n \"harness|drop|3\": {\n\
\ \"em\": 0.397126677852349,\n \"em_stderr\": 0.005010917075875424,\n\
\ \"f1\": 0.4671539429530222,\n \"f1_stderr\": 0.0047944933216487965\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.10917361637604246,\n \
\ \"acc_stderr\": 0.008590089300511146\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.749802683504341,\n \"acc_stderr\": 0.012173009642449155\n\
\ }\n}\n```"
repo_url: https://huggingface.co/Sao10K/Stheno-1.2-L2-13B
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_09_12T06_46_37.023580
path:
- '**/details_harness|arc:challenge|25_2023-09-12T06-46-37.023580.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-09-12T06-46-37.023580.parquet'
- config_name: harness_drop_3
data_files:
- split: 2023_10_29T06_49_47.166294
path:
- '**/details_harness|drop|3_2023-10-29T06-49-47.166294.parquet'
- split: latest
path:
- '**/details_harness|drop|3_2023-10-29T06-49-47.166294.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_10_29T06_49_47.166294
path:
- '**/details_harness|gsm8k|5_2023-10-29T06-49-47.166294.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-10-29T06-49-47.166294.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_09_12T06_46_37.023580
path:
- '**/details_harness|hellaswag|10_2023-09-12T06-46-37.023580.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-09-12T06-46-37.023580.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_09_12T06_46_37.023580
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-12T06-46-37.023580.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-12T06-46-37.023580.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-12T06-46-37.023580.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-12T06-46-37.023580.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-12T06-46-37.023580.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-12T06-46-37.023580.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-12T06-46-37.023580.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-12T06-46-37.023580.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-12T06-46-37.023580.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-12T06-46-37.023580.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-12T06-46-37.023580.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-12T06-46-37.023580.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-12T06-46-37.023580.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-12T06-46-37.023580.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-12T06-46-37.023580.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-12T06-46-37.023580.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-12T06-46-37.023580.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-12T06-46-37.023580.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-12T06-46-37.023580.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-12T06-46-37.023580.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-12T06-46-37.023580.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-12T06-46-37.023580.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-12T06-46-37.023580.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-12T06-46-37.023580.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-12T06-46-37.023580.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-12T06-46-37.023580.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-12T06-46-37.023580.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-12T06-46-37.023580.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-12T06-46-37.023580.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-12T06-46-37.023580.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-12T06-46-37.023580.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-12T06-46-37.023580.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-12T06-46-37.023580.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-12T06-46-37.023580.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-09-12T06-46-37.023580.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-12T06-46-37.023580.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-12T06-46-37.023580.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-12T06-46-37.023580.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-09-12T06-46-37.023580.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-09-12T06-46-37.023580.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-12T06-46-37.023580.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-12T06-46-37.023580.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-12T06-46-37.023580.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-12T06-46-37.023580.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-12T06-46-37.023580.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-12T06-46-37.023580.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-12T06-46-37.023580.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-12T06-46-37.023580.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-12T06-46-37.023580.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-12T06-46-37.023580.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-12T06-46-37.023580.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-12T06-46-37.023580.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-12T06-46-37.023580.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-09-12T06-46-37.023580.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-12T06-46-37.023580.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-09-12T06-46-37.023580.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-12T06-46-37.023580.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-12T06-46-37.023580.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-12T06-46-37.023580.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-12T06-46-37.023580.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-12T06-46-37.023580.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-12T06-46-37.023580.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-12T06-46-37.023580.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-12T06-46-37.023580.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-12T06-46-37.023580.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-12T06-46-37.023580.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-12T06-46-37.023580.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-12T06-46-37.023580.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-12T06-46-37.023580.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-12T06-46-37.023580.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-12T06-46-37.023580.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-12T06-46-37.023580.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-12T06-46-37.023580.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-12T06-46-37.023580.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-12T06-46-37.023580.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-12T06-46-37.023580.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-12T06-46-37.023580.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-12T06-46-37.023580.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-12T06-46-37.023580.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-12T06-46-37.023580.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-12T06-46-37.023580.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-12T06-46-37.023580.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-12T06-46-37.023580.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-12T06-46-37.023580.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-12T06-46-37.023580.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-12T06-46-37.023580.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-12T06-46-37.023580.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-12T06-46-37.023580.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-12T06-46-37.023580.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-12T06-46-37.023580.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-12T06-46-37.023580.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-09-12T06-46-37.023580.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-12T06-46-37.023580.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-12T06-46-37.023580.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-12T06-46-37.023580.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-09-12T06-46-37.023580.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-09-12T06-46-37.023580.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-12T06-46-37.023580.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-12T06-46-37.023580.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-12T06-46-37.023580.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-12T06-46-37.023580.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-12T06-46-37.023580.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-12T06-46-37.023580.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-12T06-46-37.023580.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-12T06-46-37.023580.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-12T06-46-37.023580.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-12T06-46-37.023580.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-12T06-46-37.023580.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-12T06-46-37.023580.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-12T06-46-37.023580.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-09-12T06-46-37.023580.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-12T06-46-37.023580.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-09-12T06-46-37.023580.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-12T06-46-37.023580.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_09_12T06_46_37.023580
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-12T06-46-37.023580.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-12T06-46-37.023580.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_09_12T06_46_37.023580
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-12T06-46-37.023580.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-12T06-46-37.023580.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_09_12T06_46_37.023580
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-12T06-46-37.023580.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-12T06-46-37.023580.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_09_12T06_46_37.023580
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-12T06-46-37.023580.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-12T06-46-37.023580.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_09_12T06_46_37.023580
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-12T06-46-37.023580.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-12T06-46-37.023580.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_09_12T06_46_37.023580
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-12T06-46-37.023580.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-12T06-46-37.023580.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_09_12T06_46_37.023580
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-12T06-46-37.023580.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-12T06-46-37.023580.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_09_12T06_46_37.023580
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-12T06-46-37.023580.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-12T06-46-37.023580.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_09_12T06_46_37.023580
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-12T06-46-37.023580.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-12T06-46-37.023580.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_09_12T06_46_37.023580
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-12T06-46-37.023580.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-12T06-46-37.023580.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_09_12T06_46_37.023580
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-12T06-46-37.023580.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-12T06-46-37.023580.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_09_12T06_46_37.023580
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-12T06-46-37.023580.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-12T06-46-37.023580.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_09_12T06_46_37.023580
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-12T06-46-37.023580.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-12T06-46-37.023580.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_09_12T06_46_37.023580
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-12T06-46-37.023580.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-12T06-46-37.023580.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_09_12T06_46_37.023580
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-12T06-46-37.023580.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-12T06-46-37.023580.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_09_12T06_46_37.023580
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-12T06-46-37.023580.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-12T06-46-37.023580.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_09_12T06_46_37.023580
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-12T06-46-37.023580.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-12T06-46-37.023580.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_09_12T06_46_37.023580
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-12T06-46-37.023580.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-12T06-46-37.023580.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_09_12T06_46_37.023580
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-12T06-46-37.023580.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-12T06-46-37.023580.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_09_12T06_46_37.023580
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-12T06-46-37.023580.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-12T06-46-37.023580.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_09_12T06_46_37.023580
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-12T06-46-37.023580.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-12T06-46-37.023580.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_09_12T06_46_37.023580
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-12T06-46-37.023580.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-12T06-46-37.023580.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_09_12T06_46_37.023580
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-12T06-46-37.023580.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-12T06-46-37.023580.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_09_12T06_46_37.023580
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-12T06-46-37.023580.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-12T06-46-37.023580.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_09_12T06_46_37.023580
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-12T06-46-37.023580.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-12T06-46-37.023580.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_09_12T06_46_37.023580
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-12T06-46-37.023580.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-12T06-46-37.023580.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_09_12T06_46_37.023580
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-12T06-46-37.023580.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-12T06-46-37.023580.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_09_12T06_46_37.023580
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-12T06-46-37.023580.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-12T06-46-37.023580.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_09_12T06_46_37.023580
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-12T06-46-37.023580.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-12T06-46-37.023580.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_09_12T06_46_37.023580
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-12T06-46-37.023580.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-12T06-46-37.023580.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_09_12T06_46_37.023580
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-12T06-46-37.023580.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-12T06-46-37.023580.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_09_12T06_46_37.023580
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-12T06-46-37.023580.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-12T06-46-37.023580.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_09_12T06_46_37.023580
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-12T06-46-37.023580.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-12T06-46-37.023580.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_09_12T06_46_37.023580
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-12T06-46-37.023580.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-12T06-46-37.023580.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_09_12T06_46_37.023580
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-09-12T06-46-37.023580.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-09-12T06-46-37.023580.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_09_12T06_46_37.023580
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-12T06-46-37.023580.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-12T06-46-37.023580.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_09_12T06_46_37.023580
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-12T06-46-37.023580.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-12T06-46-37.023580.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_09_12T06_46_37.023580
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-12T06-46-37.023580.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-12T06-46-37.023580.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_09_12T06_46_37.023580
path:
- '**/details_harness|hendrycksTest-management|5_2023-09-12T06-46-37.023580.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-09-12T06-46-37.023580.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_09_12T06_46_37.023580
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-09-12T06-46-37.023580.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-09-12T06-46-37.023580.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_09_12T06_46_37.023580
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-12T06-46-37.023580.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-12T06-46-37.023580.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_09_12T06_46_37.023580
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-12T06-46-37.023580.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-12T06-46-37.023580.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_09_12T06_46_37.023580
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-12T06-46-37.023580.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-12T06-46-37.023580.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_09_12T06_46_37.023580
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-12T06-46-37.023580.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-12T06-46-37.023580.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_09_12T06_46_37.023580
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-12T06-46-37.023580.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-12T06-46-37.023580.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_09_12T06_46_37.023580
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-12T06-46-37.023580.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-12T06-46-37.023580.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_09_12T06_46_37.023580
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-12T06-46-37.023580.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-12T06-46-37.023580.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_09_12T06_46_37.023580
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-12T06-46-37.023580.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-12T06-46-37.023580.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_09_12T06_46_37.023580
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-12T06-46-37.023580.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-12T06-46-37.023580.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_09_12T06_46_37.023580
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-12T06-46-37.023580.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-12T06-46-37.023580.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_09_12T06_46_37.023580
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-12T06-46-37.023580.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-12T06-46-37.023580.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_09_12T06_46_37.023580
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-12T06-46-37.023580.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-12T06-46-37.023580.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_09_12T06_46_37.023580
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-12T06-46-37.023580.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-12T06-46-37.023580.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_09_12T06_46_37.023580
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-09-12T06-46-37.023580.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-09-12T06-46-37.023580.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_09_12T06_46_37.023580
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-12T06-46-37.023580.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-12T06-46-37.023580.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_09_12T06_46_37.023580
path:
- '**/details_harness|hendrycksTest-virology|5_2023-09-12T06-46-37.023580.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-09-12T06-46-37.023580.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_09_12T06_46_37.023580
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-12T06-46-37.023580.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-12T06-46-37.023580.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_09_12T06_46_37.023580
path:
- '**/details_harness|truthfulqa:mc|0_2023-09-12T06-46-37.023580.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-09-12T06-46-37.023580.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_10_29T06_49_47.166294
path:
- '**/details_harness|winogrande|5_2023-10-29T06-49-47.166294.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-10-29T06-49-47.166294.parquet'
- config_name: results
data_files:
- split: 2023_09_12T06_46_37.023580
path:
- results_2023-09-12T06-46-37.023580.parquet
- split: 2023_10_29T06_49_47.166294
path:
- results_2023-10-29T06-49-47.166294.parquet
- split: latest
path:
- results_2023-10-29T06-49-47.166294.parquet
---
# Dataset Card for Evaluation run of Sao10K/Stheno-1.2-L2-13B
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/Sao10K/Stheno-1.2-L2-13B
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [Sao10K/Stheno-1.2-L2-13B](https://huggingface.co/Sao10K/Stheno-1.2-L2-13B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_Sao10K__Stheno-1.2-L2-13B",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2023-10-29T06:49:47.166294](https://huggingface.co/datasets/open-llm-leaderboard/details_Sao10K__Stheno-1.2-L2-13B/blob/main/results_2023-10-29T06-49-47.166294.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"em": 0.397126677852349,
"em_stderr": 0.005010917075875424,
"f1": 0.4671539429530222,
"f1_stderr": 0.0047944933216487965,
"acc": 0.42948814994019174,
"acc_stderr": 0.01038154947148015
},
"harness|drop|3": {
"em": 0.397126677852349,
"em_stderr": 0.005010917075875424,
"f1": 0.4671539429530222,
"f1_stderr": 0.0047944933216487965
},
"harness|gsm8k|5": {
"acc": 0.10917361637604246,
"acc_stderr": 0.008590089300511146
},
"harness|winogrande|5": {
"acc": 0.749802683504341,
"acc_stderr": 0.012173009642449155
}
}
```
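Task entries in the results dict follow the naming convention `harness|<task>|<n_fewshot>`, as shown above. A minimal sketch of looking up a metric by task name (the dict literal below is copied from the JSON above; the `task_accuracy` helper is illustrative, not part of the leaderboard tooling):

```python
# Parsed "latest results" as shown above; keys follow "harness|<task>|<n_fewshot>".
results = {
    "all": {"acc": 0.42948814994019174},
    "harness|gsm8k|5": {"acc": 0.10917361637604246},
    "harness|winogrande|5": {"acc": 0.749802683504341},
}

def task_accuracy(results: dict, task: str, fewshot: int) -> float:
    """Look up accuracy for a harness task by name and few-shot count."""
    return results[f"harness|{task}|{fewshot}"]["acc"]

print(round(task_accuracy(results, "winogrande", 5), 3))  # → 0.75
```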
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
cfierro/updates_data | ---
dataset_info:
features:
- name: query
struct:
- name: label
dtype: string
- name: objects
list:
- name: label
dtype: string
- name: qid
dtype: string
- name: qid
dtype: string
- name: rel_id
dtype: string
- name: relation
dtype: string
- name: prediction
struct:
- name: predictions
list:
- name: answer
dtype: string
- name: first_token_probability
dtype: float64
- name: per_token_probability
sequence: float64
- name: perplexity
dtype: float64
- name: query
dtype: string
- name: relation
dtype: string
- name: updates
sequence: string
splits:
- name: train
num_bytes: 1456957
num_examples: 5081
download_size: 602132
dataset_size: 1456957
---
# Dataset Card for "updates_data"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
kuanhuggingface/tencent_tts_speech_tokenizer | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
- split: test
path: data/test-*
dataset_info:
features:
- name: file_id
dtype: string
- name: instruction
dtype: string
- name: transcription
dtype: string
- name: src_speech_tokenizer_0
sequence: int64
- name: src_speech_tokenizer_1
sequence: int64
- name: src_speech_tokenizer_2
sequence: int64
- name: src_speech_tokenizer_3
sequence: int64
- name: src_speech_tokenizer_4
sequence: int64
- name: src_speech_tokenizer_5
sequence: int64
- name: src_speech_tokenizer_6
sequence: int64
- name: src_speech_tokenizer_7
sequence: int64
- name: tgt_speech_tokenizer_0
sequence: int64
- name: tgt_speech_tokenizer_1
sequence: int64
- name: tgt_speech_tokenizer_2
sequence: int64
- name: tgt_speech_tokenizer_3
sequence: int64
- name: tgt_speech_tokenizer_4
sequence: int64
- name: tgt_speech_tokenizer_5
sequence: int64
- name: tgt_speech_tokenizer_6
sequence: int64
- name: tgt_speech_tokenizer_7
sequence: int64
splits:
- name: train
num_bytes: 12405025340
num_examples: 266780
- name: validation
num_bytes: 352337364
num_examples: 7620
- name: test
num_bytes: 339358908
num_examples: 7620
download_size: 707880738
dataset_size: 13096721612
---
# Dataset Card for "tencent_tts_speech_tokenizer"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
bigcode/stack-dedup-alt-comments | ---
annotations_creators: []
language_creators:
- crowdsourced
language: ["code"]
multilinguality:
- multilingual
size_categories:
- unknown
source_datasets: []
task_categories:
- text-generation
task_ids:
- language-modeling
extra_gated_prompt: >-
## Terms of Use for The Stack
The Stack dataset is a collection of source code in over 300 programming
languages. We ask that you read and acknowledge the following points before
using the dataset:
1. The Stack is a collection of source code from repositories with various
licenses. Any use of all or part of the code gathered in The Stack must abide
by the terms of the original licenses, including attribution clauses when
relevant. We facilitate this by providing provenance information for each data
point.
2. The Stack is regularly updated to enact validated data removal requests. By
clicking on "Access repository", you agree to update your own version of The
Stack to the most recent usable version specified by the maintainers in [the
following
thread](https://huggingface.co/datasets/bigcode/the-stack/discussions/7). If
you have questions about dataset versions and allowed uses, please also ask
them in the dataset’s [community
discussions](https://huggingface.co/datasets/bigcode/the-stack/discussions/new).
We will also notify users via email when the latest usable version changes.
3. To host, share, or otherwise provide access to The Stack dataset, you must
include [these Terms of
Use](https://huggingface.co/datasets/bigcode/the-stack#terms-of-use-for-the-stack)
and require users to agree to it.
By clicking on "Access repository" below, you accept that your contact
information (email address and username) can be shared with the dataset
maintainers as well.
extra_gated_fields:
Email: text
I have read the License and agree with its terms: checkbox
---
## Dataset Description
This is the Python, Java, and JavaScript subset of The Stack (v1.1) after cleaning* and aggressive deduplication from [stack-dedup-alt-decontaminate](https://huggingface.co/datasets/bigcode/stack-dedup-alt-decontaminate), with
filtering on [comment-to-code ratio](https://github.com/bigcode-project/bigcode-dataset/tree/main/preprocessing) (minimum 0.01, maximum 0.8).
The additional comment filtering removes 26.5% of the dataset's volume, reducing it from 215GB of text to 170GB.
(*) cleaning: near deduplication + PII redaction + line length & percentage of alphanumeric characters filtering + data decontamination
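The min/max comment-ratio filter can be sketched as follows. This is a simplified illustration, not the actual bigcode preprocessing code: here the ratio is approximated as the fraction of characters on line-comment lines, and only `#`/`//` prefixes are detected (the real pipeline parses comments and docstrings per language).

```python
def comment_to_code_ratio(source: str, comment_prefixes=("#", "//")) -> float:
    """Rough fraction of characters that belong to line comments."""
    total = comment_chars = 0
    for line in source.splitlines():
        total += len(line)
        if line.strip().startswith(comment_prefixes):
            comment_chars += len(line)
    return comment_chars / total if total else 0.0

def keep_file(source: str, low: float = 0.01, high: float = 0.8) -> bool:
    """Keep files whose comment-to-code ratio falls within [low, high]."""
    return low <= comment_to_code_ratio(source) <= high

print(keep_file("x = 1\ny = 2\n"))  # no comments -> ratio 0.0 -> False
```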
|
k0ntra/gesti | ---
dataset_info:
features:
- name: '0'
dtype: float32
- name: '1'
dtype: float32
- name: '2'
dtype: float32
- name: '3'
dtype: float32
- name: '4'
dtype: float32
- name: '5'
dtype: float32
- name: '6'
dtype: float32
- name: '7'
dtype: float32
- name: '8'
dtype: float32
- name: '9'
dtype: float32
- name: '10'
dtype: float32
- name: '11'
dtype: float32
- name: '12'
dtype: float32
- name: '13'
dtype: float32
- name: '14'
dtype: float32
- name: '15'
dtype: float32
- name: '16'
dtype: float32
- name: '17'
dtype: float32
- name: '18'
dtype: float32
- name: '19'
dtype: float32
- name: '20'
dtype: float32
- name: '21'
dtype: float32
- name: '22'
dtype: float32
- name: '23'
dtype: float32
- name: '24'
dtype: float32
- name: '25'
dtype: float32
- name: '26'
dtype: float32
- name: '27'
dtype: float32
- name: '28'
dtype: float32
- name: '29'
dtype: float32
- name: '30'
dtype: float32
- name: '31'
dtype: float32
- name: '32'
dtype: float32
- name: '33'
dtype: float32
- name: '34'
dtype: float32
- name: '35'
dtype: float32
- name: '36'
dtype: float32
- name: '37'
dtype: float32
- name: '38'
dtype: float32
- name: '39'
dtype: float32
- name: '40'
dtype: float32
- name: '41'
dtype: float32
- name: '42'
dtype: float32
- name: '43'
dtype: float32
- name: '44'
dtype: float32
- name: '45'
dtype: float32
- name: '46'
dtype: float32
- name: '47'
dtype: float32
- name: '48'
dtype: float32
- name: '49'
dtype: float32
- name: '50'
dtype: float32
- name: '51'
dtype: float32
- name: '52'
dtype: float32
- name: '53'
dtype: float32
- name: '54'
dtype: float32
- name: '55'
dtype: float32
- name: '56'
dtype: float32
- name: '57'
dtype: float32
- name: '58'
dtype: float32
- name: '59'
dtype: float32
- name: '60'
dtype: float32
- name: '61'
dtype: float32
- name: '62'
dtype: float32
- name: '63'
dtype: float32
- name: '64'
dtype: float32
- name: '65'
dtype: float32
- name: '66'
dtype: float32
- name: '67'
dtype: float32
- name: '68'
dtype: float32
- name: '69'
dtype: float32
- name: '70'
dtype: float32
- name: '71'
dtype: float32
- name: '72'
dtype: float32
- name: '73'
dtype: float32
- name: '74'
dtype: float32
- name: '75'
dtype: float32
- name: '76'
dtype: float32
- name: '77'
dtype: float32
- name: '78'
dtype: float32
- name: '79'
dtype: float32
- name: '80'
dtype: float32
- name: '81'
dtype: float32
- name: '82'
dtype: float32
- name: '83'
dtype: float32
- name: '84'
dtype: float32
- name: '85'
dtype: float32
- name: '86'
dtype: float32
- name: '87'
dtype: float32
- name: '88'
dtype: float32
- name: '89'
dtype: float32
- name: '90'
dtype: float32
- name: '91'
dtype: float32
- name: '92'
dtype: float32
- name: '93'
dtype: float32
- name: '94'
dtype: float32
- name: '95'
dtype: float32
- name: '96'
dtype: float32
- name: '97'
dtype: float32
- name: '98'
dtype: float32
- name: '99'
dtype: float32
- name: '100'
dtype: float32
- name: '101'
dtype: float32
- name: '102'
dtype: float32
- name: '103'
dtype: float32
- name: '104'
dtype: float32
- name: '105'
dtype: float32
- name: '106'
dtype: float32
- name: '107'
dtype: float32
- name: '108'
dtype: float32
- name: '109'
dtype: float32
- name: '110'
dtype: float32
- name: '111'
dtype: float32
- name: '112'
dtype: float32
- name: '113'
dtype: float32
- name: '114'
dtype: float32
- name: '115'
dtype: float32
- name: '116'
dtype: float32
- name: '117'
dtype: float32
- name: '118'
dtype: float32
- name: '119'
dtype: float32
- name: '120'
dtype: float32
- name: '121'
dtype: float32
- name: '122'
dtype: float32
- name: '123'
dtype: float32
- name: '124'
dtype: float32
- name: '125'
dtype: float32
- name: '126'
dtype: float32
- name: '127'
dtype: float32
- name: '128'
dtype: float32
- name: '129'
dtype: float32
- name: '130'
dtype: float32
- name: '131'
dtype: float32
- name: '132'
dtype: float32
- name: '133'
dtype: float32
- name: '134'
dtype: float32
- name: '135'
dtype: float32
- name: '136'
dtype: float32
- name: '137'
dtype: float32
- name: '138'
dtype: float32
- name: '139'
dtype: float32
- name: '140'
dtype: float32
- name: '141'
dtype: float32
- name: '142'
dtype: float32
- name: '143'
dtype: float32
- name: '144'
dtype: float32
- name: '145'
dtype: float32
- name: '146'
dtype: float32
- name: '147'
dtype: float32
- name: '148'
dtype: float32
- name: '149'
dtype: float32
- name: '150'
dtype: float32
- name: '151'
dtype: float32
- name: '152'
dtype: float32
- name: '153'
dtype: float32
- name: '154'
dtype: float32
- name: '155'
dtype: float32
- name: '156'
dtype: float32
- name: '157'
dtype: float32
- name: '158'
dtype: float32
- name: '159'
dtype: float32
- name: '160'
dtype: float32
- name: '161'
dtype: float32
- name: '162'
dtype: float32
- name: '163'
dtype: float32
- name: '164'
dtype: float32
- name: '165'
dtype: float32
- name: '166'
dtype: float32
- name: '167'
dtype: float32
- name: '168'
dtype: float32
- name: '169'
dtype: float32
- name: '170'
dtype: float32
- name: '171'
dtype: float32
- name: '172'
dtype: float32
- name: '173'
dtype: float32
- name: '174'
dtype: float32
- name: '175'
dtype: float32
- name: '176'
dtype: float32
- name: '177'
dtype: float32
- name: '178'
dtype: float32
- name: '179'
dtype: float32
- name: '180'
dtype: float32
- name: '181'
dtype: float32
- name: '182'
dtype: float32
- name: '183'
dtype: float32
- name: '184'
dtype: float32
- name: '185'
dtype: float32
- name: '186'
dtype: float32
- name: '187'
dtype: float32
- name: '188'
dtype: float32
- name: '189'
dtype: float32
- name: '190'
dtype: float32
- name: '191'
dtype: float32
- name: '192'
dtype: float32
- name: '193'
dtype: float32
- name: '194'
dtype: float32
- name: '195'
dtype: float32
- name: '196'
dtype: float32
- name: '197'
dtype: float32
- name: '198'
dtype: float32
- name: '199'
dtype: float32
- name: '200'
dtype: float32
- name: '201'
dtype: float32
- name: '202'
dtype: float32
- name: '203'
dtype: float32
- name: '204'
dtype: float32
- name: '205'
dtype: float32
- name: '206'
dtype: float32
- name: '207'
dtype: float32
- name: '208'
dtype: float32
- name: '209'
dtype: float32
- name: '210'
dtype: float32
- name: '211'
dtype: float32
- name: '212'
dtype: float32
- name: '213'
dtype: float32
- name: '214'
dtype: float32
- name: '215'
dtype: float32
- name: '216'
dtype: float32
- name: '217'
dtype: float32
- name: '218'
dtype: float32
- name: '219'
dtype: float32
- name: '220'
dtype: float32
- name: '221'
dtype: float32
- name: '222'
dtype: float32
- name: '223'
dtype: float32
- name: '224'
dtype: float32
- name: '225'
dtype: float32
- name: '226'
dtype: float32
- name: '227'
dtype: float32
- name: '228'
dtype: float32
- name: '229'
dtype: float32
- name: '230'
dtype: float32
- name: '231'
dtype: float32
- name: '232'
dtype: float32
- name: '233'
dtype: float32
- name: '234'
dtype: float32
- name: '235'
dtype: float32
- name: '236'
dtype: float32
- name: '237'
dtype: float32
- name: '238'
dtype: float32
- name: '239'
dtype: float32
- name: '240'
dtype: float32
- name: '241'
dtype: float32
- name: '242'
dtype: float32
- name: '243'
dtype: float32
- name: '244'
dtype: float32
- name: '245'
dtype: float32
- name: '246'
dtype: float32
- name: '247'
dtype: float32
- name: '248'
dtype: float32
- name: '249'
dtype: float32
- name: '250'
dtype: float32
- name: '251'
dtype: float32
- name: '252'
dtype: float32
- name: '253'
dtype: float32
- name: '254'
dtype: float32
- name: '255'
dtype: float32
- name: '256'
dtype: float32
- name: '257'
dtype: float32
- name: '258'
dtype: float32
- name: '259'
dtype: float32
- name: '260'
dtype: float32
- name: '261'
dtype: float32
- name: '262'
dtype: float32
- name: '263'
dtype: float32
- name: '264'
dtype: float32
- name: '265'
dtype: float32
- name: '266'
dtype: float32
- name: '267'
dtype: float32
- name: '268'
dtype: float32
- name: '269'
dtype: float32
- name: '270'
dtype: float32
- name: '271'
dtype: float32
- name: '272'
dtype: float32
- name: '273'
dtype: float32
- name: '274'
dtype: float32
- name: '275'
dtype: float32
- name: '276'
dtype: float32
- name: '277'
dtype: float32
- name: '278'
dtype: float32
- name: '279'
dtype: float32
- name: '280'
dtype: float32
- name: '281'
dtype: float32
- name: '282'
dtype: float32
- name: '283'
dtype: float32
- name: '284'
dtype: float32
- name: '285'
dtype: float32
- name: '286'
dtype: float32
- name: '287'
dtype: float32
- name: '288'
dtype: float32
- name: '289'
dtype: float32
- name: '290'
dtype: float32
- name: '291'
dtype: float32
- name: '292'
dtype: float32
- name: '293'
dtype: float32
- name: '294'
dtype: float32
- name: '295'
dtype: float32
- name: '296'
dtype: float32
- name: '297'
dtype: float32
- name: '298'
dtype: float32
- name: '299'
dtype: float32
- name: '300'
dtype: float32
- name: '301'
dtype: float32
- name: '302'
dtype: float32
- name: '303'
dtype: float32
- name: '304'
dtype: float32
- name: '305'
dtype: float32
- name: '306'
dtype: float32
- name: '307'
dtype: float32
- name: '308'
dtype: float32
- name: '309'
dtype: float32
- name: '310'
dtype: float32
- name: '311'
dtype: float32
- name: '312'
dtype: float32
- name: '313'
dtype: float32
- name: '314'
dtype: float32
- name: '315'
dtype: float32
- name: '316'
dtype: float32
- name: '317'
dtype: float32
- name: '318'
dtype: float32
- name: '319'
dtype: float32
- name: '320'
dtype: float32
- name: '321'
dtype: float32
- name: '322'
dtype: float32
- name: '323'
dtype: float32
- name: '324'
dtype: float32
- name: '325'
dtype: float32
- name: '326'
dtype: float32
- name: '327'
dtype: float32
- name: '328'
dtype: float32
- name: '329'
dtype: float32
- name: '330'
dtype: float32
- name: '331'
dtype: float32
- name: '332'
dtype: float32
- name: '333'
dtype: float32
- name: '334'
dtype: float32
- name: '335'
dtype: float32
- name: '336'
dtype: float32
- name: '337'
dtype: float32
- name: '338'
dtype: float32
- name: '339'
dtype: float32
- name: '340'
dtype: float32
- name: '341'
dtype: float32
- name: '342'
dtype: float32
- name: '343'
dtype: float32
- name: '344'
dtype: float32
- name: '345'
dtype: float32
- name: '346'
dtype: float32
- name: '347'
dtype: float32
- name: '348'
dtype: float32
- name: '349'
dtype: float32
- name: '350'
dtype: float32
- name: '351'
dtype: float32
- name: '352'
dtype: float32
- name: '353'
dtype: float32
- name: '354'
dtype: float32
- name: '355'
dtype: float32
- name: '356'
dtype: float32
- name: '357'
dtype: float32
- name: '358'
dtype: float32
- name: '359'
dtype: float32
- name: '360'
dtype: float32
- name: '361'
dtype: float32
- name: '362'
dtype: float32
- name: '363'
dtype: float32
- name: '364'
dtype: float32
- name: '365'
dtype: float32
- name: '366'
dtype: float32
- name: '367'
dtype: float32
- name: '368'
dtype: float32
- name: '369'
dtype: float32
- name: '370'
dtype: float32
- name: '371'
dtype: float32
- name: '372'
dtype: float32
- name: '373'
dtype: float32
- name: '374'
dtype: float32
- name: '375'
dtype: float32
- name: '376'
dtype: float32
- name: '377'
dtype: float32
- name: '378'
dtype: float32
- name: '379'
dtype: float32
- name: '380'
dtype: float32
- name: '381'
dtype: float32
- name: '382'
dtype: float32
- name: '383'
dtype: float32
- name: '384'
dtype: float32
- name: '385'
dtype: float32
- name: '386'
dtype: float32
- name: '387'
dtype: float32
- name: '388'
dtype: float32
- name: '389'
dtype: float32
- name: '390'
dtype: float32
- name: '391'
dtype: float32
- name: '392'
dtype: float32
- name: '393'
dtype: float32
- name: '394'
dtype: float32
- name: '395'
dtype: float32
- name: '396'
dtype: float32
- name: '397'
dtype: float32
- name: '398'
dtype: float32
- name: '399'
dtype: float32
- name: '400'
dtype: float32
- name: '401'
dtype: float32
- name: '402'
dtype: float32
- name: '403'
dtype: float32
- name: '404'
dtype: float32
- name: '405'
dtype: float32
- name: '406'
dtype: float32
- name: '407'
dtype: float32
- name: '408'
dtype: float32
- name: '409'
dtype: float32
- name: '410'
dtype: float32
- name: '411'
dtype: float32
- name: '412'
dtype: float32
- name: '413'
dtype: float32
- name: '414'
dtype: float32
- name: '415'
dtype: float32
- name: '416'
dtype: float32
- name: '417'
dtype: float32
- name: '418'
dtype: float32
- name: '419'
dtype: float32
- name: '420'
dtype: float32
- name: '421'
dtype: float32
- name: '422'
dtype: float32
- name: '423'
dtype: float32
- name: '424'
dtype: float32
- name: '425'
dtype: float32
- name: '426'
dtype: float32
- name: '427'
dtype: float32
- name: '428'
dtype: float32
- name: '429'
dtype: float32
- name: '430'
dtype: float32
- name: '431'
dtype: float32
- name: '432'
dtype: float32
- name: '433'
dtype: float32
- name: '434'
dtype: float32
- name: '435'
dtype: float32
- name: '436'
dtype: float32
- name: '437'
dtype: float32
- name: '438'
dtype: float32
- name: '439'
dtype: float32
- name: '440'
dtype: float32
- name: '441'
dtype: float32
- name: '442'
dtype: float32
- name: '443'
dtype: float32
- name: '444'
dtype: float32
- name: '445'
dtype: float32
- name: '446'
dtype: float32
- name: '447'
dtype: float32
- name: '448'
dtype: float32
- name: '449'
dtype: float32
- name: '450'
dtype: float32
- name: '451'
dtype: float32
- name: '452'
dtype: float32
- name: '453'
dtype: float32
- name: '454'
dtype: float32
- name: '455'
dtype: float32
- name: '456'
dtype: float32
- name: '457'
dtype: float32
- name: '458'
dtype: float32
- name: '459'
dtype: float32
- name: '460'
dtype: float32
- name: '461'
dtype: float32
- name: '462'
dtype: float32
- name: '463'
dtype: float32
- name: '464'
dtype: float32
- name: '465'
dtype: float32
- name: '466'
dtype: float32
- name: '467'
dtype: float32
- name: '468'
dtype: float32
- name: '469'
dtype: float32
- name: '470'
dtype: float32
- name: '471'
dtype: float32
- name: '472'
dtype: float32
- name: '473'
dtype: float32
- name: '474'
dtype: float32
- name: '475'
dtype: float32
- name: '476'
dtype: float32
- name: '477'
dtype: float32
- name: '478'
dtype: float32
- name: '479'
dtype: float32
- name: '480'
dtype: float32
- name: '481'
dtype: float32
- name: '482'
dtype: float32
- name: '483'
dtype: float32
- name: '484'
dtype: float32
- name: '485'
dtype: float32
- name: '486'
dtype: float32
- name: '487'
dtype: float32
- name: '488'
dtype: float32
- name: '489'
dtype: float32
- name: '490'
dtype: float32
- name: '491'
dtype: float32
- name: '492'
dtype: float32
- name: '493'
dtype: float32
- name: '494'
dtype: float32
- name: '495'
dtype: float32
- name: '496'
dtype: float32
- name: '497'
dtype: float32
- name: '498'
dtype: float32
- name: '499'
dtype: float32
- name: '500'
dtype: float32
- name: '501'
dtype: float32
- name: '502'
dtype: float32
- name: '503'
dtype: float32
- name: '504'
dtype: float32
- name: '505'
dtype: float32
- name: '506'
dtype: float32
- name: '507'
dtype: float32
- name: '508'
dtype: float32
- name: '509'
dtype: float32
- name: '510'
dtype: float32
- name: '511'
dtype: float32
- name: '512'
dtype: float32
- name: '513'
dtype: float32
- name: '514'
dtype: float32
- name: '515'
dtype: float32
- name: '516'
dtype: float32
- name: '517'
dtype: float32
- name: '518'
dtype: float32
- name: '519'
dtype: float32
- name: '520'
dtype: float32
- name: '521'
dtype: float32
- name: '522'
dtype: float32
- name: '523'
dtype: float32
- name: '524'
dtype: float32
- name: '525'
dtype: float32
- name: '526'
dtype: float32
- name: '527'
dtype: float32
- name: '528'
dtype: float32
- name: '529'
dtype: float32
- name: '530'
dtype: float32
- name: '531'
dtype: float32
- name: '532'
dtype: float32
- name: '533'
dtype: float32
- name: '534'
dtype: float32
- name: '535'
dtype: float32
- name: '536'
dtype: float32
- name: '537'
dtype: float32
- name: '538'
dtype: float32
- name: '539'
dtype: float32
- name: '540'
dtype: float32
- name: '541'
dtype: float32
- name: '542'
dtype: float32
- name: '543'
dtype: float32
- name: '544'
dtype: float32
- name: '545'
dtype: float32
- name: '546'
dtype: float32
- name: '547'
dtype: float32
- name: '548'
dtype: float32
- name: '549'
dtype: float32
- name: '550'
dtype: float32
- name: '551'
dtype: float32
- name: '552'
dtype: float32
- name: '553'
dtype: float32
- name: '554'
dtype: float32
- name: '555'
dtype: float32
- name: '556'
dtype: float32
- name: '557'
dtype: float32
- name: '558'
dtype: float32
- name: '559'
dtype: float32
- name: '560'
dtype: float32
- name: '561'
dtype: float32
- name: '562'
dtype: float32
- name: '563'
dtype: float32
- name: '564'
dtype: float32
- name: '565'
dtype: float32
- name: '566'
dtype: float32
- name: '567'
dtype: float32
- name: '568'
dtype: float32
- name: '569'
dtype: float32
- name: '570'
dtype: float32
- name: '571'
dtype: float32
- name: '572'
dtype: float32
- name: '573'
dtype: float32
- name: '574'
dtype: float32
- name: '575'
dtype: float32
- name: '576'
dtype: float32
- name: '577'
dtype: float32
- name: '578'
dtype: float32
- name: '579'
dtype: float32
- name: '580'
dtype: float32
- name: '581'
dtype: float32
- name: '582'
dtype: float32
- name: '583'
dtype: float32
- name: '584'
dtype: float32
- name: '585'
dtype: float32
- name: '586'
dtype: float32
- name: '587'
dtype: float32
- name: '588'
dtype: float32
- name: '589'
dtype: float32
- name: '590'
dtype: float32
- name: '591'
dtype: float32
- name: '592'
dtype: float32
- name: '593'
dtype: float32
- name: '594'
dtype: float32
- name: '595'
dtype: float32
- name: '596'
dtype: float32
- name: '597'
dtype: float32
- name: '598'
dtype: float32
- name: '599'
dtype: float32
- name: '600'
dtype: float32
- name: '601'
dtype: float32
- name: '602'
dtype: float32
- name: '603'
dtype: float32
- name: '604'
dtype: float32
- name: '605'
dtype: float32
- name: '606'
dtype: float32
- name: '607'
dtype: float32
- name: '608'
dtype: float32
- name: '609'
dtype: float32
- name: '610'
dtype: float32
- name: '611'
dtype: float32
- name: '612'
dtype: float32
- name: '613'
dtype: float32
- name: '614'
dtype: float32
- name: '615'
dtype: float32
- name: '616'
dtype: float32
- name: '617'
dtype: float32
- name: '618'
dtype: float32
- name: '619'
dtype: float32
- name: '620'
dtype: float32
- name: '621'
dtype: float32
- name: '622'
dtype: float32
- name: '623'
dtype: float32
- name: '624'
dtype: float32
- name: '625'
dtype: float32
- name: '626'
dtype: float32
- name: '627'
dtype: float32
- name: '628'
dtype: float32
- name: '629'
dtype: float32
- name: '630'
dtype: float32
- name: '631'
dtype: float32
- name: '632'
dtype: float32
- name: '633'
dtype: float32
- name: '634'
dtype: float32
- name: '635'
dtype: float32
- name: '636'
dtype: float32
- name: '637'
dtype: float32
- name: '638'
dtype: float32
- name: '639'
dtype: float32
- name: '640'
dtype: float32
- name: '641'
dtype: float32
- name: '642'
dtype: float32
- name: '643'
dtype: float32
- name: '644'
dtype: float32
- name: '645'
dtype: float32
- name: '646'
dtype: float32
- name: '647'
dtype: float32
- name: '648'
dtype: float32
- name: '649'
dtype: float32
- name: '650'
dtype: float32
- name: '651'
dtype: float32
- name: '652'
dtype: float32
- name: '653'
dtype: float32
- name: '654'
dtype: float32
- name: '655'
dtype: float32
- name: '656'
dtype: float32
- name: '657'
dtype: float32
- name: '658'
dtype: float32
- name: '659'
dtype: float32
- name: '660'
dtype: float32
- name: '661'
dtype: float32
- name: '662'
dtype: float32
- name: '663'
dtype: float32
- name: '664'
dtype: float32
- name: '665'
dtype: float32
- name: '666'
dtype: float32
- name: '667'
dtype: float32
- name: '668'
dtype: float32
- name: '669'
dtype: float32
- name: '670'
dtype: float32
- name: '671'
dtype: float32
- name: '672'
dtype: float32
- name: '673'
dtype: float32
- name: '674'
dtype: float32
- name: '675'
dtype: float32
- name: '676'
dtype: float32
- name: '677'
dtype: float32
- name: '678'
dtype: float32
- name: '679'
dtype: float32
- name: '680'
dtype: float32
- name: '681'
dtype: float32
- name: '682'
dtype: float32
- name: '683'
dtype: float32
- name: '684'
dtype: float32
- name: '685'
dtype: float32
- name: '686'
dtype: float32
- name: '687'
dtype: float32
- name: '688'
dtype: float32
- name: '689'
dtype: float32
- name: '690'
dtype: float32
- name: '691'
dtype: float32
- name: '692'
dtype: float32
- name: '693'
dtype: float32
- name: '694'
dtype: float32
- name: '695'
dtype: float32
- name: '696'
dtype: float32
- name: '697'
dtype: float32
- name: '698'
dtype: float32
- name: '699'
dtype: float32
- name: '700'
dtype: float32
- name: '701'
dtype: float32
- name: '702'
dtype: float32
- name: '703'
dtype: float32
- name: '704'
dtype: float32
- name: '705'
dtype: float32
- name: '706'
dtype: float32
- name: '707'
dtype: float32
- name: '708'
dtype: float32
- name: '709'
dtype: float32
- name: '710'
dtype: float32
- name: '711'
dtype: float32
- name: '712'
dtype: float32
- name: '713'
dtype: float32
- name: '714'
dtype: float32
- name: '715'
dtype: float32
- name: '716'
dtype: float32
- name: '717'
dtype: float32
- name: '718'
dtype: float32
- name: '719'
dtype: float32
- name: '720'
dtype: float32
- name: '721'
dtype: float32
- name: '722'
dtype: float32
- name: '723'
dtype: float32
- name: '724'
dtype: float32
- name: '725'
dtype: float32
- name: '726'
dtype: float32
- name: '727'
dtype: float32
- name: '728'
dtype: float32
- name: '729'
dtype: float32
- name: '730'
dtype: float32
- name: '731'
dtype: float32
- name: '732'
dtype: float32
- name: '733'
dtype: float32
- name: '734'
dtype: float32
- name: '735'
dtype: float32
- name: '736'
dtype: float32
- name: '737'
dtype: float32
- name: '738'
dtype: float32
- name: '739'
dtype: float32
- name: '740'
dtype: float32
- name: '741'
dtype: float32
- name: '742'
dtype: float32
- name: '743'
dtype: float32
- name: '744'
dtype: float32
- name: '745'
dtype: float32
- name: '746'
dtype: float32
- name: '747'
dtype: float32
- name: '748'
dtype: float32
- name: '749'
dtype: float32
- name: '750'
dtype: float32
- name: '751'
dtype: float32
- name: '752'
dtype: float32
- name: '753'
dtype: float32
- name: '754'
dtype: float32
- name: '755'
dtype: float32
- name: '756'
dtype: float32
- name: '757'
dtype: float32
- name: '758'
dtype: float32
- name: '759'
dtype: float32
- name: '760'
dtype: float32
- name: '761'
dtype: float32
- name: '762'
dtype: float32
- name: '763'
dtype: float32
- name: '764'
dtype: float32
- name: '765'
dtype: float32
- name: '766'
dtype: float32
- name: '767'
dtype: float32
splits:
- name: train
num_bytes: 9216
num_examples: 3
download_size: 329968
dataset_size: 9216
---
# Dataset Card for "gesti"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
zh-tw-llm-dv/zh-tw-pythia-ta8000-v1-e1-tr_sg-301-c1024-sbldt3 | ---
dataset_info:
dataset_size: 54222354.81927678
download_size: 15275408
features:
- name: input_ids
sequence: int32
- name: attention_mask
sequence: int8
- name: labels
sequence: int64
- dtype: string
name: preview
- dtype: int64
name: length
splits:
- name: train
num_bytes: 53306765.02927678
num_examples: 6155
- name: test
num_bytes: 915589.79
num_examples: 97
---
# zh-tw-pythia-ta8000-v1-e1-tr_sg-301-c1024-sbldt3
This dataset is a part of the `zh-tw-llm` project.
* Tokenizer: `zh-tw-pythia-tokenizer-a8000-v1`
* Built with: `sharegpt`
* Rows: `train` `6155`, `test` `97`
* Max length: `1024`
* Full config:
```json
{
  "build_with": ["sharegpt"],
  "preview_length": 128,
  "sort_by": "length-desc",
  "translations_settings": {
    "source_dataset": "zetavg/coct-en-zh-tw-translations-twp-300k",
    "lang_1_key": "en",
    "lang_2_key": "ch",
    "templates": [
      "English: {lang_1}\nChinese: {lang_2}",
      "Chinese: {lang_2}\nEnglish: {lang_1}"
    ],
    "use_template": "random",
    "rows_limit": 300000,
    "test_size": 100,
    "test_split_seed": 42
  },
  "sharegpt_settings": {
    "source_dataset": "zetavg/ShareGPT-Processed",
    "train_on_inputs": false,
    "languages": [{"en": 0.4}, "zh_Hant"],
    "rows_limit": 8000,
    "test_size": 0.02,
    "test_split_seed": 42,
    "test_rows_limit": 100
  }
}
``` |
CosthanzoCloro/polarizellama | ---
license: apache-2.0
---
|
ConvLab/tm3 | ---
language:
- en
license:
- cc-by-4.0
multilinguality:
- monolingual
pretty_name: Taskmaster-3
size_categories:
- 10K<n<100K
task_categories:
- conversational
---
# Dataset Card for Taskmaster-3
- **Repository:** https://github.com/google-research-datasets/Taskmaster/tree/master/TM-3-2020
- **Paper:** https://aclanthology.org/2021.acl-long.55.pdf
- **Leaderboard:** None
- **Who transforms the dataset:** Qi Zhu (zhuq96 at gmail dot com)
To use this dataset, you need to install [ConvLab-3](https://github.com/ConvLab/ConvLab-3) platform first. Then you can load the dataset via:
```python
from convlab.util import load_dataset, load_ontology, load_database
dataset = load_dataset('tm3')
ontology = load_ontology('tm3')
database = load_database('tm3')
```
For more usage please refer to [here](https://github.com/ConvLab/ConvLab-3/tree/master/data/unified_datasets).
### Dataset Summary
The Taskmaster-3 (aka TicketTalk) dataset consists of 23,789 movie ticketing dialogs (located in Taskmaster/TM-3-2020/data/). By "movie ticketing" we mean conversations where the customer's goal is to purchase tickets after deciding on theater, time, movie name, number of tickets, and date, or opt out of the transaction.
This collection was created using the "self-dialog" method. This means a single crowd-sourced worker is paid to create a conversation, writing turns for both speakers, i.e. the customer and the ticketing agent. In order to gather a wide range of conversational scenarios and linguistic phenomena, workers were given both open-ended as well as highly structured conversational tasks. In all, we used over three dozen sets of instructions while building this corpus. The "instructions" field in data.json provides the exact scenario workers were given to complete each dialog.

In this way, conversations involve a wide variety of paths, from those where the customer decides on a movie based on genre, their location, current releases, or from what they already have in mind. In addition, dialogs also include error handling with respect to repair (e.g. "No, I said Tom Cruise."), clarifications (e.g. "Sorry. Did you want the AMC 16 or Century City 16?") and other common conversational hiccups.

In some cases instructions are completely open-ended, e.g. "Pretend you are taking your friend to a movie in Salem, Oregon. Create a conversation where you end up buying two tickets after finding out what is playing in at least two local theaters. Make sure the ticket purchase includes a confirmation of the details by the agent before the purchase, including date, time, movie, theater, and number of tickets." In other cases we restrict the conversational content and structure by offering a partially completed conversation that the workers must finalize or fill in based on certain parameters. These partially completed dialogs are labeled "Auto template" in the "scenario" field shown for each conversation in the data.json file. In some cases, we provided a small KB from which workers would choose movies, theaters, etc., but in most cases (pre-pandemic) workers were told to use the internet to get accurate current details for their dialogs. In any case, all relevant entities are annotated.
- **How to get the transformed data from original data:**
- Download [master.zip](https://github.com/google-research-datasets/Taskmaster/archive/refs/heads/master.zip).
- Run `python preprocess.py` in the current directory.
- **Main changes of the transformation:**
- Remove dialogs that are empty or only contain one speaker.
- Split each domain dialogs into train/validation/test randomly (8:1:1).
- Merge continuous turns by the same speaker (ignore repeated turns).
- Annotate `dialogue acts` according to the original segment annotations. Add `intent` annotation (`==inform`). The type of `dialogue act` is set to `non-categorical` if the `slot` is not `description.other` or `description.plot`. Otherwise, the type is set to `binary` (and the `value` is empty). If there are multiple spans overlapping, we only keep the shortest one, since we found that this simple strategy can reduce the noise in annotation.
- Add `domain` and `intent` descriptions.
- Rename `api` to `db_results`.
- Add `state` by accumulate `non-categorical dialogue acts` in the order that they appear.
- **Annotations:**
- dialogue acts, state, db_results.
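The "merge continuous turns" step described above can be sketched as follows. This is a minimal illustration, not the actual preprocessing script, and the `speaker`/`utterance` field names are assumptions:

```python
# Hypothetical sketch of the "merge continuous turns" transformation:
# consecutive turns by the same speaker are merged into one, and exact
# repeats are ignored.
def merge_continuous_turns(turns):
    merged = []
    for turn in turns:
        if merged and merged[-1]["speaker"] == turn["speaker"]:
            if turn["utterance"] != merged[-1]["utterance"]:  # skip repeated turns
                merged[-1]["utterance"] += " " + turn["utterance"]
        else:
            merged.append(dict(turn))
    return merged

dialog = [
    {"speaker": "user", "utterance": "Hi."},
    {"speaker": "user", "utterance": "Hi."},  # repeated turn, dropped
    {"speaker": "user", "utterance": "Two tickets for Dune, please."},
    {"speaker": "system", "utterance": "Which theater?"},
]
print(merge_continuous_turns(dialog))
```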
### Supported Tasks and Leaderboards
NLU, DST, Policy, NLG, E2E
### Languages
English
### Data Splits
| split | dialogues | utterances | avg_utt | avg_tokens | avg_domains | cat slot match(state) | cat slot match(goal) | cat slot match(dialogue act) | non-cat slot span(dialogue act) |
|------------|-------------|--------------|-----------|--------------|---------------|-------------------------|------------------------|--------------------------------|-----------------------------------|
| train | 18997 | 380646 | 20.04 | 10.48 | 1 | - | - | - | 100 |
| validation | 2380 | 47531 | 19.97 | 10.38 | 1 | - | - | - | 100 |
| test       | 2380        | 48849        | 20.52     | 10.12        | 1             | -                       | -                      | -                              | 100                               |
| all        | 23757      | 477026       | 20.08     | 10.43        | 1             | -                       | -                      | -                              | 100                               |
1 domains: ['movie']
- **cat slot match**: how many values of categorical slots are in the possible values of ontology in percentage.
- **non-cat slot span**: how many values of non-categorical slots have span annotation in percentage.
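As one illustration, the **non-cat slot span** statistic could be computed along the following lines. This is a hedged sketch only; the exact field names of the unified dialogue-act format are assumptions here:

```python
# Hypothetical sketch of the "non-cat slot span" statistic: the percentage
# of non-categorical dialogue-act values that carry span (start/end) info.
def non_cat_slot_span(dialogue_acts):
    non_cat = [da for da in dialogue_acts if da["type"] == "non-categorical"]
    with_span = [da for da in non_cat if "start" in da and "end" in da]
    return 100.0 * len(with_span) / len(non_cat) if non_cat else 0.0

acts = [
    {"type": "non-categorical", "slot": "name.movie", "value": "Dune", "start": 10, "end": 14},
    {"type": "non-categorical", "slot": "time.showing", "value": "7 pm"},
    {"type": "binary", "slot": "description.plot", "value": ""},
]
print(non_cat_slot_span(acts))  # 50.0
```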
### Citation
```
@inproceedings{byrne-etal-2021-tickettalk,
title = "{T}icket{T}alk: Toward human-level performance with end-to-end, transaction-based dialog systems",
author = "Byrne, Bill and
Krishnamoorthi, Karthik and
Ganesh, Saravanan and
Kale, Mihir",
booktitle = "Proceedings of the 59th Annual Meeting of the Association for Computational Linguistics and the 11th International Joint Conference on Natural Language Processing (Volume 1: Long Papers)",
month = aug,
year = "2021",
address = "Online",
publisher = "Association for Computational Linguistics",
url = "https://aclanthology.org/2021.acl-long.55",
doi = "10.18653/v1/2021.acl-long.55",
pages = "671--680",
}
```
### Licensing Information
[**CC BY 4.0**](https://creativecommons.org/licenses/by/4.0/) |
Madhubala/Actor | ---
license: apache-2.0
---
|
ksukrit/annotated_hands_good_dataset | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
dataset_info:
features:
- name: image
dtype: image
- name: text
dtype: string
splits:
- name: train
num_bytes: 1354244079.1
num_examples: 1150
download_size: 1318885773
dataset_size: 1354244079.1
---
# Dataset Card for "annotated_hands_good_dataset"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
kamilakesbi/cv_for_spd_fr_augmented_2k | ---
dataset_info:
features:
- name: audio
dtype:
audio:
sampling_rate: 48000
- name: speakers
sequence: string
- name: timestamps_start
sequence: float64
- name: timestamps_end
sequence: float64
splits:
- name: train
num_bytes: 3462400544.0
num_examples: 2016
- name: validation
num_bytes: 772111334.0
num_examples: 408
- name: test
num_bytes: 773731278.0
num_examples: 408
download_size: 4669764120
dataset_size: 5008243156.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
- split: test
path: data/test-*
---
|
SauravMaheshkar/pareto-amazon-photo | ---
size_categories:
- 1K<n<10K
task_categories:
- graph-ml
license: cc
---
## Dataset Information
| # Nodes | # Edges | # Features |
|:-------:|:-------:|:----------:|
| 7,650 | 119,043 | 745 |
Pre-processed as per the official codebase of https://arxiv.org/abs/2210.02016
## Citations
```
@inproceedings{ju2023multi,
title={Multi-task Self-supervised Graph Neural Networks Enable Stronger Task Generalization},
author={Ju, Mingxuan and Zhao, Tong and Wen, Qianlong and Yu, Wenhao and Shah, Neil and Ye, Yanfang and Zhang, Chuxu},
booktitle={International Conference on Learning Representations},
year={2023}
}
``` |
mask-distilled-one-sec-cv12/chunk_95 | ---
dataset_info:
features:
- name: logits
sequence: float32
- name: mfcc
sequence:
sequence: float64
splits:
- name: train
num_bytes: 1249113428
num_examples: 245309
download_size: 1273890922
dataset_size: 1249113428
---
# Dataset Card for "chunk_95"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
isaiasnavarro/minedatapromot | ---
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 1032094
num_examples: 1846
download_size: 371773
dataset_size: 1032094
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
jslin09/LegalElements | ---
license: mit
---
|
tombailey/oasst1-ja | ---
license: apache-2.0
language:
- ja
size_categories:
- n<1K
---
# oasst1-ja
## Description
Based on [OpenAssistant Conversations Dataset (OASST1)](https://huggingface.co/datasets/OpenAssistant/oasst1) but only the messages labeled as Japanese.
### Structure
The format is changed to `### Human: ...### Assistant: ...`.
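One row in this format could be split back into its two messages with something along these lines (a minimal sketch, not part of the dataset itself):

```python
# Split one "### Human: ...### Assistant: ..." row into its two messages.
# The returned key names are illustrative assumptions.
def split_row(row):
    human, _, assistant = row.partition("### Assistant:")
    return {
        "human": human.replace("### Human:", "", 1).strip(),
        "assistant": assistant.strip(),
    }

print(split_row("### Human: こんにちは### Assistant: こんにちは!ご用件は?"))
```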
Each row of the text file contains a single human message and the assistant's reply. This means a single row may be missing context from messages earlier in the conversation. |
open-llm-leaderboard/details_invalid-coder__SOLAR-10.7B-Instruct-SOLARC-M-10.7B-slerp | ---
pretty_name: Evaluation run of invalid-coder/SOLAR-10.7B-Instruct-SOLARC-M-10.7B-slerp
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [invalid-coder/SOLAR-10.7B-Instruct-SOLARC-M-10.7B-slerp](https://huggingface.co/invalid-coder/SOLAR-10.7B-Instruct-SOLARC-M-10.7B-slerp)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_invalid-coder__SOLAR-10.7B-Instruct-SOLARC-M-10.7B-slerp\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-02-02T12:41:00.359909](https://huggingface.co/datasets/open-llm-leaderboard/details_invalid-coder__SOLAR-10.7B-Instruct-SOLARC-M-10.7B-slerp/blob/main/results_2024-02-02T12-41-00.359909.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6666138651793191,\n\
\ \"acc_stderr\": 0.031637513501573004,\n \"acc_norm\": 0.6674256198470557,\n\
\ \"acc_norm_stderr\": 0.03228287649895947,\n \"mc1\": 0.5716034271725826,\n\
\ \"mc1_stderr\": 0.017323088597314747,\n \"mc2\": 0.7173010784992183,\n\
\ \"mc2_stderr\": 0.015025206960130984\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.6825938566552902,\n \"acc_stderr\": 0.013602239088038167,\n\
\ \"acc_norm\": 0.7107508532423208,\n \"acc_norm_stderr\": 0.01325001257939344\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.7120095598486357,\n\
\ \"acc_stderr\": 0.004519011688417164,\n \"acc_norm\": 0.8833897629954193,\n\
\ \"acc_norm_stderr\": 0.0032029933469910634\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.43,\n \"acc_stderr\": 0.049756985195624284,\n \
\ \"acc_norm\": 0.43,\n \"acc_norm_stderr\": 0.049756985195624284\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6148148148148148,\n\
\ \"acc_stderr\": 0.04203921040156279,\n \"acc_norm\": 0.6148148148148148,\n\
\ \"acc_norm_stderr\": 0.04203921040156279\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.75,\n \"acc_stderr\": 0.03523807393012047,\n \
\ \"acc_norm\": 0.75,\n \"acc_norm_stderr\": 0.03523807393012047\n \
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.74,\n\
\ \"acc_stderr\": 0.0440844002276808,\n \"acc_norm\": 0.74,\n \
\ \"acc_norm_stderr\": 0.0440844002276808\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.6792452830188679,\n \"acc_stderr\": 0.02872750295788027,\n\
\ \"acc_norm\": 0.6792452830188679,\n \"acc_norm_stderr\": 0.02872750295788027\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7638888888888888,\n\
\ \"acc_stderr\": 0.03551446610810826,\n \"acc_norm\": 0.7638888888888888,\n\
\ \"acc_norm_stderr\": 0.03551446610810826\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.46,\n \"acc_stderr\": 0.05009082659620333,\n \
\ \"acc_norm\": 0.46,\n \"acc_norm_stderr\": 0.05009082659620333\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.52,\n \"acc_stderr\": 0.05021167315686779,\n \"acc_norm\": 0.52,\n\
\ \"acc_norm_stderr\": 0.05021167315686779\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.32,\n \"acc_stderr\": 0.046882617226215034,\n \
\ \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.046882617226215034\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6647398843930635,\n\
\ \"acc_stderr\": 0.03599586301247077,\n \"acc_norm\": 0.6647398843930635,\n\
\ \"acc_norm_stderr\": 0.03599586301247077\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.39215686274509803,\n \"acc_stderr\": 0.048580835742663454,\n\
\ \"acc_norm\": 0.39215686274509803,\n \"acc_norm_stderr\": 0.048580835742663454\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.76,\n \"acc_stderr\": 0.042923469599092816,\n \"acc_norm\": 0.76,\n\
\ \"acc_norm_stderr\": 0.042923469599092816\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.625531914893617,\n \"acc_stderr\": 0.03163910665367291,\n\
\ \"acc_norm\": 0.625531914893617,\n \"acc_norm_stderr\": 0.03163910665367291\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.5,\n\
\ \"acc_stderr\": 0.047036043419179864,\n \"acc_norm\": 0.5,\n \
\ \"acc_norm_stderr\": 0.047036043419179864\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.6275862068965518,\n \"acc_stderr\": 0.04028731532947558,\n\
\ \"acc_norm\": 0.6275862068965518,\n \"acc_norm_stderr\": 0.04028731532947558\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.48148148148148145,\n \"acc_stderr\": 0.02573364199183898,\n \"\
acc_norm\": 0.48148148148148145,\n \"acc_norm_stderr\": 0.02573364199183898\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.4444444444444444,\n\
\ \"acc_stderr\": 0.044444444444444495,\n \"acc_norm\": 0.4444444444444444,\n\
\ \"acc_norm_stderr\": 0.044444444444444495\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695235,\n \
\ \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695235\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.8225806451612904,\n\
\ \"acc_stderr\": 0.021732540689329286,\n \"acc_norm\": 0.8225806451612904,\n\
\ \"acc_norm_stderr\": 0.021732540689329286\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.5123152709359606,\n \"acc_stderr\": 0.035169204442208966,\n\
\ \"acc_norm\": 0.5123152709359606,\n \"acc_norm_stderr\": 0.035169204442208966\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.72,\n \"acc_stderr\": 0.04512608598542128,\n \"acc_norm\"\
: 0.72,\n \"acc_norm_stderr\": 0.04512608598542128\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.806060606060606,\n \"acc_stderr\": 0.03087414513656209,\n\
\ \"acc_norm\": 0.806060606060606,\n \"acc_norm_stderr\": 0.03087414513656209\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.8686868686868687,\n \"acc_stderr\": 0.024063156416822516,\n \"\
acc_norm\": 0.8686868686868687,\n \"acc_norm_stderr\": 0.024063156416822516\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.9067357512953368,\n \"acc_stderr\": 0.02098685459328973,\n\
\ \"acc_norm\": 0.9067357512953368,\n \"acc_norm_stderr\": 0.02098685459328973\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.6615384615384615,\n \"acc_stderr\": 0.023991500500313036,\n\
\ \"acc_norm\": 0.6615384615384615,\n \"acc_norm_stderr\": 0.023991500500313036\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.37407407407407406,\n \"acc_stderr\": 0.029502861128955286,\n \
\ \"acc_norm\": 0.37407407407407406,\n \"acc_norm_stderr\": 0.029502861128955286\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.7100840336134454,\n \"acc_stderr\": 0.029472485833136088,\n\
\ \"acc_norm\": 0.7100840336134454,\n \"acc_norm_stderr\": 0.029472485833136088\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.3708609271523179,\n \"acc_stderr\": 0.03943966699183629,\n \"\
acc_norm\": 0.3708609271523179,\n \"acc_norm_stderr\": 0.03943966699183629\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8477064220183487,\n \"acc_stderr\": 0.015405084393157074,\n \"\
acc_norm\": 0.8477064220183487,\n \"acc_norm_stderr\": 0.015405084393157074\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.5648148148148148,\n \"acc_stderr\": 0.03381200005643527,\n \"\
acc_norm\": 0.5648148148148148,\n \"acc_norm_stderr\": 0.03381200005643527\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.8578431372549019,\n \"acc_stderr\": 0.02450980392156862,\n \"\
acc_norm\": 0.8578431372549019,\n \"acc_norm_stderr\": 0.02450980392156862\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.8481012658227848,\n \"acc_stderr\": 0.023363878096632446,\n \
\ \"acc_norm\": 0.8481012658227848,\n \"acc_norm_stderr\": 0.023363878096632446\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6771300448430493,\n\
\ \"acc_stderr\": 0.03138147637575499,\n \"acc_norm\": 0.6771300448430493,\n\
\ \"acc_norm_stderr\": 0.03138147637575499\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.7480916030534351,\n \"acc_stderr\": 0.03807387116306086,\n\
\ \"acc_norm\": 0.7480916030534351,\n \"acc_norm_stderr\": 0.03807387116306086\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.7768595041322314,\n \"acc_stderr\": 0.03800754475228733,\n \"\
acc_norm\": 0.7768595041322314,\n \"acc_norm_stderr\": 0.03800754475228733\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.8055555555555556,\n\
\ \"acc_stderr\": 0.038260763248848646,\n \"acc_norm\": 0.8055555555555556,\n\
\ \"acc_norm_stderr\": 0.038260763248848646\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.754601226993865,\n \"acc_stderr\": 0.03380939813943354,\n\
\ \"acc_norm\": 0.754601226993865,\n \"acc_norm_stderr\": 0.03380939813943354\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.4642857142857143,\n\
\ \"acc_stderr\": 0.04733667890053756,\n \"acc_norm\": 0.4642857142857143,\n\
\ \"acc_norm_stderr\": 0.04733667890053756\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.8543689320388349,\n \"acc_stderr\": 0.03492606476623791,\n\
\ \"acc_norm\": 0.8543689320388349,\n \"acc_norm_stderr\": 0.03492606476623791\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8547008547008547,\n\
\ \"acc_stderr\": 0.0230866350868414,\n \"acc_norm\": 0.8547008547008547,\n\
\ \"acc_norm_stderr\": 0.0230866350868414\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.7,\n \"acc_stderr\": 0.046056618647183814,\n \
\ \"acc_norm\": 0.7,\n \"acc_norm_stderr\": 0.046056618647183814\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8071519795657727,\n\
\ \"acc_stderr\": 0.014108533515757431,\n \"acc_norm\": 0.8071519795657727,\n\
\ \"acc_norm_stderr\": 0.014108533515757431\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.7601156069364162,\n \"acc_stderr\": 0.022989592543123567,\n\
\ \"acc_norm\": 0.7601156069364162,\n \"acc_norm_stderr\": 0.022989592543123567\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.38994413407821227,\n\
\ \"acc_stderr\": 0.01631237662921307,\n \"acc_norm\": 0.38994413407821227,\n\
\ \"acc_norm_stderr\": 0.01631237662921307\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.7549019607843137,\n \"acc_stderr\": 0.02463004897982478,\n\
\ \"acc_norm\": 0.7549019607843137,\n \"acc_norm_stderr\": 0.02463004897982478\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7331189710610932,\n\
\ \"acc_stderr\": 0.025122637608816643,\n \"acc_norm\": 0.7331189710610932,\n\
\ \"acc_norm_stderr\": 0.025122637608816643\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.7870370370370371,\n \"acc_stderr\": 0.0227797190887334,\n\
\ \"acc_norm\": 0.7870370370370371,\n \"acc_norm_stderr\": 0.0227797190887334\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.49645390070921985,\n \"acc_stderr\": 0.02982674915328092,\n \
\ \"acc_norm\": 0.49645390070921985,\n \"acc_norm_stderr\": 0.02982674915328092\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.49282920469361147,\n\
\ \"acc_stderr\": 0.012768922739553308,\n \"acc_norm\": 0.49282920469361147,\n\
\ \"acc_norm_stderr\": 0.012768922739553308\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.7389705882352942,\n \"acc_stderr\": 0.026679252270103128,\n\
\ \"acc_norm\": 0.7389705882352942,\n \"acc_norm_stderr\": 0.026679252270103128\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6813725490196079,\n \"acc_stderr\": 0.018850084696468712,\n \
\ \"acc_norm\": 0.6813725490196079,\n \"acc_norm_stderr\": 0.018850084696468712\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6818181818181818,\n\
\ \"acc_stderr\": 0.04461272175910509,\n \"acc_norm\": 0.6818181818181818,\n\
\ \"acc_norm_stderr\": 0.04461272175910509\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.7306122448979592,\n \"acc_stderr\": 0.02840125202902294,\n\
\ \"acc_norm\": 0.7306122448979592,\n \"acc_norm_stderr\": 0.02840125202902294\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8407960199004975,\n\
\ \"acc_stderr\": 0.02587064676616913,\n \"acc_norm\": 0.8407960199004975,\n\
\ \"acc_norm_stderr\": 0.02587064676616913\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.9,\n \"acc_stderr\": 0.030151134457776334,\n \
\ \"acc_norm\": 0.9,\n \"acc_norm_stderr\": 0.030151134457776334\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5843373493975904,\n\
\ \"acc_stderr\": 0.03836722176598052,\n \"acc_norm\": 0.5843373493975904,\n\
\ \"acc_norm_stderr\": 0.03836722176598052\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.783625730994152,\n \"acc_stderr\": 0.03158149539338733,\n\
\ \"acc_norm\": 0.783625730994152,\n \"acc_norm_stderr\": 0.03158149539338733\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.5716034271725826,\n\
\ \"mc1_stderr\": 0.017323088597314747,\n \"mc2\": 0.7173010784992183,\n\
\ \"mc2_stderr\": 0.015025206960130984\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.8374112075769534,\n \"acc_stderr\": 0.010370455551343331\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.6474601971190296,\n \
\ \"acc_stderr\": 0.013159909755930333\n }\n}\n```"
repo_url: https://huggingface.co/invalid-coder/SOLAR-10.7B-Instruct-SOLARC-M-10.7B-slerp
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_02_02T12_41_00.359909
path:
- '**/details_harness|arc:challenge|25_2024-02-02T12-41-00.359909.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-02-02T12-41-00.359909.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_02_02T12_41_00.359909
path:
- '**/details_harness|gsm8k|5_2024-02-02T12-41-00.359909.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-02-02T12-41-00.359909.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_02_02T12_41_00.359909
path:
- '**/details_harness|hellaswag|10_2024-02-02T12-41-00.359909.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-02-02T12-41-00.359909.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_02_02T12_41_00.359909
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-02T12-41-00.359909.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-02T12-41-00.359909.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-02T12-41-00.359909.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-02T12-41-00.359909.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-02T12-41-00.359909.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-02T12-41-00.359909.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-02T12-41-00.359909.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-02T12-41-00.359909.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-02T12-41-00.359909.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-02T12-41-00.359909.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-02T12-41-00.359909.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-02T12-41-00.359909.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-02T12-41-00.359909.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-02T12-41-00.359909.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-02T12-41-00.359909.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-02T12-41-00.359909.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-02T12-41-00.359909.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-02T12-41-00.359909.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-02T12-41-00.359909.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-02T12-41-00.359909.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-02T12-41-00.359909.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-02T12-41-00.359909.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-02T12-41-00.359909.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-02T12-41-00.359909.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-02T12-41-00.359909.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-02T12-41-00.359909.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-02T12-41-00.359909.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-02T12-41-00.359909.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-02T12-41-00.359909.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-02T12-41-00.359909.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-02T12-41-00.359909.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-02T12-41-00.359909.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-02T12-41-00.359909.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-02T12-41-00.359909.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-02-02T12-41-00.359909.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-02T12-41-00.359909.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-02T12-41-00.359909.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-02T12-41-00.359909.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-02-02T12-41-00.359909.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-02-02T12-41-00.359909.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-02T12-41-00.359909.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-02T12-41-00.359909.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-02T12-41-00.359909.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-02T12-41-00.359909.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-02T12-41-00.359909.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-02T12-41-00.359909.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-02T12-41-00.359909.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-02T12-41-00.359909.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-02T12-41-00.359909.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-02T12-41-00.359909.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-02T12-41-00.359909.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-02T12-41-00.359909.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-02T12-41-00.359909.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-02-02T12-41-00.359909.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-02T12-41-00.359909.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-02-02T12-41-00.359909.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-02T12-41-00.359909.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-02T12-41-00.359909.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-02T12-41-00.359909.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-02T12-41-00.359909.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-02T12-41-00.359909.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-02T12-41-00.359909.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-02T12-41-00.359909.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-02T12-41-00.359909.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-02T12-41-00.359909.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-02T12-41-00.359909.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-02T12-41-00.359909.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-02T12-41-00.359909.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-02T12-41-00.359909.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-02T12-41-00.359909.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-02T12-41-00.359909.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-02T12-41-00.359909.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-02T12-41-00.359909.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-02T12-41-00.359909.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-02T12-41-00.359909.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-02T12-41-00.359909.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-02T12-41-00.359909.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-02T12-41-00.359909.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-02T12-41-00.359909.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-02T12-41-00.359909.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-02T12-41-00.359909.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-02T12-41-00.359909.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-02T12-41-00.359909.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-02T12-41-00.359909.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-02T12-41-00.359909.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-02T12-41-00.359909.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-02T12-41-00.359909.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-02T12-41-00.359909.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-02T12-41-00.359909.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-02T12-41-00.359909.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-02T12-41-00.359909.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-02-02T12-41-00.359909.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-02T12-41-00.359909.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-02T12-41-00.359909.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-02T12-41-00.359909.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-02-02T12-41-00.359909.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-02-02T12-41-00.359909.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-02T12-41-00.359909.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-02T12-41-00.359909.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-02T12-41-00.359909.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-02T12-41-00.359909.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-02T12-41-00.359909.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-02T12-41-00.359909.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-02T12-41-00.359909.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-02T12-41-00.359909.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-02T12-41-00.359909.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-02T12-41-00.359909.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-02T12-41-00.359909.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-02T12-41-00.359909.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-02T12-41-00.359909.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-02-02T12-41-00.359909.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-02T12-41-00.359909.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-02-02T12-41-00.359909.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-02T12-41-00.359909.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_02_02T12_41_00.359909
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-02T12-41-00.359909.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-02T12-41-00.359909.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_02_02T12_41_00.359909
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-02T12-41-00.359909.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-02T12-41-00.359909.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_02_02T12_41_00.359909
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-02T12-41-00.359909.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-02T12-41-00.359909.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_02_02T12_41_00.359909
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-02T12-41-00.359909.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-02T12-41-00.359909.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_02_02T12_41_00.359909
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-02T12-41-00.359909.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-02T12-41-00.359909.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_02_02T12_41_00.359909
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-02T12-41-00.359909.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-02T12-41-00.359909.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_02_02T12_41_00.359909
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-02T12-41-00.359909.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-02T12-41-00.359909.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_02_02T12_41_00.359909
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-02T12-41-00.359909.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-02T12-41-00.359909.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_02_02T12_41_00.359909
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-02T12-41-00.359909.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-02T12-41-00.359909.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_02_02T12_41_00.359909
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-02T12-41-00.359909.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-02T12-41-00.359909.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_02_02T12_41_00.359909
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-02T12-41-00.359909.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-02T12-41-00.359909.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_02_02T12_41_00.359909
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-02T12-41-00.359909.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-02T12-41-00.359909.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_02_02T12_41_00.359909
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-02T12-41-00.359909.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-02T12-41-00.359909.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_02_02T12_41_00.359909
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-02T12-41-00.359909.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-02T12-41-00.359909.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_02_02T12_41_00.359909
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-02T12-41-00.359909.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-02T12-41-00.359909.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_02_02T12_41_00.359909
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-02T12-41-00.359909.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-02T12-41-00.359909.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_02_02T12_41_00.359909
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-02T12-41-00.359909.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-02T12-41-00.359909.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_02_02T12_41_00.359909
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-02T12-41-00.359909.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-02T12-41-00.359909.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_02_02T12_41_00.359909
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-02T12-41-00.359909.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-02T12-41-00.359909.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_02_02T12_41_00.359909
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-02T12-41-00.359909.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-02T12-41-00.359909.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_02_02T12_41_00.359909
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-02T12-41-00.359909.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-02T12-41-00.359909.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_02_02T12_41_00.359909
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-02T12-41-00.359909.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-02T12-41-00.359909.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_02_02T12_41_00.359909
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-02T12-41-00.359909.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-02T12-41-00.359909.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_02_02T12_41_00.359909
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-02T12-41-00.359909.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-02T12-41-00.359909.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_02_02T12_41_00.359909
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-02T12-41-00.359909.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-02T12-41-00.359909.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_02_02T12_41_00.359909
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-02T12-41-00.359909.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-02T12-41-00.359909.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_02_02T12_41_00.359909
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-02T12-41-00.359909.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-02T12-41-00.359909.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_02_02T12_41_00.359909
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-02T12-41-00.359909.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-02T12-41-00.359909.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_02_02T12_41_00.359909
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-02T12-41-00.359909.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-02T12-41-00.359909.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_02_02T12_41_00.359909
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-02T12-41-00.359909.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-02T12-41-00.359909.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_02_02T12_41_00.359909
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-02T12-41-00.359909.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-02T12-41-00.359909.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_02_02T12_41_00.359909
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-02T12-41-00.359909.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-02T12-41-00.359909.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_02_02T12_41_00.359909
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-02T12-41-00.359909.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-02T12-41-00.359909.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_02_02T12_41_00.359909
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-02T12-41-00.359909.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-02T12-41-00.359909.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_02_02T12_41_00.359909
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-02-02T12-41-00.359909.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-02-02T12-41-00.359909.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_02_02T12_41_00.359909
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-02T12-41-00.359909.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-02T12-41-00.359909.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_02_02T12_41_00.359909
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-02T12-41-00.359909.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-02T12-41-00.359909.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_02_02T12_41_00.359909
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-02T12-41-00.359909.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-02T12-41-00.359909.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_02_02T12_41_00.359909
path:
- '**/details_harness|hendrycksTest-management|5_2024-02-02T12-41-00.359909.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-02-02T12-41-00.359909.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_02_02T12_41_00.359909
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-02-02T12-41-00.359909.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-02-02T12-41-00.359909.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_02_02T12_41_00.359909
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-02T12-41-00.359909.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-02T12-41-00.359909.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_02_02T12_41_00.359909
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-02T12-41-00.359909.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-02T12-41-00.359909.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_02_02T12_41_00.359909
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-02T12-41-00.359909.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-02T12-41-00.359909.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_02_02T12_41_00.359909
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-02T12-41-00.359909.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-02T12-41-00.359909.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_02_02T12_41_00.359909
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-02T12-41-00.359909.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-02T12-41-00.359909.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_02_02T12_41_00.359909
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-02T12-41-00.359909.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-02T12-41-00.359909.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_02_02T12_41_00.359909
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-02T12-41-00.359909.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-02T12-41-00.359909.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_02_02T12_41_00.359909
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-02T12-41-00.359909.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-02T12-41-00.359909.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_02_02T12_41_00.359909
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-02T12-41-00.359909.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-02T12-41-00.359909.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_02_02T12_41_00.359909
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-02T12-41-00.359909.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-02T12-41-00.359909.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_02_02T12_41_00.359909
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-02T12-41-00.359909.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-02T12-41-00.359909.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_02_02T12_41_00.359909
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-02T12-41-00.359909.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-02T12-41-00.359909.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_02_02T12_41_00.359909
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-02T12-41-00.359909.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-02T12-41-00.359909.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_02_02T12_41_00.359909
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-02-02T12-41-00.359909.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-02-02T12-41-00.359909.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_02_02T12_41_00.359909
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-02T12-41-00.359909.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-02T12-41-00.359909.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_02_02T12_41_00.359909
path:
- '**/details_harness|hendrycksTest-virology|5_2024-02-02T12-41-00.359909.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-02-02T12-41-00.359909.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_02_02T12_41_00.359909
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-02T12-41-00.359909.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-02T12-41-00.359909.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_02_02T12_41_00.359909
path:
- '**/details_harness|truthfulqa:mc|0_2024-02-02T12-41-00.359909.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-02-02T12-41-00.359909.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_02_02T12_41_00.359909
path:
- '**/details_harness|winogrande|5_2024-02-02T12-41-00.359909.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-02-02T12-41-00.359909.parquet'
- config_name: results
data_files:
- split: 2024_02_02T12_41_00.359909
path:
- results_2024-02-02T12-41-00.359909.parquet
- split: latest
path:
- results_2024-02-02T12-41-00.359909.parquet
---
# Dataset Card for Evaluation run of invalid-coder/SOLAR-10.7B-Instruct-SOLARC-M-10.7B-slerp
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [invalid-coder/SOLAR-10.7B-Instruct-SOLARC-M-10.7B-slerp](https://huggingface.co/invalid-coder/SOLAR-10.7B-Instruct-SOLARC-M-10.7B-slerp) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_invalid-coder__SOLAR-10.7B-Instruct-SOLARC-M-10.7B-slerp",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2024-02-02T12:41:00.359909](https://huggingface.co/datasets/open-llm-leaderboard/details_invalid-coder__SOLAR-10.7B-Instruct-SOLARC-M-10.7B-slerp/blob/main/results_2024-02-02T12-41-00.359909.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6666138651793191,
"acc_stderr": 0.031637513501573004,
"acc_norm": 0.6674256198470557,
"acc_norm_stderr": 0.03228287649895947,
"mc1": 0.5716034271725826,
"mc1_stderr": 0.017323088597314747,
"mc2": 0.7173010784992183,
"mc2_stderr": 0.015025206960130984
},
"harness|arc:challenge|25": {
"acc": 0.6825938566552902,
"acc_stderr": 0.013602239088038167,
"acc_norm": 0.7107508532423208,
"acc_norm_stderr": 0.01325001257939344
},
"harness|hellaswag|10": {
"acc": 0.7120095598486357,
"acc_stderr": 0.004519011688417164,
"acc_norm": 0.8833897629954193,
"acc_norm_stderr": 0.0032029933469910634
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.43,
"acc_stderr": 0.049756985195624284,
"acc_norm": 0.43,
"acc_norm_stderr": 0.049756985195624284
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6148148148148148,
"acc_stderr": 0.04203921040156279,
"acc_norm": 0.6148148148148148,
"acc_norm_stderr": 0.04203921040156279
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.75,
"acc_stderr": 0.03523807393012047,
"acc_norm": 0.75,
"acc_norm_stderr": 0.03523807393012047
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.74,
"acc_stderr": 0.0440844002276808,
"acc_norm": 0.74,
"acc_norm_stderr": 0.0440844002276808
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6792452830188679,
"acc_stderr": 0.02872750295788027,
"acc_norm": 0.6792452830188679,
"acc_norm_stderr": 0.02872750295788027
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7638888888888888,
"acc_stderr": 0.03551446610810826,
"acc_norm": 0.7638888888888888,
"acc_norm_stderr": 0.03551446610810826
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.46,
"acc_stderr": 0.05009082659620333,
"acc_norm": 0.46,
"acc_norm_stderr": 0.05009082659620333
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.52,
"acc_stderr": 0.05021167315686779,
"acc_norm": 0.52,
"acc_norm_stderr": 0.05021167315686779
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.32,
"acc_stderr": 0.046882617226215034,
"acc_norm": 0.32,
"acc_norm_stderr": 0.046882617226215034
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6647398843930635,
"acc_stderr": 0.03599586301247077,
"acc_norm": 0.6647398843930635,
"acc_norm_stderr": 0.03599586301247077
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.39215686274509803,
"acc_stderr": 0.048580835742663454,
"acc_norm": 0.39215686274509803,
"acc_norm_stderr": 0.048580835742663454
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.76,
"acc_stderr": 0.042923469599092816,
"acc_norm": 0.76,
"acc_norm_stderr": 0.042923469599092816
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.625531914893617,
"acc_stderr": 0.03163910665367291,
"acc_norm": 0.625531914893617,
"acc_norm_stderr": 0.03163910665367291
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.5,
"acc_stderr": 0.047036043419179864,
"acc_norm": 0.5,
"acc_norm_stderr": 0.047036043419179864
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.6275862068965518,
"acc_stderr": 0.04028731532947558,
"acc_norm": 0.6275862068965518,
"acc_norm_stderr": 0.04028731532947558
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.48148148148148145,
"acc_stderr": 0.02573364199183898,
"acc_norm": 0.48148148148148145,
"acc_norm_stderr": 0.02573364199183898
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.4444444444444444,
"acc_stderr": 0.044444444444444495,
"acc_norm": 0.4444444444444444,
"acc_norm_stderr": 0.044444444444444495
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695235,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695235
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.8225806451612904,
"acc_stderr": 0.021732540689329286,
"acc_norm": 0.8225806451612904,
"acc_norm_stderr": 0.021732540689329286
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5123152709359606,
"acc_stderr": 0.035169204442208966,
"acc_norm": 0.5123152709359606,
"acc_norm_stderr": 0.035169204442208966
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.72,
"acc_stderr": 0.04512608598542128,
"acc_norm": 0.72,
"acc_norm_stderr": 0.04512608598542128
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.806060606060606,
"acc_stderr": 0.03087414513656209,
"acc_norm": 0.806060606060606,
"acc_norm_stderr": 0.03087414513656209
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.8686868686868687,
"acc_stderr": 0.024063156416822516,
"acc_norm": 0.8686868686868687,
"acc_norm_stderr": 0.024063156416822516
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.9067357512953368,
"acc_stderr": 0.02098685459328973,
"acc_norm": 0.9067357512953368,
"acc_norm_stderr": 0.02098685459328973
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6615384615384615,
"acc_stderr": 0.023991500500313036,
"acc_norm": 0.6615384615384615,
"acc_norm_stderr": 0.023991500500313036
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.37407407407407406,
"acc_stderr": 0.029502861128955286,
"acc_norm": 0.37407407407407406,
"acc_norm_stderr": 0.029502861128955286
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.7100840336134454,
"acc_stderr": 0.029472485833136088,
"acc_norm": 0.7100840336134454,
"acc_norm_stderr": 0.029472485833136088
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.3708609271523179,
"acc_stderr": 0.03943966699183629,
"acc_norm": 0.3708609271523179,
"acc_norm_stderr": 0.03943966699183629
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8477064220183487,
"acc_stderr": 0.015405084393157074,
"acc_norm": 0.8477064220183487,
"acc_norm_stderr": 0.015405084393157074
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5648148148148148,
"acc_stderr": 0.03381200005643527,
"acc_norm": 0.5648148148148148,
"acc_norm_stderr": 0.03381200005643527
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8578431372549019,
"acc_stderr": 0.02450980392156862,
"acc_norm": 0.8578431372549019,
"acc_norm_stderr": 0.02450980392156862
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.8481012658227848,
"acc_stderr": 0.023363878096632446,
"acc_norm": 0.8481012658227848,
"acc_norm_stderr": 0.023363878096632446
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6771300448430493,
"acc_stderr": 0.03138147637575499,
"acc_norm": 0.6771300448430493,
"acc_norm_stderr": 0.03138147637575499
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7480916030534351,
"acc_stderr": 0.03807387116306086,
"acc_norm": 0.7480916030534351,
"acc_norm_stderr": 0.03807387116306086
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7768595041322314,
"acc_stderr": 0.03800754475228733,
"acc_norm": 0.7768595041322314,
"acc_norm_stderr": 0.03800754475228733
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.8055555555555556,
"acc_stderr": 0.038260763248848646,
"acc_norm": 0.8055555555555556,
"acc_norm_stderr": 0.038260763248848646
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.754601226993865,
"acc_stderr": 0.03380939813943354,
"acc_norm": 0.754601226993865,
"acc_norm_stderr": 0.03380939813943354
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.4642857142857143,
"acc_stderr": 0.04733667890053756,
"acc_norm": 0.4642857142857143,
"acc_norm_stderr": 0.04733667890053756
},
"harness|hendrycksTest-management|5": {
"acc": 0.8543689320388349,
"acc_stderr": 0.03492606476623791,
"acc_norm": 0.8543689320388349,
"acc_norm_stderr": 0.03492606476623791
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8547008547008547,
"acc_stderr": 0.0230866350868414,
"acc_norm": 0.8547008547008547,
"acc_norm_stderr": 0.0230866350868414
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.7,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.7,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8071519795657727,
"acc_stderr": 0.014108533515757431,
"acc_norm": 0.8071519795657727,
"acc_norm_stderr": 0.014108533515757431
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7601156069364162,
"acc_stderr": 0.022989592543123567,
"acc_norm": 0.7601156069364162,
"acc_norm_stderr": 0.022989592543123567
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.38994413407821227,
"acc_stderr": 0.01631237662921307,
"acc_norm": 0.38994413407821227,
"acc_norm_stderr": 0.01631237662921307
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7549019607843137,
"acc_stderr": 0.02463004897982478,
"acc_norm": 0.7549019607843137,
"acc_norm_stderr": 0.02463004897982478
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7331189710610932,
"acc_stderr": 0.025122637608816643,
"acc_norm": 0.7331189710610932,
"acc_norm_stderr": 0.025122637608816643
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7870370370370371,
"acc_stderr": 0.0227797190887334,
"acc_norm": 0.7870370370370371,
"acc_norm_stderr": 0.0227797190887334
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.49645390070921985,
"acc_stderr": 0.02982674915328092,
"acc_norm": 0.49645390070921985,
"acc_norm_stderr": 0.02982674915328092
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.49282920469361147,
"acc_stderr": 0.012768922739553308,
"acc_norm": 0.49282920469361147,
"acc_norm_stderr": 0.012768922739553308
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.7389705882352942,
"acc_stderr": 0.026679252270103128,
"acc_norm": 0.7389705882352942,
"acc_norm_stderr": 0.026679252270103128
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6813725490196079,
"acc_stderr": 0.018850084696468712,
"acc_norm": 0.6813725490196079,
"acc_norm_stderr": 0.018850084696468712
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6818181818181818,
"acc_stderr": 0.04461272175910509,
"acc_norm": 0.6818181818181818,
"acc_norm_stderr": 0.04461272175910509
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7306122448979592,
"acc_stderr": 0.02840125202902294,
"acc_norm": 0.7306122448979592,
"acc_norm_stderr": 0.02840125202902294
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8407960199004975,
"acc_stderr": 0.02587064676616913,
"acc_norm": 0.8407960199004975,
"acc_norm_stderr": 0.02587064676616913
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.9,
"acc_stderr": 0.030151134457776334,
"acc_norm": 0.9,
"acc_norm_stderr": 0.030151134457776334
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5843373493975904,
"acc_stderr": 0.03836722176598052,
"acc_norm": 0.5843373493975904,
"acc_norm_stderr": 0.03836722176598052
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.783625730994152,
"acc_stderr": 0.03158149539338733,
"acc_norm": 0.783625730994152,
"acc_norm_stderr": 0.03158149539338733
},
"harness|truthfulqa:mc|0": {
"mc1": 0.5716034271725826,
"mc1_stderr": 0.017323088597314747,
"mc2": 0.7173010784992183,
"mc2_stderr": 0.015025206960130984
},
"harness|winogrande|5": {
"acc": 0.8374112075769534,
"acc_stderr": 0.010370455551343331
},
"harness|gsm8k|5": {
"acc": 0.6474601971190296,
"acc_stderr": 0.013159909755930333
}
}
```
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
erickrribeiro/spell_correction_datasets_pt_br | ---
license: mit
---
|
vietgpt-archive/test-sort | ---
dataset_info:
features:
- name: url
dtype: string
- name: text
dtype: string
- name: perplexity
dtype: float64
- name: num_char
dtype: string
- name: num_word
dtype: string
splits:
- name: train
num_bytes: 11086487900
num_examples: 497070
download_size: 5145818064
dataset_size: 11086487900
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
BI55/MedText | ---
license: cc-by-4.0
---
This is the shuffled version of medtext_1, so the datapoints are in random order rather than sorted by category. This is to prevent catastrophic forgetting by category.
This is a medical diagnosis dataset containing over 1000 top-notch, textbook-quality patient presentations and diagnoses/treatments. The 100 most common diseases and the 30 most common injuries people go to the hospital with are, among others, fully captured in the dataset, with multiple datapoints for each, ranging from mild to complicated to severe. The full list is below. The dataset also contains completions about the nature of the AI itself: that it can never replace a doctor and always emphasizes consulting a professional. It also includes some nonsensical or doubtful presentations. A model trained on this dataset explicitly states when it CANNOT answer with confidence or when the presentation is insufficient. This is to prevent hallucinations.
MedText is a free-to-use (CC BY 4.0) dataset of over 1000 patient presentations and their diagnosis/treatment plans.
This is original data, converted into uniform datapoints using GPT-4.
We then pulled 10 random examples from the dataset and showed them to 3 different doctors (2 of them involved and 1 of them uninvolved), and they all categorized the quality as "textbook quality".
Its content includes:
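A minimal loading sketch for this card, assuming the Hugging Face `datasets` library is installed; the `train` split name and the helper function `load_medtext` are assumptions for illustration, as the card itself does not list its splits:

```python
# Hypothetical helper for loading the MedText dataset from the Hugging Face Hub.
DATASET_ID = "BI55/MedText"

def load_medtext(split="train"):
    # Lazy import so this module can be inspected without the dependency installed.
    from datasets import load_dataset
    # Downloads (and caches) the dataset from the Hub on first call.
    return load_dataset(DATASET_ID, split=split)
```

Calling `load_medtext()` returns a `Dataset` object whose rows are the shuffled patient presentations described above.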
NOISE/DATA POLLUTION
*Dismissal of non-medical or non-psychological issues
*specifically asking for more information / admitting no possible diagnosis with confidence if insufficient data
*conflicting/contradicting and irrelevant information
*cases where symptoms are misleading to seemingly obvious diagnosis but actually being something different
*information about the model (What are you? What can you do? Are you able to replace a doctor? This is to make the model humble and always emphasize that it can never replace a professional and it is just there to do some substitute analysis)
MISC
*emergency cases / first aid / almost fatal injuries that require emergency surgery
*injuries from crimes
*sexual injuries and STDs
*Infant-specific cases
*Gynecological and urological cases
*genetic anomalies
*Previous medical mishandling
*Abuse/Overdosing/Misuse of drugs
*Cross side effects of drugs
ANALYSIS
*Textual analysis of blood tests, ultrasound, CT, MRI and X-ray examinations.
INJURIES:
* Sprains and strains
* Fractures
* Contusions (bruises)
* Cuts and lacerations
* Concussions
* Burns
* Dislocations
* Abrasions (scrapes)
* Whiplash injuries
* Eye injuries
* Puncture wounds
* Bites and stings
* Back injuries
* Broken nose
* Knee injuries
* Ankle injuries
* Shoulder injuries
* Wrist injuries
* Chest injuries
* Head injuries
DISEASES:
* Acne
* Allergies
* Alzheimer's Disease
* Anemia
* Angina
* Anxiety Disorders
* Arthritis
* Asthma
* Atherosclerosis
* Athlete's Foot
* Attention Deficit Hyperactivity Disorder (ADHD)
* Autism Spectrum Disorder
* Back Pain
* Bipolar Disorder
* Bronchitis
* Cataracts
* Chickenpox
* Chronic Obstructive Pulmonary Disease (COPD)
* Common Cold
* Conjunctivitis (Pink Eye)
* Constipation
* Coronary Heart Disease
* Cystitis
* Dementia
* Depression
* Diabetes Type 1
* Diabetes Type 2
* Diarrhea
* Diverticulitis
* Dizziness (Vertigo)
* Ear Infections
* Eczema
* Endometriosis
* Erectile Dysfunction
* Fibromyalgia
* Flu (Influenza)
* Food Poisoning
* Gallstones
* Gastroenteritis
* Gastroesophageal Reflux Disease (GERD)
* Gout
* Hay Fever (Allergic Rhinitis)
* Headaches
* Heart Failure
* Hemorrhoids
* Hepatitis B
* Hepatitis C
* Herpes Simplex Virus (HSV)
* High Blood Pressure (Hypertension)
* High Cholesterol (Hypercholesterolemia)
* HIV/AIDS
* Hyperthyroidism (Overactive Thyroid)
* Hypothyroidism (Underactive Thyroid)
* Inflammatory Bowel Disease (Including Crohn's and Ulcerative Colitis)
* Insomnia
* Iron Deficiency Anemia
* Irritable Bowel Syndrome (IBS)
* Kidney Stones
* Lactose Intolerance
* Lyme Disease
* Macular Degeneration
* Malaria
* Menopause
* Migraine
* Multiple Sclerosis
* Obesity
* Osteoarthritis
* Osteoporosis
* Otitis Media (Middle Ear Infection)
* Pancreatitis
* Parkinson's Disease
* Peptic Ulcers
* Periodontal Disease
* Pneumonia
* Polycystic Ovary Syndrome (PCOS)
* Prostate Enlargement (Benign Prostatic Hyperplasia)
* Psoriasis
* Pulmonary Embolism
* Restless Legs Syndrome
* Rheumatoid Arthritis
* Rosacea
* Schizophrenia
* Sciatica
* Scoliosis
* Seasonal Affective Disorder (SAD)
* Sinusitis
* Skin Cancer
* Sleep Apnea
* Strokes
* Tendonitis
* Tonsillitis
* Tuberculosis
* Urinary Tract Infection (UTI)
* Varicose Veins
* Vitiligo
* Yeast Infection (Candidiasis)
* Zika Virus |
Khalil2/newData | ---
license: apache-2.0
---
|
cloudqi/abreviacoes_e_girias_pt_v0 | ---
license: c-uda
---
|
FanChen0116/19100_chat_Self1x_slot_pvi | ---
dataset_info:
features:
- name: id
dtype: int64
- name: tokens
sequence: string
- name: labels
sequence:
class_label:
names:
'0': O
'1': I-time
'2': B-date
'3': B-last_name
'4': B-people
'5': I-date
'6': I-people
'7': I-last_name
'8': I-first_name
'9': B-first_name
'10': B-time
- name: request_slot
sequence: string
splits:
- name: train
num_bytes: 9973
num_examples: 64
- name: validation
num_bytes: 4887
num_examples: 32
- name: test
num_bytes: 570513
num_examples: 3731
download_size: 124672
dataset_size: 585373
---
# Dataset Card for "19100_chat_Self1x_slot_pvi"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
cwchoi/whisper_medium_j04 | ---
dataset_info:
features:
- name: input_features
sequence:
sequence: float32
- name: labels
sequence: int64
splits:
- name: train
num_bytes: 76845079560
num_examples: 80000
- name: test
num_bytes: 9605646304
num_examples: 10000
- name: valid
num_bytes: 9605657064
num_examples: 10000
download_size: 14667153103
dataset_size: 96056382928
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
- split: valid
path: data/valid-*
---
|
Jiahuan/teach_edh_v2 | ---
dataset_info:
features:
- name: instruction
dtype: string
- name: input
dtype: string
- name: output
dtype: string
- name: history
sequence:
sequence: string
splits:
- name: train
num_bytes: 3470370
num_examples: 17422
- name: test
num_bytes: 936925
num_examples: 5552
download_size: 443536
dataset_size: 4407295
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
---
|
tyzhu/find_marker_both_sent_train_400_eval_40_in_context | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
dataset_info:
features:
- name: inputs
dtype: string
- name: targets
dtype: string
- name: title
dtype: string
- name: context
dtype: string
splits:
- name: train
num_bytes: 3738032
num_examples: 1994
- name: validation
num_bytes: 383715
num_examples: 200
download_size: 833365
dataset_size: 4121747
---
# Dataset Card for "find_marker_both_sent_train_400_eval_40_in_context"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
jjz5463/probing_dataset_4.0 | ---
size_categories:
- n<1K
dataset_info:
features:
- name: attributes
struct:
- name: length
dtype: string
- name: point_of_view
dtype: string
- name: sentence_type
dtype: string
- name: tense
dtype: string
- name: topic
dtype: string
- name: voice
dtype: string
- name: feature
dtype: string
- name: positive
dtype: string
- name: negative
dtype: string
splits:
- name: train
num_bytes: 128121
num_examples: 400
download_size: 49497
dataset_size: 128121
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
library_name: datadreamer
tags:
- datadreamer
- datadreamer-0.25.0
- synthetic
- gpt-4
---
# Dataset Card
[Add more information here](https://huggingface.co/datasets/templates/dataset-card-example)
---
This dataset was produced with [DataDreamer 🤖💤](https://datadreamer.dev). The synthetic dataset card can be found [here](datadreamer.json). |
anan-2024/twitter_dataset_1713050494 | ---
dataset_info:
features:
- name: id
dtype: string
- name: tweet_content
dtype: string
- name: user_name
dtype: string
- name: user_id
dtype: string
- name: created_at
dtype: string
- name: url
dtype: string
- name: favourite_count
dtype: int64
- name: scraped_at
dtype: string
- name: image_urls
dtype: string
splits:
- name: train
num_bytes: 21832
num_examples: 49
download_size: 11397
dataset_size: 21832
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
micsell/hebrew_kan_sentence10000 | ---
dataset_info:
features:
- name: audio
dtype: audio
- name: id
dtype: string
- name: language
dtype: string
- name: sentence
dtype: string
splits:
- name: train
num_bytes: 1855718249.0
num_examples: 10000
download_size: 1854849117
dataset_size: 1855718249.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
liuyanchen1015/MULTI_VALUE_wnli_zero_plural | ---
dataset_info:
features:
- name: sentence1
dtype: string
- name: sentence2
dtype: string
- name: label
dtype: int64
- name: idx
dtype: int64
- name: value_score
dtype: int64
splits:
- name: dev
num_bytes: 4382
num_examples: 19
- name: test
num_bytes: 18158
num_examples: 63
- name: train
num_bytes: 44668
num_examples: 210
download_size: 30522
dataset_size: 67208
---
# Dataset Card for "MULTI_VALUE_wnli_zero_plural"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
mdass/236_images | ---
dataset_info:
features:
- name: logoName
dtype: string
- name: fileName
dtype: string
- name: image
dtype: binary
- name: name
dtype: string
splits:
- name: train
num_bytes: 1799478
num_examples: 100
download_size: 1789315
dataset_size: 1799478
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "236_images"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
open-llm-leaderboard/details_0-hero__Matter-0.1-7B-boost-DPO | ---
pretty_name: Evaluation run of 0-hero/Matter-0.1-7B-boost-DPO
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [0-hero/Matter-0.1-7B-boost-DPO](https://huggingface.co/0-hero/Matter-0.1-7B-boost-DPO)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_0-hero__Matter-0.1-7B-boost-DPO\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-03-22T17:52:36.450234](https://huggingface.co/datasets/open-llm-leaderboard/details_0-hero__Matter-0.1-7B-boost-DPO/blob/main/results_2024-03-22T17-52-36.450234.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6196341771509315,\n\
\ \"acc_stderr\": 0.03280520004973222,\n \"acc_norm\": 0.6228652576735157,\n\
\ \"acc_norm_stderr\": 0.033458484779768545,\n \"mc1\": 0.43451652386780903,\n\
\ \"mc1_stderr\": 0.017352738749259564,\n \"mc2\": 0.6029363233989143,\n\
\ \"mc2_stderr\": 0.015526892800807802\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.6271331058020477,\n \"acc_stderr\": 0.01413117676013117,\n\
\ \"acc_norm\": 0.6501706484641638,\n \"acc_norm_stderr\": 0.013936809212158298\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6459868552081258,\n\
\ \"acc_stderr\": 0.0047723583951304474,\n \"acc_norm\": 0.8308105954989046,\n\
\ \"acc_norm_stderr\": 0.0037415289563158434\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.27,\n \"acc_stderr\": 0.044619604333847394,\n \
\ \"acc_norm\": 0.27,\n \"acc_norm_stderr\": 0.044619604333847394\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.5925925925925926,\n\
\ \"acc_stderr\": 0.04244633238353227,\n \"acc_norm\": 0.5925925925925926,\n\
\ \"acc_norm_stderr\": 0.04244633238353227\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.6776315789473685,\n \"acc_stderr\": 0.03803510248351585,\n\
\ \"acc_norm\": 0.6776315789473685,\n \"acc_norm_stderr\": 0.03803510248351585\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.55,\n\
\ \"acc_stderr\": 0.049999999999999996,\n \"acc_norm\": 0.55,\n \
\ \"acc_norm_stderr\": 0.049999999999999996\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.6754716981132075,\n \"acc_stderr\": 0.028815615713432115,\n\
\ \"acc_norm\": 0.6754716981132075,\n \"acc_norm_stderr\": 0.028815615713432115\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.6875,\n\
\ \"acc_stderr\": 0.038760854559127644,\n \"acc_norm\": 0.6875,\n\
\ \"acc_norm_stderr\": 0.038760854559127644\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.42,\n \"acc_stderr\": 0.049604496374885836,\n \
\ \"acc_norm\": 0.42,\n \"acc_norm_stderr\": 0.049604496374885836\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"\
acc\": 0.5,\n \"acc_stderr\": 0.050251890762960605,\n \"acc_norm\"\
: 0.5,\n \"acc_norm_stderr\": 0.050251890762960605\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.35,\n \"acc_stderr\": 0.0479372485441102,\n \
\ \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.0479372485441102\n },\n\
\ \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.5838150289017341,\n\
\ \"acc_stderr\": 0.03758517775404947,\n \"acc_norm\": 0.5838150289017341,\n\
\ \"acc_norm_stderr\": 0.03758517775404947\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.35294117647058826,\n \"acc_stderr\": 0.04755129616062946,\n\
\ \"acc_norm\": 0.35294117647058826,\n \"acc_norm_stderr\": 0.04755129616062946\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.75,\n \"acc_stderr\": 0.04351941398892446,\n \"acc_norm\": 0.75,\n\
\ \"acc_norm_stderr\": 0.04351941398892446\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.5191489361702127,\n \"acc_stderr\": 0.032662042990646775,\n\
\ \"acc_norm\": 0.5191489361702127,\n \"acc_norm_stderr\": 0.032662042990646775\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.4649122807017544,\n\
\ \"acc_stderr\": 0.04692008381368909,\n \"acc_norm\": 0.4649122807017544,\n\
\ \"acc_norm_stderr\": 0.04692008381368909\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5862068965517241,\n \"acc_stderr\": 0.04104269211806232,\n\
\ \"acc_norm\": 0.5862068965517241,\n \"acc_norm_stderr\": 0.04104269211806232\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.3439153439153439,\n \"acc_stderr\": 0.024464426625596437,\n \"\
acc_norm\": 0.3439153439153439,\n \"acc_norm_stderr\": 0.024464426625596437\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.38095238095238093,\n\
\ \"acc_stderr\": 0.043435254289490965,\n \"acc_norm\": 0.38095238095238093,\n\
\ \"acc_norm_stderr\": 0.043435254289490965\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.44,\n \"acc_stderr\": 0.04988876515698589,\n \
\ \"acc_norm\": 0.44,\n \"acc_norm_stderr\": 0.04988876515698589\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7516129032258064,\n\
\ \"acc_stderr\": 0.024580028921481,\n \"acc_norm\": 0.7516129032258064,\n\
\ \"acc_norm_stderr\": 0.024580028921481\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.5024630541871922,\n \"acc_stderr\": 0.035179450386910616,\n\
\ \"acc_norm\": 0.5024630541871922,\n \"acc_norm_stderr\": 0.035179450386910616\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.67,\n \"acc_stderr\": 0.04725815626252609,\n \"acc_norm\"\
: 0.67,\n \"acc_norm_stderr\": 0.04725815626252609\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7636363636363637,\n \"acc_stderr\": 0.03317505930009181,\n\
\ \"acc_norm\": 0.7636363636363637,\n \"acc_norm_stderr\": 0.03317505930009181\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.7727272727272727,\n \"acc_stderr\": 0.029857515673386414,\n \"\
acc_norm\": 0.7727272727272727,\n \"acc_norm_stderr\": 0.029857515673386414\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.8652849740932642,\n \"acc_stderr\": 0.02463978909770944,\n\
\ \"acc_norm\": 0.8652849740932642,\n \"acc_norm_stderr\": 0.02463978909770944\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.6230769230769231,\n \"acc_stderr\": 0.024570975364225995,\n\
\ \"acc_norm\": 0.6230769230769231,\n \"acc_norm_stderr\": 0.024570975364225995\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.3148148148148148,\n \"acc_stderr\": 0.028317533496066465,\n \
\ \"acc_norm\": 0.3148148148148148,\n \"acc_norm_stderr\": 0.028317533496066465\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.6554621848739496,\n \"acc_stderr\": 0.030868682604121622,\n\
\ \"acc_norm\": 0.6554621848739496,\n \"acc_norm_stderr\": 0.030868682604121622\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.33112582781456956,\n \"acc_stderr\": 0.038425817186598696,\n \"\
acc_norm\": 0.33112582781456956,\n \"acc_norm_stderr\": 0.038425817186598696\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8165137614678899,\n \"acc_stderr\": 0.0165952597103993,\n \"acc_norm\"\
: 0.8165137614678899,\n \"acc_norm_stderr\": 0.0165952597103993\n },\n\
\ \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.4675925925925926,\n\
\ \"acc_stderr\": 0.03402801581358966,\n \"acc_norm\": 0.4675925925925926,\n\
\ \"acc_norm_stderr\": 0.03402801581358966\n },\n \"harness|hendrycksTest-high_school_us_history|5\"\
: {\n \"acc\": 0.7598039215686274,\n \"acc_stderr\": 0.02998373305591362,\n\
\ \"acc_norm\": 0.7598039215686274,\n \"acc_norm_stderr\": 0.02998373305591362\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.7848101265822784,\n \"acc_stderr\": 0.02675082699467618,\n \
\ \"acc_norm\": 0.7848101265822784,\n \"acc_norm_stderr\": 0.02675082699467618\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.672645739910314,\n\
\ \"acc_stderr\": 0.03149384670994131,\n \"acc_norm\": 0.672645739910314,\n\
\ \"acc_norm_stderr\": 0.03149384670994131\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.7709923664122137,\n \"acc_stderr\": 0.036853466317118506,\n\
\ \"acc_norm\": 0.7709923664122137,\n \"acc_norm_stderr\": 0.036853466317118506\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.743801652892562,\n \"acc_stderr\": 0.03984979653302872,\n \"acc_norm\"\
: 0.743801652892562,\n \"acc_norm_stderr\": 0.03984979653302872\n },\n\
\ \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7870370370370371,\n\
\ \"acc_stderr\": 0.0395783547198098,\n \"acc_norm\": 0.7870370370370371,\n\
\ \"acc_norm_stderr\": 0.0395783547198098\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.6932515337423313,\n \"acc_stderr\": 0.036230899157241474,\n\
\ \"acc_norm\": 0.6932515337423313,\n \"acc_norm_stderr\": 0.036230899157241474\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.45535714285714285,\n\
\ \"acc_stderr\": 0.047268355537191,\n \"acc_norm\": 0.45535714285714285,\n\
\ \"acc_norm_stderr\": 0.047268355537191\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7864077669902912,\n \"acc_stderr\": 0.040580420156460344,\n\
\ \"acc_norm\": 0.7864077669902912,\n \"acc_norm_stderr\": 0.040580420156460344\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8418803418803419,\n\
\ \"acc_stderr\": 0.023902325549560406,\n \"acc_norm\": 0.8418803418803419,\n\
\ \"acc_norm_stderr\": 0.023902325549560406\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.72,\n \"acc_stderr\": 0.045126085985421276,\n \
\ \"acc_norm\": 0.72,\n \"acc_norm_stderr\": 0.045126085985421276\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.7994891443167306,\n\
\ \"acc_stderr\": 0.014317653708594209,\n \"acc_norm\": 0.7994891443167306,\n\
\ \"acc_norm_stderr\": 0.014317653708594209\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.6763005780346821,\n \"acc_stderr\": 0.02519018132760841,\n\
\ \"acc_norm\": 0.6763005780346821,\n \"acc_norm_stderr\": 0.02519018132760841\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.43910614525139663,\n\
\ \"acc_stderr\": 0.016598022120580425,\n \"acc_norm\": 0.43910614525139663,\n\
\ \"acc_norm_stderr\": 0.016598022120580425\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.7026143790849673,\n \"acc_stderr\": 0.02617390850671858,\n\
\ \"acc_norm\": 0.7026143790849673,\n \"acc_norm_stderr\": 0.02617390850671858\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7009646302250804,\n\
\ \"acc_stderr\": 0.02600330111788514,\n \"acc_norm\": 0.7009646302250804,\n\
\ \"acc_norm_stderr\": 0.02600330111788514\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.7407407407407407,\n \"acc_stderr\": 0.024383665531035454,\n\
\ \"acc_norm\": 0.7407407407407407,\n \"acc_norm_stderr\": 0.024383665531035454\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.4645390070921986,\n \"acc_stderr\": 0.029752389657427047,\n \
\ \"acc_norm\": 0.4645390070921986,\n \"acc_norm_stderr\": 0.029752389657427047\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4439374185136897,\n\
\ \"acc_stderr\": 0.012689708167787682,\n \"acc_norm\": 0.4439374185136897,\n\
\ \"acc_norm_stderr\": 0.012689708167787682\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.6102941176470589,\n \"acc_stderr\": 0.0296246635811597,\n\
\ \"acc_norm\": 0.6102941176470589,\n \"acc_norm_stderr\": 0.0296246635811597\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6405228758169934,\n \"acc_stderr\": 0.01941253924203216,\n \
\ \"acc_norm\": 0.6405228758169934,\n \"acc_norm_stderr\": 0.01941253924203216\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6727272727272727,\n\
\ \"acc_stderr\": 0.0449429086625209,\n \"acc_norm\": 0.6727272727272727,\n\
\ \"acc_norm_stderr\": 0.0449429086625209\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.7142857142857143,\n \"acc_stderr\": 0.028920583220675606,\n\
\ \"acc_norm\": 0.7142857142857143,\n \"acc_norm_stderr\": 0.028920583220675606\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8258706467661692,\n\
\ \"acc_stderr\": 0.026814951200421603,\n \"acc_norm\": 0.8258706467661692,\n\
\ \"acc_norm_stderr\": 0.026814951200421603\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.81,\n \"acc_stderr\": 0.03942772444036625,\n \
\ \"acc_norm\": 0.81,\n \"acc_norm_stderr\": 0.03942772444036625\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5301204819277109,\n\
\ \"acc_stderr\": 0.03885425420866767,\n \"acc_norm\": 0.5301204819277109,\n\
\ \"acc_norm_stderr\": 0.03885425420866767\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8011695906432749,\n \"acc_stderr\": 0.03061111655743253,\n\
\ \"acc_norm\": 0.8011695906432749,\n \"acc_norm_stderr\": 0.03061111655743253\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.43451652386780903,\n\
\ \"mc1_stderr\": 0.017352738749259564,\n \"mc2\": 0.6029363233989143,\n\
\ \"mc2_stderr\": 0.015526892800807802\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7561168113654302,\n \"acc_stderr\": 0.01206892327890819\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.5003790750568613,\n \
\ \"acc_stderr\": 0.013772480761626175\n }\n}\n```"
repo_url: https://huggingface.co/0-hero/Matter-0.1-7B-boost-DPO
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_03_22T17_52_36.450234
path:
- '**/details_harness|arc:challenge|25_2024-03-22T17-52-36.450234.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-03-22T17-52-36.450234.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_03_22T17_52_36.450234
path:
- '**/details_harness|gsm8k|5_2024-03-22T17-52-36.450234.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-03-22T17-52-36.450234.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_03_22T17_52_36.450234
path:
- '**/details_harness|hellaswag|10_2024-03-22T17-52-36.450234.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-03-22T17-52-36.450234.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_03_22T17_52_36.450234
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-22T17-52-36.450234.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-22T17-52-36.450234.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-22T17-52-36.450234.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-22T17-52-36.450234.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-22T17-52-36.450234.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-22T17-52-36.450234.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-22T17-52-36.450234.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-22T17-52-36.450234.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-22T17-52-36.450234.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-22T17-52-36.450234.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-22T17-52-36.450234.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-22T17-52-36.450234.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-22T17-52-36.450234.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-22T17-52-36.450234.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-22T17-52-36.450234.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-22T17-52-36.450234.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-22T17-52-36.450234.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-22T17-52-36.450234.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-22T17-52-36.450234.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-22T17-52-36.450234.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-22T17-52-36.450234.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-22T17-52-36.450234.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-22T17-52-36.450234.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-22T17-52-36.450234.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-22T17-52-36.450234.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-22T17-52-36.450234.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-22T17-52-36.450234.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-22T17-52-36.450234.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-22T17-52-36.450234.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-22T17-52-36.450234.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-22T17-52-36.450234.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-22T17-52-36.450234.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-22T17-52-36.450234.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-22T17-52-36.450234.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-22T17-52-36.450234.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-22T17-52-36.450234.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-22T17-52-36.450234.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-22T17-52-36.450234.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-22T17-52-36.450234.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-22T17-52-36.450234.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-22T17-52-36.450234.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-22T17-52-36.450234.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-22T17-52-36.450234.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-22T17-52-36.450234.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-22T17-52-36.450234.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-22T17-52-36.450234.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-22T17-52-36.450234.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-22T17-52-36.450234.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-22T17-52-36.450234.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-22T17-52-36.450234.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-22T17-52-36.450234.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-22T17-52-36.450234.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-22T17-52-36.450234.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-22T17-52-36.450234.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-22T17-52-36.450234.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-22T17-52-36.450234.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-22T17-52-36.450234.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-22T17-52-36.450234.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-22T17-52-36.450234.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-22T17-52-36.450234.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-22T17-52-36.450234.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-22T17-52-36.450234.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-22T17-52-36.450234.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-22T17-52-36.450234.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-22T17-52-36.450234.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-22T17-52-36.450234.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-22T17-52-36.450234.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-22T17-52-36.450234.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-22T17-52-36.450234.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-22T17-52-36.450234.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-22T17-52-36.450234.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-22T17-52-36.450234.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-22T17-52-36.450234.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-22T17-52-36.450234.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-22T17-52-36.450234.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-22T17-52-36.450234.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-22T17-52-36.450234.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-22T17-52-36.450234.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-22T17-52-36.450234.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-22T17-52-36.450234.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-22T17-52-36.450234.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-22T17-52-36.450234.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-22T17-52-36.450234.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-22T17-52-36.450234.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-22T17-52-36.450234.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-22T17-52-36.450234.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-22T17-52-36.450234.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-22T17-52-36.450234.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-22T17-52-36.450234.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-22T17-52-36.450234.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-22T17-52-36.450234.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-22T17-52-36.450234.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-22T17-52-36.450234.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-22T17-52-36.450234.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-22T17-52-36.450234.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-22T17-52-36.450234.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-22T17-52-36.450234.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-22T17-52-36.450234.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-22T17-52-36.450234.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-22T17-52-36.450234.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-22T17-52-36.450234.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-22T17-52-36.450234.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-22T17-52-36.450234.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-22T17-52-36.450234.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-22T17-52-36.450234.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-22T17-52-36.450234.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-22T17-52-36.450234.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-22T17-52-36.450234.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-22T17-52-36.450234.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-22T17-52-36.450234.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-22T17-52-36.450234.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-22T17-52-36.450234.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-22T17-52-36.450234.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-22T17-52-36.450234.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_03_22T17_52_36.450234
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-22T17-52-36.450234.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-22T17-52-36.450234.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_03_22T17_52_36.450234
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-22T17-52-36.450234.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-22T17-52-36.450234.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_03_22T17_52_36.450234
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-22T17-52-36.450234.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-22T17-52-36.450234.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_03_22T17_52_36.450234
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-22T17-52-36.450234.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-22T17-52-36.450234.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_03_22T17_52_36.450234
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-22T17-52-36.450234.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-22T17-52-36.450234.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_03_22T17_52_36.450234
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-22T17-52-36.450234.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-22T17-52-36.450234.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_03_22T17_52_36.450234
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-22T17-52-36.450234.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-22T17-52-36.450234.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_03_22T17_52_36.450234
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-22T17-52-36.450234.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-22T17-52-36.450234.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_03_22T17_52_36.450234
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-22T17-52-36.450234.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-22T17-52-36.450234.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_03_22T17_52_36.450234
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-22T17-52-36.450234.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-22T17-52-36.450234.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_03_22T17_52_36.450234
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-22T17-52-36.450234.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-22T17-52-36.450234.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_03_22T17_52_36.450234
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-22T17-52-36.450234.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-22T17-52-36.450234.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_03_22T17_52_36.450234
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-22T17-52-36.450234.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-22T17-52-36.450234.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_03_22T17_52_36.450234
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-22T17-52-36.450234.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-22T17-52-36.450234.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_03_22T17_52_36.450234
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-22T17-52-36.450234.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-22T17-52-36.450234.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_03_22T17_52_36.450234
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-22T17-52-36.450234.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-22T17-52-36.450234.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_03_22T17_52_36.450234
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-22T17-52-36.450234.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-22T17-52-36.450234.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_03_22T17_52_36.450234
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-22T17-52-36.450234.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-22T17-52-36.450234.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_03_22T17_52_36.450234
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-22T17-52-36.450234.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-22T17-52-36.450234.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_03_22T17_52_36.450234
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-22T17-52-36.450234.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-22T17-52-36.450234.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_03_22T17_52_36.450234
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-22T17-52-36.450234.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-22T17-52-36.450234.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_03_22T17_52_36.450234
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-22T17-52-36.450234.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-22T17-52-36.450234.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_03_22T17_52_36.450234
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-22T17-52-36.450234.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-22T17-52-36.450234.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_03_22T17_52_36.450234
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-22T17-52-36.450234.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-22T17-52-36.450234.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_03_22T17_52_36.450234
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-22T17-52-36.450234.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-22T17-52-36.450234.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_03_22T17_52_36.450234
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-22T17-52-36.450234.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-22T17-52-36.450234.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_03_22T17_52_36.450234
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-22T17-52-36.450234.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-22T17-52-36.450234.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_03_22T17_52_36.450234
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-22T17-52-36.450234.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-22T17-52-36.450234.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_03_22T17_52_36.450234
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-22T17-52-36.450234.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-22T17-52-36.450234.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_03_22T17_52_36.450234
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-22T17-52-36.450234.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-22T17-52-36.450234.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_03_22T17_52_36.450234
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-22T17-52-36.450234.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-22T17-52-36.450234.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_03_22T17_52_36.450234
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-22T17-52-36.450234.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-22T17-52-36.450234.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_03_22T17_52_36.450234
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-22T17-52-36.450234.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-22T17-52-36.450234.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_03_22T17_52_36.450234
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-22T17-52-36.450234.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-22T17-52-36.450234.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_03_22T17_52_36.450234
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-22T17-52-36.450234.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-22T17-52-36.450234.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_03_22T17_52_36.450234
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-22T17-52-36.450234.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-22T17-52-36.450234.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_03_22T17_52_36.450234
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-22T17-52-36.450234.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-22T17-52-36.450234.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_03_22T17_52_36.450234
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-22T17-52-36.450234.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-22T17-52-36.450234.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_03_22T17_52_36.450234
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-22T17-52-36.450234.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-22T17-52-36.450234.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_03_22T17_52_36.450234
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-22T17-52-36.450234.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-22T17-52-36.450234.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_03_22T17_52_36.450234
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-22T17-52-36.450234.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-22T17-52-36.450234.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_03_22T17_52_36.450234
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-22T17-52-36.450234.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-22T17-52-36.450234.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_03_22T17_52_36.450234
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-22T17-52-36.450234.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-22T17-52-36.450234.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_03_22T17_52_36.450234
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-22T17-52-36.450234.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-22T17-52-36.450234.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_03_22T17_52_36.450234
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-22T17-52-36.450234.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-22T17-52-36.450234.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_03_22T17_52_36.450234
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-22T17-52-36.450234.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-22T17-52-36.450234.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_03_22T17_52_36.450234
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-22T17-52-36.450234.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-22T17-52-36.450234.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_03_22T17_52_36.450234
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-22T17-52-36.450234.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-22T17-52-36.450234.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_03_22T17_52_36.450234
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-22T17-52-36.450234.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-22T17-52-36.450234.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_03_22T17_52_36.450234
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-22T17-52-36.450234.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-22T17-52-36.450234.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_03_22T17_52_36.450234
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-22T17-52-36.450234.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-22T17-52-36.450234.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_03_22T17_52_36.450234
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-22T17-52-36.450234.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-22T17-52-36.450234.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_03_22T17_52_36.450234
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-22T17-52-36.450234.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-22T17-52-36.450234.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_03_22T17_52_36.450234
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-22T17-52-36.450234.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-22T17-52-36.450234.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_03_22T17_52_36.450234
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-22T17-52-36.450234.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-22T17-52-36.450234.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_03_22T17_52_36.450234
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-22T17-52-36.450234.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-22T17-52-36.450234.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_03_22T17_52_36.450234
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-22T17-52-36.450234.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-22T17-52-36.450234.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_03_22T17_52_36.450234
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-22T17-52-36.450234.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-22T17-52-36.450234.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_03_22T17_52_36.450234
path:
- '**/details_harness|winogrande|5_2024-03-22T17-52-36.450234.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-03-22T17-52-36.450234.parquet'
- config_name: results
data_files:
- split: 2024_03_22T17_52_36.450234
path:
- results_2024-03-22T17-52-36.450234.parquet
- split: latest
path:
- results_2024-03-22T17-52-36.450234.parquet
---
# Dataset Card for Evaluation run of 0-hero/Matter-0.1-7B-boost-DPO
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [0-hero/Matter-0.1-7B-boost-DPO](https://huggingface.co/0-hero/Matter-0.1-7B-boost-DPO) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can, for instance, do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_0-hero__Matter-0.1-7B-boost-DPO",
"harness_winogrande_5",
split="train")
```
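Every MMLU (hendrycksTest) subtask listed in this card follows the same `harness_hendrycksTest_<subtask>_5` naming pattern, so the config name can be built programmatically instead of copied by hand. A small sketch (the helper name is illustrative; the subtask strings are the ones listed in this card):

```python
def mmlu_config_name(subtask: str, n_shot: int = 5) -> str:
    """Build the config name this card uses for an MMLU subtask."""
    return f"harness_hendrycksTest_{subtask}_{n_shot}"

# Pass the result as the second argument to datasets.load_dataset, e.g.:
# data = load_dataset("open-llm-leaderboard/details_0-hero__Matter-0.1-7B-boost-DPO",
#                     mmlu_config_name("abstract_algebra"),
#                     split="latest")
print(mmlu_config_name("abstract_algebra"))  # harness_hendrycksTest_abstract_algebra_5
```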
## Latest results
These are the [latest results from run 2024-03-22T17:52:36.450234](https://huggingface.co/datasets/open-llm-leaderboard/details_0-hero__Matter-0.1-7B-boost-DPO/blob/main/results_2024-03-22T17-52-36.450234.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each one in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6196341771509315,
"acc_stderr": 0.03280520004973222,
"acc_norm": 0.6228652576735157,
"acc_norm_stderr": 0.033458484779768545,
"mc1": 0.43451652386780903,
"mc1_stderr": 0.017352738749259564,
"mc2": 0.6029363233989143,
"mc2_stderr": 0.015526892800807802
},
"harness|arc:challenge|25": {
"acc": 0.6271331058020477,
"acc_stderr": 0.01413117676013117,
"acc_norm": 0.6501706484641638,
"acc_norm_stderr": 0.013936809212158298
},
"harness|hellaswag|10": {
"acc": 0.6459868552081258,
"acc_stderr": 0.0047723583951304474,
"acc_norm": 0.8308105954989046,
"acc_norm_stderr": 0.0037415289563158434
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.27,
"acc_stderr": 0.044619604333847394,
"acc_norm": 0.27,
"acc_norm_stderr": 0.044619604333847394
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.5925925925925926,
"acc_stderr": 0.04244633238353227,
"acc_norm": 0.5925925925925926,
"acc_norm_stderr": 0.04244633238353227
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6776315789473685,
"acc_stderr": 0.03803510248351585,
"acc_norm": 0.6776315789473685,
"acc_norm_stderr": 0.03803510248351585
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.55,
"acc_stderr": 0.049999999999999996,
"acc_norm": 0.55,
"acc_norm_stderr": 0.049999999999999996
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6754716981132075,
"acc_stderr": 0.028815615713432115,
"acc_norm": 0.6754716981132075,
"acc_norm_stderr": 0.028815615713432115
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.6875,
"acc_stderr": 0.038760854559127644,
"acc_norm": 0.6875,
"acc_norm_stderr": 0.038760854559127644
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.42,
"acc_stderr": 0.049604496374885836,
"acc_norm": 0.42,
"acc_norm_stderr": 0.049604496374885836
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.5,
"acc_stderr": 0.050251890762960605,
"acc_norm": 0.5,
"acc_norm_stderr": 0.050251890762960605
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.35,
"acc_stderr": 0.0479372485441102,
"acc_norm": 0.35,
"acc_norm_stderr": 0.0479372485441102
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.5838150289017341,
"acc_stderr": 0.03758517775404947,
"acc_norm": 0.5838150289017341,
"acc_norm_stderr": 0.03758517775404947
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.35294117647058826,
"acc_stderr": 0.04755129616062946,
"acc_norm": 0.35294117647058826,
"acc_norm_stderr": 0.04755129616062946
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.75,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.75,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5191489361702127,
"acc_stderr": 0.032662042990646775,
"acc_norm": 0.5191489361702127,
"acc_norm_stderr": 0.032662042990646775
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.4649122807017544,
"acc_stderr": 0.04692008381368909,
"acc_norm": 0.4649122807017544,
"acc_norm_stderr": 0.04692008381368909
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5862068965517241,
"acc_stderr": 0.04104269211806232,
"acc_norm": 0.5862068965517241,
"acc_norm_stderr": 0.04104269211806232
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.3439153439153439,
"acc_stderr": 0.024464426625596437,
"acc_norm": 0.3439153439153439,
"acc_norm_stderr": 0.024464426625596437
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.38095238095238093,
"acc_stderr": 0.043435254289490965,
"acc_norm": 0.38095238095238093,
"acc_norm_stderr": 0.043435254289490965
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.44,
"acc_stderr": 0.04988876515698589,
"acc_norm": 0.44,
"acc_norm_stderr": 0.04988876515698589
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7516129032258064,
"acc_stderr": 0.024580028921481,
"acc_norm": 0.7516129032258064,
"acc_norm_stderr": 0.024580028921481
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5024630541871922,
"acc_stderr": 0.035179450386910616,
"acc_norm": 0.5024630541871922,
"acc_norm_stderr": 0.035179450386910616
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.67,
"acc_stderr": 0.04725815626252609,
"acc_norm": 0.67,
"acc_norm_stderr": 0.04725815626252609
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7636363636363637,
"acc_stderr": 0.03317505930009181,
"acc_norm": 0.7636363636363637,
"acc_norm_stderr": 0.03317505930009181
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7727272727272727,
"acc_stderr": 0.029857515673386414,
"acc_norm": 0.7727272727272727,
"acc_norm_stderr": 0.029857515673386414
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8652849740932642,
"acc_stderr": 0.02463978909770944,
"acc_norm": 0.8652849740932642,
"acc_norm_stderr": 0.02463978909770944
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6230769230769231,
"acc_stderr": 0.024570975364225995,
"acc_norm": 0.6230769230769231,
"acc_norm_stderr": 0.024570975364225995
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.3148148148148148,
"acc_stderr": 0.028317533496066465,
"acc_norm": 0.3148148148148148,
"acc_norm_stderr": 0.028317533496066465
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6554621848739496,
"acc_stderr": 0.030868682604121622,
"acc_norm": 0.6554621848739496,
"acc_norm_stderr": 0.030868682604121622
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.33112582781456956,
"acc_stderr": 0.038425817186598696,
"acc_norm": 0.33112582781456956,
"acc_norm_stderr": 0.038425817186598696
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8165137614678899,
"acc_stderr": 0.0165952597103993,
"acc_norm": 0.8165137614678899,
"acc_norm_stderr": 0.0165952597103993
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.4675925925925926,
"acc_stderr": 0.03402801581358966,
"acc_norm": 0.4675925925925926,
"acc_norm_stderr": 0.03402801581358966
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7598039215686274,
"acc_stderr": 0.02998373305591362,
"acc_norm": 0.7598039215686274,
"acc_norm_stderr": 0.02998373305591362
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7848101265822784,
"acc_stderr": 0.02675082699467618,
"acc_norm": 0.7848101265822784,
"acc_norm_stderr": 0.02675082699467618
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.672645739910314,
"acc_stderr": 0.03149384670994131,
"acc_norm": 0.672645739910314,
"acc_norm_stderr": 0.03149384670994131
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7709923664122137,
"acc_stderr": 0.036853466317118506,
"acc_norm": 0.7709923664122137,
"acc_norm_stderr": 0.036853466317118506
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.743801652892562,
"acc_stderr": 0.03984979653302872,
"acc_norm": 0.743801652892562,
"acc_norm_stderr": 0.03984979653302872
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7870370370370371,
"acc_stderr": 0.0395783547198098,
"acc_norm": 0.7870370370370371,
"acc_norm_stderr": 0.0395783547198098
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.6932515337423313,
"acc_stderr": 0.036230899157241474,
"acc_norm": 0.6932515337423313,
"acc_norm_stderr": 0.036230899157241474
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.45535714285714285,
"acc_stderr": 0.047268355537191,
"acc_norm": 0.45535714285714285,
"acc_norm_stderr": 0.047268355537191
},
"harness|hendrycksTest-management|5": {
"acc": 0.7864077669902912,
"acc_stderr": 0.040580420156460344,
"acc_norm": 0.7864077669902912,
"acc_norm_stderr": 0.040580420156460344
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8418803418803419,
"acc_stderr": 0.023902325549560406,
"acc_norm": 0.8418803418803419,
"acc_norm_stderr": 0.023902325549560406
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.72,
"acc_stderr": 0.045126085985421276,
"acc_norm": 0.72,
"acc_norm_stderr": 0.045126085985421276
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.7994891443167306,
"acc_stderr": 0.014317653708594209,
"acc_norm": 0.7994891443167306,
"acc_norm_stderr": 0.014317653708594209
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.6763005780346821,
"acc_stderr": 0.02519018132760841,
"acc_norm": 0.6763005780346821,
"acc_norm_stderr": 0.02519018132760841
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.43910614525139663,
"acc_stderr": 0.016598022120580425,
"acc_norm": 0.43910614525139663,
"acc_norm_stderr": 0.016598022120580425
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7026143790849673,
"acc_stderr": 0.02617390850671858,
"acc_norm": 0.7026143790849673,
"acc_norm_stderr": 0.02617390850671858
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7009646302250804,
"acc_stderr": 0.02600330111788514,
"acc_norm": 0.7009646302250804,
"acc_norm_stderr": 0.02600330111788514
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7407407407407407,
"acc_stderr": 0.024383665531035454,
"acc_norm": 0.7407407407407407,
"acc_norm_stderr": 0.024383665531035454
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.4645390070921986,
"acc_stderr": 0.029752389657427047,
"acc_norm": 0.4645390070921986,
"acc_norm_stderr": 0.029752389657427047
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4439374185136897,
"acc_stderr": 0.012689708167787682,
"acc_norm": 0.4439374185136897,
"acc_norm_stderr": 0.012689708167787682
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6102941176470589,
"acc_stderr": 0.0296246635811597,
"acc_norm": 0.6102941176470589,
"acc_norm_stderr": 0.0296246635811597
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6405228758169934,
"acc_stderr": 0.01941253924203216,
"acc_norm": 0.6405228758169934,
"acc_norm_stderr": 0.01941253924203216
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6727272727272727,
"acc_stderr": 0.0449429086625209,
"acc_norm": 0.6727272727272727,
"acc_norm_stderr": 0.0449429086625209
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7142857142857143,
"acc_stderr": 0.028920583220675606,
"acc_norm": 0.7142857142857143,
"acc_norm_stderr": 0.028920583220675606
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8258706467661692,
"acc_stderr": 0.026814951200421603,
"acc_norm": 0.8258706467661692,
"acc_norm_stderr": 0.026814951200421603
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.81,
"acc_stderr": 0.03942772444036625,
"acc_norm": 0.81,
"acc_norm_stderr": 0.03942772444036625
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5301204819277109,
"acc_stderr": 0.03885425420866767,
"acc_norm": 0.5301204819277109,
"acc_norm_stderr": 0.03885425420866767
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8011695906432749,
"acc_stderr": 0.03061111655743253,
"acc_norm": 0.8011695906432749,
"acc_norm_stderr": 0.03061111655743253
},
"harness|truthfulqa:mc|0": {
"mc1": 0.43451652386780903,
"mc1_stderr": 0.017352738749259564,
"mc2": 0.6029363233989143,
"mc2_stderr": 0.015526892800807802
},
"harness|winogrande|5": {
"acc": 0.7561168113654302,
"acc_stderr": 0.01206892327890819
},
"harness|gsm8k|5": {
"acc": 0.5003790750568613,
"acc_stderr": 0.013772480761626175
}
}
```
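Since the results above come back as a nested dict keyed by task name, it is easy to rank subtasks by a metric once loaded. A minimal sketch, assuming the same structure as the JSON shown above (the two sample entries are taken from it):

```python
# Sample of the nested results structure shown above.
results = {
    "harness|hendrycksTest-abstract_algebra|5": {"acc": 0.27, "acc_norm": 0.27},
    "harness|hendrycksTest-marketing|5": {"acc": 0.8418803418803419, "acc_norm": 0.8418803418803419},
    "harness|winogrande|5": {"acc": 0.7561168113654302},
}

# Rank every task that reports an "acc" metric, best first.
ranked = sorted(
    ((task, metrics["acc"]) for task, metrics in results.items() if "acc" in metrics),
    key=lambda kv: kv[1],
    reverse=True,
)
print(ranked[0][0])  # harness|hendrycksTest-marketing|5
```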
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
svjack/bloom-dialogue-generate-ds-zh | ---
dataset_info:
features:
- name: question
dtype: string
- name: dialogue_text
dtype: string
- name: dialogue
sequence: string
- name: repo
dtype: string
- name: embeddings
sequence: float32
splits:
- name: train
num_bytes: 98021681
num_examples: 24297
download_size: 101459282
dataset_size: 98021681
---
# Dataset Card for "bloom-dialogue-generate-ds-zh"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
LexiconShiftInnovations/SinhalaDentalQnA | ---
dataset_info:
features:
- name: Question
dtype: string
- name: Answer
dtype: string
- name: text
dtype: string
splits:
- name: train
num_bytes: 824274
num_examples: 499
download_size: 325010
dataset_size: 824274
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
license: apache-2.0
task_categories:
- question-answering
- text-generation
language:
- si
tags:
- sinhaladataset
- sinhalaqna
- sinhalaquestionanswering
- medical
- dental
- dentalqna
- medicalqna
size_categories:
- n<1K
--- |
Zen1t/projects-dataset | ---
license: apache-2.0
---
|
CyberHarem/kuma_kantaicollection | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of kuma/球磨/球磨 (Kantai Collection)
This is the dataset of kuma/球磨/球磨 (Kantai Collection), containing 500 images and their tags.
The core tags of this character are `brown_hair, long_hair, ahoge, brown_eyes, huge_ahoge, fang`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan, ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:-----------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 500 | 509.18 MiB | [Download](https://huggingface.co/datasets/CyberHarem/kuma_kantaicollection/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 500 | 336.09 MiB | [Download](https://huggingface.co/datasets/CyberHarem/kuma_kantaicollection/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 1223 | 731.70 MiB | [Download](https://huggingface.co/datasets/CyberHarem/kuma_kantaicollection/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 500 | 474.51 MiB | [Download](https://huggingface.co/datasets/CyberHarem/kuma_kantaicollection/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 1223 | 964.46 MiB | [Download](https://huggingface.co/datasets/CyberHarem/kuma_kantaicollection/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need it, run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/kuma_kantaicollection',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
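For the IMG+TXT packages listed above, each image is accompanied by a same-named `.txt` file holding its comma-separated tags. A minimal standard-library sketch for pairing images with their tag lists after extraction (the directory layout is assumed to be flat, as produced by unzipping one of the IMG+TXT archives):

```python
import os

def pair_img_txt(dataset_dir):
    """Pair each image in dataset_dir with the tag list from its .txt sidecar."""
    pairs = []
    for name in sorted(os.listdir(dataset_dir)):
        stem, ext = os.path.splitext(name)
        if ext.lower() in ('.png', '.jpg', '.jpeg', '.webp'):
            txt_path = os.path.join(dataset_dir, stem + '.txt')
            if os.path.exists(txt_path):
                with open(txt_path, encoding='utf-8') as f:
                    tags = [t.strip() for t in f.read().split(',') if t.strip()]
                pairs.append((os.path.join(dataset_dir, name), tags))
    return pairs
```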
## List of Clusters
List of tag clustering results; some outfits may be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 27 |  |  |  |  |  | 1girl, serafuku, short_sleeves, solo, sailor_collar, simple_background, white_background, looking_at_viewer, red_neckerchief, shirt, white_shorts, smile, open_mouth, blush, navel, midriff, cowboy_shot |
| 1 | 11 |  |  |  |  |  | 1girl, serafuku, solo, shorts, open_mouth, looking_at_viewer, blush, machinery, salute, smile, sitting |
| 2 | 22 |  |  |  |  |  | 1girl, serafuku, short_sleeves, solo, neck_ribbon, red_ribbon, pleated_skirt, simple_background, open_mouth, sailor_shirt, white_background, white_skirt, belt, green_sailor_collar, looking_at_viewer, navel, smile, white_shirt, cowboy_shot |
| 3 | 7 |  |  |  |  |  | 1girl, blush, long_sleeves, pleated_skirt, serafuku, looking_at_viewer, solo, black_pantyhose, cardigan, green_sailor_collar, open_mouth, red_neckerchief, simple_background, green_skirt, sweater, white_background, :d, cowboy_shot |
| 4 | 7 |  |  |  |  |  | 1girl, black_dress, looking_at_viewer, polka_dot_dress, solo, teddy_bear, white_background, blush, brown_jacket, simple_background, smile, alternate_costume, hooded_jacket, open_mouth, black_footwear, boots, full_body, heart_ahoge, object_hug, twitter_username |
| 5 | 12 |  |  |  |  |  | 1girl, looking_at_viewer, solo, navel, simple_background, white_background, blush, collarbone, smile, cowboy_shot, small_breasts, frilled_bikini, medium_breasts, side-tie_bikini_bottom |
| 6 | 25 |  |  |  |  |  | 1girl, solo, floral_print, looking_at_viewer, yukata, obi, blush, hair_bow, open_mouth, smile, white_background, wide_sleeves, alternate_costume, simple_background |
| 7 | 7 |  |  |  |  |  | 1girl, enmaided, frilled_apron, looking_at_viewer, smile, solo, white_apron, blush, heart, long_sleeves, maid_apron, maid_headdress, ribbon, bangs, bow, open_mouth, simple_background, very_long_hair, white_background, black_dress, collared_shirt, hair_bun, skirt, yellow_shirt |
| 8 | 19 |  |  |  |  |  | 1girl, santa_costume, solo, christmas, open_mouth, santa_hat, looking_at_viewer, smile, blush, red_gloves, red_dress, alternate_costume, bell, red_capelet, sack, white_thighhighs, fur-trimmed_capelet, fur-trimmed_dress, red_headwear, white_background, bangs, boots, simple_background, very_long_hair |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | serafuku | short_sleeves | solo | sailor_collar | simple_background | white_background | looking_at_viewer | red_neckerchief | shirt | white_shorts | smile | open_mouth | blush | navel | midriff | cowboy_shot | shorts | machinery | salute | sitting | neck_ribbon | red_ribbon | pleated_skirt | sailor_shirt | white_skirt | belt | green_sailor_collar | white_shirt | long_sleeves | black_pantyhose | cardigan | green_skirt | sweater | :d | black_dress | polka_dot_dress | teddy_bear | brown_jacket | alternate_costume | hooded_jacket | black_footwear | boots | full_body | heart_ahoge | object_hug | twitter_username | collarbone | small_breasts | frilled_bikini | medium_breasts | side-tie_bikini_bottom | floral_print | yukata | obi | hair_bow | wide_sleeves | enmaided | frilled_apron | white_apron | heart | maid_apron | maid_headdress | ribbon | bangs | bow | very_long_hair | collared_shirt | hair_bun | skirt | yellow_shirt | santa_costume | christmas | santa_hat | red_gloves | red_dress | bell | red_capelet | sack | white_thighhighs | fur-trimmed_capelet | fur-trimmed_dress | red_headwear |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:-----------|:----------------|:-------|:----------------|:--------------------|:-------------------|:--------------------|:------------------|:--------|:---------------|:--------|:-------------|:--------|:--------|:----------|:--------------|:---------|:------------|:---------|:----------|:--------------|:-------------|:----------------|:---------------|:--------------|:-------|:----------------------|:--------------|:---------------|:------------------|:-----------|:--------------|:----------|:-----|:--------------|:------------------|:-------------|:---------------|:--------------------|:----------------|:-----------------|:--------|:------------|:--------------|:-------------|:-------------------|:-------------|:----------------|:-----------------|:-----------------|:-------------------------|:---------------|:---------|:------|:-----------|:---------------|:-----------|:----------------|:--------------|:--------|:-------------|:-----------------|:---------|:--------|:------|:-----------------|:-----------------|:-----------|:--------|:---------------|:----------------|:------------|:------------|:-------------|:------------|:-------|:--------------|:-------|:-------------------|:----------------------|:--------------------|:---------------|
| 0 | 27 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 1 | 11 |  |  |  |  |  | X | X | | X | | | | X | | | | X | X | X | | | | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 2 | 22 |  |  |  |  |  | X | X | X | X | | X | X | X | | | | X | X | | X | | X | | | | | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 3 | 7 |  |  |  |  |  | X | X | | X | | X | X | X | X | | | | X | X | | | X | | | | | | | X | | | | X | | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 4 | 7 |  |  |  |  |  | X | | | X | | X | X | X | | | | X | X | X | | | | | | | | | | | | | | | | | | | | | | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 5 | 12 |  |  |  |  |  | X | | | X | | X | X | X | | | | X | | X | X | | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 6 | 25 |  |  |  |  |  | X | | | X | | X | X | X | | | | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | X | | | | | | | | | | | | | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 7 | 7 |  |  |  |  |  | X | | | X | | X | X | X | | | | X | X | X | | | | | | | | | | | | | | | | X | | | | | | X | | | | | | | | | | | | | | | | | | | | | | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | |
| 8 | 19 |  |  |  |  |  | X | | | X | | X | X | X | | | | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | X | | | X | | | | | | | | | | | | | | | | | | | | | | X | | X | | | | | X | X | X | X | X | X | X | X | X | X | X | X |
|
diplomado2023/calzados2 | ---
license: apache-2.0
---
|
open-llm-leaderboard/details_yeontaek__llama-2-13B-ensemble-v5 | ---
pretty_name: Evaluation run of yeontaek/llama-2-13B-ensemble-v5
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [yeontaek/llama-2-13B-ensemble-v5](https://huggingface.co/yeontaek/llama-2-13B-ensemble-v5)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 61 configurations, each one corresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_yeontaek__llama-2-13B-ensemble-v5\"\
,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\
\nThese are the [latest results from run 2023-08-29T09:17:14.183323](https://huggingface.co/datasets/open-llm-leaderboard/details_yeontaek__llama-2-13B-ensemble-v5/blob/main/results_2023-08-29T09%3A17%3A14.183323.json):\n\
\n```python\n{\n \"all\": {\n \"acc\": 0.5953117661059801,\n \"\
acc_stderr\": 0.03391896483304526,\n \"acc_norm\": 0.5994365516843435,\n\
\ \"acc_norm_stderr\": 0.033896234769528244,\n \"mc1\": 0.3843329253365973,\n\
\ \"mc1_stderr\": 0.017028707301245203,\n \"mc2\": 0.5327328500103707,\n\
\ \"mc2_stderr\": 0.015551697577870274\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.5844709897610921,\n \"acc_stderr\": 0.014401366641216388,\n\
\ \"acc_norm\": 0.6262798634812287,\n \"acc_norm_stderr\": 0.014137708601759084\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6290579565823541,\n\
\ \"acc_stderr\": 0.004820697457420419,\n \"acc_norm\": 0.8306114319856602,\n\
\ \"acc_norm_stderr\": 0.0037432817493736324\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.27,\n \"acc_stderr\": 0.044619604333847394,\n \
\ \"acc_norm\": 0.27,\n \"acc_norm_stderr\": 0.044619604333847394\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.5185185185185185,\n\
\ \"acc_stderr\": 0.043163785995113245,\n \"acc_norm\": 0.5185185185185185,\n\
\ \"acc_norm_stderr\": 0.043163785995113245\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.631578947368421,\n \"acc_stderr\": 0.03925523381052932,\n\
\ \"acc_norm\": 0.631578947368421,\n \"acc_norm_stderr\": 0.03925523381052932\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.59,\n\
\ \"acc_stderr\": 0.04943110704237102,\n \"acc_norm\": 0.59,\n \
\ \"acc_norm_stderr\": 0.04943110704237102\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.6264150943396226,\n \"acc_stderr\": 0.029773082713319875,\n\
\ \"acc_norm\": 0.6264150943396226,\n \"acc_norm_stderr\": 0.029773082713319875\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.6666666666666666,\n\
\ \"acc_stderr\": 0.03942082639927213,\n \"acc_norm\": 0.6666666666666666,\n\
\ \"acc_norm_stderr\": 0.03942082639927213\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.42,\n \"acc_stderr\": 0.04960449637488584,\n \
\ \"acc_norm\": 0.42,\n \"acc_norm_stderr\": 0.04960449637488584\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.46,\n \"acc_stderr\": 0.05009082659620332,\n \"acc_norm\": 0.46,\n\
\ \"acc_norm_stderr\": 0.05009082659620332\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.33,\n \"acc_stderr\": 0.047258156262526045,\n \
\ \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.047258156262526045\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.5722543352601156,\n\
\ \"acc_stderr\": 0.037724468575180255,\n \"acc_norm\": 0.5722543352601156,\n\
\ \"acc_norm_stderr\": 0.037724468575180255\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.3627450980392157,\n \"acc_stderr\": 0.04784060704105653,\n\
\ \"acc_norm\": 0.3627450980392157,\n \"acc_norm_stderr\": 0.04784060704105653\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.68,\n \"acc_stderr\": 0.04688261722621505,\n \"acc_norm\": 0.68,\n\
\ \"acc_norm_stderr\": 0.04688261722621505\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.49361702127659574,\n \"acc_stderr\": 0.032683358999363366,\n\
\ \"acc_norm\": 0.49361702127659574,\n \"acc_norm_stderr\": 0.032683358999363366\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.3333333333333333,\n\
\ \"acc_stderr\": 0.044346007015849245,\n \"acc_norm\": 0.3333333333333333,\n\
\ \"acc_norm_stderr\": 0.044346007015849245\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5241379310344828,\n \"acc_stderr\": 0.0416180850350153,\n\
\ \"acc_norm\": 0.5241379310344828,\n \"acc_norm_stderr\": 0.0416180850350153\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.36243386243386244,\n \"acc_stderr\": 0.02475747390275206,\n \"\
acc_norm\": 0.36243386243386244,\n \"acc_norm_stderr\": 0.02475747390275206\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.3968253968253968,\n\
\ \"acc_stderr\": 0.043758884927270605,\n \"acc_norm\": 0.3968253968253968,\n\
\ \"acc_norm_stderr\": 0.043758884927270605\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.37,\n \"acc_stderr\": 0.048523658709391,\n \
\ \"acc_norm\": 0.37,\n \"acc_norm_stderr\": 0.048523658709391\n },\n\
\ \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.6806451612903226,\n\
\ \"acc_stderr\": 0.026522709674667765,\n \"acc_norm\": 0.6806451612903226,\n\
\ \"acc_norm_stderr\": 0.026522709674667765\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.4729064039408867,\n \"acc_stderr\": 0.03512819077876106,\n\
\ \"acc_norm\": 0.4729064039408867,\n \"acc_norm_stderr\": 0.03512819077876106\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.6,\n \"acc_stderr\": 0.04923659639173309,\n \"acc_norm\"\
: 0.6,\n \"acc_norm_stderr\": 0.04923659639173309\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7212121212121212,\n \"acc_stderr\": 0.03501438706296781,\n\
\ \"acc_norm\": 0.7212121212121212,\n \"acc_norm_stderr\": 0.03501438706296781\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.7676767676767676,\n \"acc_stderr\": 0.030088629490217487,\n \"\
acc_norm\": 0.7676767676767676,\n \"acc_norm_stderr\": 0.030088629490217487\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.8756476683937824,\n \"acc_stderr\": 0.023814477086593552,\n\
\ \"acc_norm\": 0.8756476683937824,\n \"acc_norm_stderr\": 0.023814477086593552\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.6076923076923076,\n \"acc_stderr\": 0.02475600038213095,\n \
\ \"acc_norm\": 0.6076923076923076,\n \"acc_norm_stderr\": 0.02475600038213095\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.3333333333333333,\n \"acc_stderr\": 0.028742040903948496,\n \
\ \"acc_norm\": 0.3333333333333333,\n \"acc_norm_stderr\": 0.028742040903948496\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.634453781512605,\n \"acc_stderr\": 0.031282177063684614,\n \
\ \"acc_norm\": 0.634453781512605,\n \"acc_norm_stderr\": 0.031282177063684614\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.3443708609271523,\n \"acc_stderr\": 0.03879687024073327,\n \"\
acc_norm\": 0.3443708609271523,\n \"acc_norm_stderr\": 0.03879687024073327\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8036697247706422,\n \"acc_stderr\": 0.017030719339154336,\n \"\
acc_norm\": 0.8036697247706422,\n \"acc_norm_stderr\": 0.017030719339154336\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.44907407407407407,\n \"acc_stderr\": 0.03392238405321616,\n \"\
acc_norm\": 0.44907407407407407,\n \"acc_norm_stderr\": 0.03392238405321616\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.8382352941176471,\n \"acc_stderr\": 0.025845017986926917,\n \"\
acc_norm\": 0.8382352941176471,\n \"acc_norm_stderr\": 0.025845017986926917\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.7679324894514767,\n \"acc_stderr\": 0.02747974455080852,\n \
\ \"acc_norm\": 0.7679324894514767,\n \"acc_norm_stderr\": 0.02747974455080852\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6367713004484304,\n\
\ \"acc_stderr\": 0.032277904428505,\n \"acc_norm\": 0.6367713004484304,\n\
\ \"acc_norm_stderr\": 0.032277904428505\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.6717557251908397,\n \"acc_stderr\": 0.041184385658062976,\n\
\ \"acc_norm\": 0.6717557251908397,\n \"acc_norm_stderr\": 0.041184385658062976\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.71900826446281,\n \"acc_stderr\": 0.04103203830514512,\n \"acc_norm\"\
: 0.71900826446281,\n \"acc_norm_stderr\": 0.04103203830514512\n },\n\
\ \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7407407407407407,\n\
\ \"acc_stderr\": 0.042365112580946336,\n \"acc_norm\": 0.7407407407407407,\n\
\ \"acc_norm_stderr\": 0.042365112580946336\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7239263803680982,\n \"acc_stderr\": 0.035123852837050475,\n\
\ \"acc_norm\": 0.7239263803680982,\n \"acc_norm_stderr\": 0.035123852837050475\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.36607142857142855,\n\
\ \"acc_stderr\": 0.045723723587374296,\n \"acc_norm\": 0.36607142857142855,\n\
\ \"acc_norm_stderr\": 0.045723723587374296\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7378640776699029,\n \"acc_stderr\": 0.04354631077260595,\n\
\ \"acc_norm\": 0.7378640776699029,\n \"acc_norm_stderr\": 0.04354631077260595\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8504273504273504,\n\
\ \"acc_stderr\": 0.02336505149175372,\n \"acc_norm\": 0.8504273504273504,\n\
\ \"acc_norm_stderr\": 0.02336505149175372\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.6,\n \"acc_stderr\": 0.049236596391733084,\n \
\ \"acc_norm\": 0.6,\n \"acc_norm_stderr\": 0.049236596391733084\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.7879948914431673,\n\
\ \"acc_stderr\": 0.014616099385833685,\n \"acc_norm\": 0.7879948914431673,\n\
\ \"acc_norm_stderr\": 0.014616099385833685\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.6416184971098265,\n \"acc_stderr\": 0.025816756791584194,\n\
\ \"acc_norm\": 0.6416184971098265,\n \"acc_norm_stderr\": 0.025816756791584194\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.4849162011173184,\n\
\ \"acc_stderr\": 0.01671489037999606,\n \"acc_norm\": 0.4849162011173184,\n\
\ \"acc_norm_stderr\": 0.01671489037999606\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.6405228758169934,\n \"acc_stderr\": 0.027475969910660952,\n\
\ \"acc_norm\": 0.6405228758169934,\n \"acc_norm_stderr\": 0.027475969910660952\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7009646302250804,\n\
\ \"acc_stderr\": 0.02600330111788514,\n \"acc_norm\": 0.7009646302250804,\n\
\ \"acc_norm_stderr\": 0.02600330111788514\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.7222222222222222,\n \"acc_stderr\": 0.024922001168886335,\n\
\ \"acc_norm\": 0.7222222222222222,\n \"acc_norm_stderr\": 0.024922001168886335\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.4929078014184397,\n \"acc_stderr\": 0.02982449855912901,\n \
\ \"acc_norm\": 0.4929078014184397,\n \"acc_norm_stderr\": 0.02982449855912901\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4654498044328553,\n\
\ \"acc_stderr\": 0.012739711554045708,\n \"acc_norm\": 0.4654498044328553,\n\
\ \"acc_norm_stderr\": 0.012739711554045708\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.5992647058823529,\n \"acc_stderr\": 0.029768263528933105,\n\
\ \"acc_norm\": 0.5992647058823529,\n \"acc_norm_stderr\": 0.029768263528933105\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6029411764705882,\n \"acc_stderr\": 0.019794488900024117,\n \
\ \"acc_norm\": 0.6029411764705882,\n \"acc_norm_stderr\": 0.019794488900024117\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6818181818181818,\n\
\ \"acc_stderr\": 0.04461272175910507,\n \"acc_norm\": 0.6818181818181818,\n\
\ \"acc_norm_stderr\": 0.04461272175910507\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.6530612244897959,\n \"acc_stderr\": 0.030472526026726496,\n\
\ \"acc_norm\": 0.6530612244897959,\n \"acc_norm_stderr\": 0.030472526026726496\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.7711442786069652,\n\
\ \"acc_stderr\": 0.02970528405677244,\n \"acc_norm\": 0.7711442786069652,\n\
\ \"acc_norm_stderr\": 0.02970528405677244\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.83,\n \"acc_stderr\": 0.0377525168068637,\n \
\ \"acc_norm\": 0.83,\n \"acc_norm_stderr\": 0.0377525168068637\n },\n\
\ \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.4819277108433735,\n\
\ \"acc_stderr\": 0.03889951252827217,\n \"acc_norm\": 0.4819277108433735,\n\
\ \"acc_norm_stderr\": 0.03889951252827217\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8011695906432749,\n \"acc_stderr\": 0.030611116557432528,\n\
\ \"acc_norm\": 0.8011695906432749,\n \"acc_norm_stderr\": 0.030611116557432528\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.3843329253365973,\n\
\ \"mc1_stderr\": 0.017028707301245203,\n \"mc2\": 0.5327328500103707,\n\
\ \"mc2_stderr\": 0.015551697577870274\n }\n}\n```"
repo_url: https://huggingface.co/yeontaek/llama-2-13B-ensemble-v5
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_08_29T09_17_14.183323
path:
- '**/details_harness|arc:challenge|25_2023-08-29T09:17:14.183323.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-08-29T09:17:14.183323.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_08_29T09_17_14.183323
path:
- '**/details_harness|hellaswag|10_2023-08-29T09:17:14.183323.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-08-29T09:17:14.183323.parquet'
- config_name: harness_hendrycksTest
data_files:
- split: 2023_08_29T09_17_14.183323
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-29T09:17:14.183323.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-29T09:17:14.183323.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-29T09:17:14.183323.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-29T09:17:14.183323.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-29T09:17:14.183323.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-29T09:17:14.183323.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-29T09:17:14.183323.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-29T09:17:14.183323.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-29T09:17:14.183323.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-29T09:17:14.183323.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-29T09:17:14.183323.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-29T09:17:14.183323.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-29T09:17:14.183323.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-29T09:17:14.183323.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-29T09:17:14.183323.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-29T09:17:14.183323.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-29T09:17:14.183323.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-29T09:17:14.183323.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-29T09:17:14.183323.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-29T09:17:14.183323.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-29T09:17:14.183323.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-29T09:17:14.183323.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-29T09:17:14.183323.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-29T09:17:14.183323.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-29T09:17:14.183323.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-29T09:17:14.183323.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-29T09:17:14.183323.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-29T09:17:14.183323.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-29T09:17:14.183323.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-29T09:17:14.183323.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-29T09:17:14.183323.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-29T09:17:14.183323.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-29T09:17:14.183323.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-29T09:17:14.183323.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-29T09:17:14.183323.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-29T09:17:14.183323.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-29T09:17:14.183323.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-29T09:17:14.183323.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-29T09:17:14.183323.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-29T09:17:14.183323.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-29T09:17:14.183323.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-29T09:17:14.183323.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-29T09:17:14.183323.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-29T09:17:14.183323.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-29T09:17:14.183323.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-29T09:17:14.183323.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-29T09:17:14.183323.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-29T09:17:14.183323.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-29T09:17:14.183323.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-29T09:17:14.183323.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-29T09:17:14.183323.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-29T09:17:14.183323.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-29T09:17:14.183323.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-29T09:17:14.183323.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-29T09:17:14.183323.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-29T09:17:14.183323.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-29T09:17:14.183323.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-29T09:17:14.183323.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-29T09:17:14.183323.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-29T09:17:14.183323.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-29T09:17:14.183323.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-29T09:17:14.183323.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-29T09:17:14.183323.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-29T09:17:14.183323.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-29T09:17:14.183323.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-29T09:17:14.183323.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-29T09:17:14.183323.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-29T09:17:14.183323.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-29T09:17:14.183323.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-29T09:17:14.183323.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-29T09:17:14.183323.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-29T09:17:14.183323.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-29T09:17:14.183323.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-29T09:17:14.183323.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-29T09:17:14.183323.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-29T09:17:14.183323.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-29T09:17:14.183323.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-29T09:17:14.183323.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-29T09:17:14.183323.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-29T09:17:14.183323.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-29T09:17:14.183323.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-29T09:17:14.183323.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-29T09:17:14.183323.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-29T09:17:14.183323.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-29T09:17:14.183323.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-29T09:17:14.183323.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-29T09:17:14.183323.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-29T09:17:14.183323.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-29T09:17:14.183323.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-29T09:17:14.183323.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-29T09:17:14.183323.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-29T09:17:14.183323.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-29T09:17:14.183323.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-29T09:17:14.183323.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-29T09:17:14.183323.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-29T09:17:14.183323.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-29T09:17:14.183323.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-29T09:17:14.183323.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-29T09:17:14.183323.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-29T09:17:14.183323.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-29T09:17:14.183323.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-29T09:17:14.183323.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-29T09:17:14.183323.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-29T09:17:14.183323.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-29T09:17:14.183323.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-29T09:17:14.183323.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-29T09:17:14.183323.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-29T09:17:14.183323.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-29T09:17:14.183323.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-29T09:17:14.183323.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-29T09:17:14.183323.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-29T09:17:14.183323.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-29T09:17:14.183323.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-29T09:17:14.183323.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_08_29T09_17_14.183323
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-29T09:17:14.183323.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-29T09:17:14.183323.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_08_29T09_17_14.183323
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-29T09:17:14.183323.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-29T09:17:14.183323.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_08_29T09_17_14.183323
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-29T09:17:14.183323.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-29T09:17:14.183323.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_08_29T09_17_14.183323
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-29T09:17:14.183323.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-29T09:17:14.183323.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_08_29T09_17_14.183323
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-29T09:17:14.183323.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-29T09:17:14.183323.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_08_29T09_17_14.183323
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-29T09:17:14.183323.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-29T09:17:14.183323.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_08_29T09_17_14.183323
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-29T09:17:14.183323.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-29T09:17:14.183323.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_08_29T09_17_14.183323
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-29T09:17:14.183323.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-29T09:17:14.183323.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_08_29T09_17_14.183323
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-29T09:17:14.183323.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-29T09:17:14.183323.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_08_29T09_17_14.183323
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-29T09:17:14.183323.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-29T09:17:14.183323.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_08_29T09_17_14.183323
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-29T09:17:14.183323.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-29T09:17:14.183323.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_08_29T09_17_14.183323
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-29T09:17:14.183323.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-29T09:17:14.183323.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_08_29T09_17_14.183323
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-29T09:17:14.183323.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-29T09:17:14.183323.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_08_29T09_17_14.183323
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-29T09:17:14.183323.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-29T09:17:14.183323.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_08_29T09_17_14.183323
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-29T09:17:14.183323.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-29T09:17:14.183323.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_08_29T09_17_14.183323
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-29T09:17:14.183323.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-29T09:17:14.183323.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_08_29T09_17_14.183323
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-29T09:17:14.183323.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-29T09:17:14.183323.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_08_29T09_17_14.183323
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-29T09:17:14.183323.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-29T09:17:14.183323.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_08_29T09_17_14.183323
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-29T09:17:14.183323.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-29T09:17:14.183323.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_08_29T09_17_14.183323
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-29T09:17:14.183323.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-29T09:17:14.183323.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_08_29T09_17_14.183323
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-29T09:17:14.183323.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-29T09:17:14.183323.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_08_29T09_17_14.183323
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-29T09:17:14.183323.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-29T09:17:14.183323.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_08_29T09_17_14.183323
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-29T09:17:14.183323.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-29T09:17:14.183323.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_08_29T09_17_14.183323
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-29T09:17:14.183323.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-29T09:17:14.183323.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_08_29T09_17_14.183323
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-29T09:17:14.183323.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-29T09:17:14.183323.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_08_29T09_17_14.183323
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-29T09:17:14.183323.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-29T09:17:14.183323.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_08_29T09_17_14.183323
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-29T09:17:14.183323.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-29T09:17:14.183323.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_08_29T09_17_14.183323
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-29T09:17:14.183323.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-29T09:17:14.183323.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_08_29T09_17_14.183323
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-29T09:17:14.183323.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-29T09:17:14.183323.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_08_29T09_17_14.183323
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-29T09:17:14.183323.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-29T09:17:14.183323.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_08_29T09_17_14.183323
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-29T09:17:14.183323.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-29T09:17:14.183323.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_08_29T09_17_14.183323
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-29T09:17:14.183323.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-29T09:17:14.183323.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_08_29T09_17_14.183323
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-29T09:17:14.183323.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-29T09:17:14.183323.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_08_29T09_17_14.183323
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-29T09:17:14.183323.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-29T09:17:14.183323.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_08_29T09_17_14.183323
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-29T09:17:14.183323.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-29T09:17:14.183323.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_08_29T09_17_14.183323
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-29T09:17:14.183323.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-29T09:17:14.183323.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_08_29T09_17_14.183323
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-29T09:17:14.183323.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-29T09:17:14.183323.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_08_29T09_17_14.183323
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-29T09:17:14.183323.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-29T09:17:14.183323.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_08_29T09_17_14.183323
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-29T09:17:14.183323.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-29T09:17:14.183323.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_08_29T09_17_14.183323
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-29T09:17:14.183323.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-29T09:17:14.183323.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_08_29T09_17_14.183323
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-29T09:17:14.183323.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-29T09:17:14.183323.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_08_29T09_17_14.183323
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-29T09:17:14.183323.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-29T09:17:14.183323.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_08_29T09_17_14.183323
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-29T09:17:14.183323.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-29T09:17:14.183323.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_08_29T09_17_14.183323
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-29T09:17:14.183323.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-29T09:17:14.183323.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_08_29T09_17_14.183323
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-29T09:17:14.183323.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-29T09:17:14.183323.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_08_29T09_17_14.183323
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-29T09:17:14.183323.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-29T09:17:14.183323.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_08_29T09_17_14.183323
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-29T09:17:14.183323.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-29T09:17:14.183323.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_08_29T09_17_14.183323
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-29T09:17:14.183323.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-29T09:17:14.183323.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_08_29T09_17_14.183323
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-29T09:17:14.183323.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-29T09:17:14.183323.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_08_29T09_17_14.183323
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-29T09:17:14.183323.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-29T09:17:14.183323.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_08_29T09_17_14.183323
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-29T09:17:14.183323.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-29T09:17:14.183323.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_08_29T09_17_14.183323
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-29T09:17:14.183323.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-29T09:17:14.183323.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_08_29T09_17_14.183323
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-29T09:17:14.183323.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-29T09:17:14.183323.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_08_29T09_17_14.183323
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-29T09:17:14.183323.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-29T09:17:14.183323.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_08_29T09_17_14.183323
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-29T09:17:14.183323.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-29T09:17:14.183323.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_08_29T09_17_14.183323
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-29T09:17:14.183323.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-29T09:17:14.183323.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_08_29T09_17_14.183323
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-29T09:17:14.183323.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-29T09:17:14.183323.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_08_29T09_17_14.183323
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-29T09:17:14.183323.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-29T09:17:14.183323.parquet'
- config_name: results
data_files:
- split: 2023_08_29T09_17_14.183323
path:
- results_2023-08-29T09:17:14.183323.parquet
- split: latest
path:
- results_2023-08-29T09:17:14.183323.parquet
---
# Dataset Card for Evaluation run of yeontaek/llama-2-13B-ensemble-v5
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/yeontaek/llama-2-13B-ensemble-v5
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [yeontaek/llama-2-13B-ensemble-v5](https://huggingface.co/yeontaek/llama-2-13B-ensemble-v5) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_yeontaek__llama-2-13B-ensemble-v5",
"harness_truthfulqa_mc_0",
split="train")
```
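Each configuration's timestamped split encodes the run time with `:` and `-` replaced by `_` (e.g. `2023_08_29T09_17_14.183323`). As a small illustrative sketch (not part of the `datasets` API), the name can be parsed back into a `datetime` like so:

```python
from datetime import datetime

def parse_split_timestamp(split_name: str) -> datetime:
    """Parse a run-timestamp split name such as '2023_08_29T09_17_14.183323'."""
    return datetime.strptime(split_name, "%Y_%m_%dT%H_%M_%S.%f")

ts = parse_split_timestamp("2023_08_29T09_17_14.183323")
```

This can be handy for sorting or comparing runs when more than one timestamped split is present.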
## Latest results
These are the [latest results from run 2023-08-29T09:17:14.183323](https://huggingface.co/datasets/open-llm-leaderboard/details_yeontaek__llama-2-13B-ensemble-v5/blob/main/results_2023-08-29T09%3A17%3A14.183323.json):
```python
{
"all": {
"acc": 0.5953117661059801,
"acc_stderr": 0.03391896483304526,
"acc_norm": 0.5994365516843435,
"acc_norm_stderr": 0.033896234769528244,
"mc1": 0.3843329253365973,
"mc1_stderr": 0.017028707301245203,
"mc2": 0.5327328500103707,
"mc2_stderr": 0.015551697577870274
},
"harness|arc:challenge|25": {
"acc": 0.5844709897610921,
"acc_stderr": 0.014401366641216388,
"acc_norm": 0.6262798634812287,
"acc_norm_stderr": 0.014137708601759084
},
"harness|hellaswag|10": {
"acc": 0.6290579565823541,
"acc_stderr": 0.004820697457420419,
"acc_norm": 0.8306114319856602,
"acc_norm_stderr": 0.0037432817493736324
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.27,
"acc_stderr": 0.044619604333847394,
"acc_norm": 0.27,
"acc_norm_stderr": 0.044619604333847394
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.5185185185185185,
"acc_stderr": 0.043163785995113245,
"acc_norm": 0.5185185185185185,
"acc_norm_stderr": 0.043163785995113245
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.631578947368421,
"acc_stderr": 0.03925523381052932,
"acc_norm": 0.631578947368421,
"acc_norm_stderr": 0.03925523381052932
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.59,
"acc_stderr": 0.04943110704237102,
"acc_norm": 0.59,
"acc_norm_stderr": 0.04943110704237102
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6264150943396226,
"acc_stderr": 0.029773082713319875,
"acc_norm": 0.6264150943396226,
"acc_norm_stderr": 0.029773082713319875
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.6666666666666666,
"acc_stderr": 0.03942082639927213,
"acc_norm": 0.6666666666666666,
"acc_norm_stderr": 0.03942082639927213
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.42,
"acc_stderr": 0.04960449637488584,
"acc_norm": 0.42,
"acc_norm_stderr": 0.04960449637488584
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.46,
"acc_stderr": 0.05009082659620332,
"acc_norm": 0.46,
"acc_norm_stderr": 0.05009082659620332
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.33,
"acc_stderr": 0.047258156262526045,
"acc_norm": 0.33,
"acc_norm_stderr": 0.047258156262526045
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.5722543352601156,
"acc_stderr": 0.037724468575180255,
"acc_norm": 0.5722543352601156,
"acc_norm_stderr": 0.037724468575180255
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.3627450980392157,
"acc_stderr": 0.04784060704105653,
"acc_norm": 0.3627450980392157,
"acc_norm_stderr": 0.04784060704105653
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.68,
"acc_stderr": 0.04688261722621505,
"acc_norm": 0.68,
"acc_norm_stderr": 0.04688261722621505
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.49361702127659574,
"acc_stderr": 0.032683358999363366,
"acc_norm": 0.49361702127659574,
"acc_norm_stderr": 0.032683358999363366
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.3333333333333333,
"acc_stderr": 0.044346007015849245,
"acc_norm": 0.3333333333333333,
"acc_norm_stderr": 0.044346007015849245
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5241379310344828,
"acc_stderr": 0.0416180850350153,
"acc_norm": 0.5241379310344828,
"acc_norm_stderr": 0.0416180850350153
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.36243386243386244,
"acc_stderr": 0.02475747390275206,
"acc_norm": 0.36243386243386244,
"acc_norm_stderr": 0.02475747390275206
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.3968253968253968,
"acc_stderr": 0.043758884927270605,
"acc_norm": 0.3968253968253968,
"acc_norm_stderr": 0.043758884927270605
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.37,
"acc_stderr": 0.048523658709391,
"acc_norm": 0.37,
"acc_norm_stderr": 0.048523658709391
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.6806451612903226,
"acc_stderr": 0.026522709674667765,
"acc_norm": 0.6806451612903226,
"acc_norm_stderr": 0.026522709674667765
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.4729064039408867,
"acc_stderr": 0.03512819077876106,
"acc_norm": 0.4729064039408867,
"acc_norm_stderr": 0.03512819077876106
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.6,
"acc_stderr": 0.04923659639173309,
"acc_norm": 0.6,
"acc_norm_stderr": 0.04923659639173309
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7212121212121212,
"acc_stderr": 0.03501438706296781,
"acc_norm": 0.7212121212121212,
"acc_norm_stderr": 0.03501438706296781
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7676767676767676,
"acc_stderr": 0.030088629490217487,
"acc_norm": 0.7676767676767676,
"acc_norm_stderr": 0.030088629490217487
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8756476683937824,
"acc_stderr": 0.023814477086593552,
"acc_norm": 0.8756476683937824,
"acc_norm_stderr": 0.023814477086593552
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6076923076923076,
"acc_stderr": 0.02475600038213095,
"acc_norm": 0.6076923076923076,
"acc_norm_stderr": 0.02475600038213095
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.3333333333333333,
"acc_stderr": 0.028742040903948496,
"acc_norm": 0.3333333333333333,
"acc_norm_stderr": 0.028742040903948496
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.634453781512605,
"acc_stderr": 0.031282177063684614,
"acc_norm": 0.634453781512605,
"acc_norm_stderr": 0.031282177063684614
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.3443708609271523,
"acc_stderr": 0.03879687024073327,
"acc_norm": 0.3443708609271523,
"acc_norm_stderr": 0.03879687024073327
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8036697247706422,
"acc_stderr": 0.017030719339154336,
"acc_norm": 0.8036697247706422,
"acc_norm_stderr": 0.017030719339154336
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.44907407407407407,
"acc_stderr": 0.03392238405321616,
"acc_norm": 0.44907407407407407,
"acc_norm_stderr": 0.03392238405321616
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8382352941176471,
"acc_stderr": 0.025845017986926917,
"acc_norm": 0.8382352941176471,
"acc_norm_stderr": 0.025845017986926917
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7679324894514767,
"acc_stderr": 0.02747974455080852,
"acc_norm": 0.7679324894514767,
"acc_norm_stderr": 0.02747974455080852
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6367713004484304,
"acc_stderr": 0.032277904428505,
"acc_norm": 0.6367713004484304,
"acc_norm_stderr": 0.032277904428505
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.6717557251908397,
"acc_stderr": 0.041184385658062976,
"acc_norm": 0.6717557251908397,
"acc_norm_stderr": 0.041184385658062976
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.71900826446281,
"acc_stderr": 0.04103203830514512,
"acc_norm": 0.71900826446281,
"acc_norm_stderr": 0.04103203830514512
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7407407407407407,
"acc_stderr": 0.042365112580946336,
"acc_norm": 0.7407407407407407,
"acc_norm_stderr": 0.042365112580946336
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7239263803680982,
"acc_stderr": 0.035123852837050475,
"acc_norm": 0.7239263803680982,
"acc_norm_stderr": 0.035123852837050475
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.36607142857142855,
"acc_stderr": 0.045723723587374296,
"acc_norm": 0.36607142857142855,
"acc_norm_stderr": 0.045723723587374296
},
"harness|hendrycksTest-management|5": {
"acc": 0.7378640776699029,
"acc_stderr": 0.04354631077260595,
"acc_norm": 0.7378640776699029,
"acc_norm_stderr": 0.04354631077260595
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8504273504273504,
"acc_stderr": 0.02336505149175372,
"acc_norm": 0.8504273504273504,
"acc_norm_stderr": 0.02336505149175372
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.6,
"acc_stderr": 0.049236596391733084,
"acc_norm": 0.6,
"acc_norm_stderr": 0.049236596391733084
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.7879948914431673,
"acc_stderr": 0.014616099385833685,
"acc_norm": 0.7879948914431673,
"acc_norm_stderr": 0.014616099385833685
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.6416184971098265,
"acc_stderr": 0.025816756791584194,
"acc_norm": 0.6416184971098265,
"acc_norm_stderr": 0.025816756791584194
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.4849162011173184,
"acc_stderr": 0.01671489037999606,
"acc_norm": 0.4849162011173184,
"acc_norm_stderr": 0.01671489037999606
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.6405228758169934,
"acc_stderr": 0.027475969910660952,
"acc_norm": 0.6405228758169934,
"acc_norm_stderr": 0.027475969910660952
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7009646302250804,
"acc_stderr": 0.02600330111788514,
"acc_norm": 0.7009646302250804,
"acc_norm_stderr": 0.02600330111788514
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7222222222222222,
"acc_stderr": 0.024922001168886335,
"acc_norm": 0.7222222222222222,
"acc_norm_stderr": 0.024922001168886335
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.4929078014184397,
"acc_stderr": 0.02982449855912901,
"acc_norm": 0.4929078014184397,
"acc_norm_stderr": 0.02982449855912901
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4654498044328553,
"acc_stderr": 0.012739711554045708,
"acc_norm": 0.4654498044328553,
"acc_norm_stderr": 0.012739711554045708
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.5992647058823529,
"acc_stderr": 0.029768263528933105,
"acc_norm": 0.5992647058823529,
"acc_norm_stderr": 0.029768263528933105
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6029411764705882,
"acc_stderr": 0.019794488900024117,
"acc_norm": 0.6029411764705882,
"acc_norm_stderr": 0.019794488900024117
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6818181818181818,
"acc_stderr": 0.04461272175910507,
"acc_norm": 0.6818181818181818,
"acc_norm_stderr": 0.04461272175910507
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.6530612244897959,
"acc_stderr": 0.030472526026726496,
"acc_norm": 0.6530612244897959,
"acc_norm_stderr": 0.030472526026726496
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.7711442786069652,
"acc_stderr": 0.02970528405677244,
"acc_norm": 0.7711442786069652,
"acc_norm_stderr": 0.02970528405677244
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.83,
"acc_stderr": 0.0377525168068637,
"acc_norm": 0.83,
"acc_norm_stderr": 0.0377525168068637
},
"harness|hendrycksTest-virology|5": {
"acc": 0.4819277108433735,
"acc_stderr": 0.03889951252827217,
"acc_norm": 0.4819277108433735,
"acc_norm_stderr": 0.03889951252827217
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8011695906432749,
"acc_stderr": 0.030611116557432528,
"acc_norm": 0.8011695906432749,
"acc_norm_stderr": 0.030611116557432528
},
"harness|truthfulqa:mc|0": {
"mc1": 0.3843329253365973,
"mc1_stderr": 0.017028707301245203,
"mc2": 0.5327328500103707,
"mc2_stderr": 0.015551697577870274
}
}
```
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
open-llm-leaderboard/details_FredrikBL__test-dare | ---
pretty_name: Evaluation run of FredrikBL/test-dare
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [FredrikBL/test-dare](https://huggingface.co/FredrikBL/test-dare) on the [Open\
\ LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_FredrikBL__test-dare\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-03-29T21:19:13.154293](https://huggingface.co/datasets/open-llm-leaderboard/details_FredrikBL__test-dare/blob/main/results_2024-03-29T21-19-13.154293.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6460331293814567,\n\
\ \"acc_stderr\": 0.03224922642756375,\n \"acc_norm\": 0.6478142381132488,\n\
\ \"acc_norm_stderr\": 0.032903872728238165,\n \"mc1\": 0.3598531211750306,\n\
\ \"mc1_stderr\": 0.01680186046667715,\n \"mc2\": 0.5268971269435854,\n\
\ \"mc2_stderr\": 0.015057816486907058\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.6049488054607508,\n \"acc_stderr\": 0.014285898292938169,\n\
\ \"acc_norm\": 0.6459044368600683,\n \"acc_norm_stderr\": 0.01397545412275656\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6526588329018124,\n\
\ \"acc_stderr\": 0.004751522127418455,\n \"acc_norm\": 0.8487353116908982,\n\
\ \"acc_norm_stderr\": 0.003575744098779938\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.38,\n \"acc_stderr\": 0.048783173121456316,\n \
\ \"acc_norm\": 0.38,\n \"acc_norm_stderr\": 0.048783173121456316\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6148148148148148,\n\
\ \"acc_stderr\": 0.04203921040156279,\n \"acc_norm\": 0.6148148148148148,\n\
\ \"acc_norm_stderr\": 0.04203921040156279\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.6973684210526315,\n \"acc_stderr\": 0.03738520676119668,\n\
\ \"acc_norm\": 0.6973684210526315,\n \"acc_norm_stderr\": 0.03738520676119668\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.59,\n\
\ \"acc_stderr\": 0.049431107042371025,\n \"acc_norm\": 0.59,\n \
\ \"acc_norm_stderr\": 0.049431107042371025\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.7169811320754716,\n \"acc_stderr\": 0.027724236492700918,\n\
\ \"acc_norm\": 0.7169811320754716,\n \"acc_norm_stderr\": 0.027724236492700918\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7361111111111112,\n\
\ \"acc_stderr\": 0.03685651095897532,\n \"acc_norm\": 0.7361111111111112,\n\
\ \"acc_norm_stderr\": 0.03685651095897532\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.5,\n \"acc_stderr\": 0.050251890762960605,\n \
\ \"acc_norm\": 0.5,\n \"acc_norm_stderr\": 0.050251890762960605\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.52,\n \"acc_stderr\": 0.050211673156867795,\n \"acc_norm\": 0.52,\n\
\ \"acc_norm_stderr\": 0.050211673156867795\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.36,\n \"acc_stderr\": 0.04824181513244218,\n \
\ \"acc_norm\": 0.36,\n \"acc_norm_stderr\": 0.04824181513244218\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6647398843930635,\n\
\ \"acc_stderr\": 0.03599586301247077,\n \"acc_norm\": 0.6647398843930635,\n\
\ \"acc_norm_stderr\": 0.03599586301247077\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.4019607843137255,\n \"acc_stderr\": 0.048786087144669955,\n\
\ \"acc_norm\": 0.4019607843137255,\n \"acc_norm_stderr\": 0.048786087144669955\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.76,\n \"acc_stderr\": 0.04292346959909283,\n \"acc_norm\": 0.76,\n\
\ \"acc_norm_stderr\": 0.04292346959909283\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.5404255319148936,\n \"acc_stderr\": 0.03257901482099835,\n\
\ \"acc_norm\": 0.5404255319148936,\n \"acc_norm_stderr\": 0.03257901482099835\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.47368421052631576,\n\
\ \"acc_stderr\": 0.046970851366478626,\n \"acc_norm\": 0.47368421052631576,\n\
\ \"acc_norm_stderr\": 0.046970851366478626\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5655172413793104,\n \"acc_stderr\": 0.04130740879555497,\n\
\ \"acc_norm\": 0.5655172413793104,\n \"acc_norm_stderr\": 0.04130740879555497\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.41534391534391535,\n \"acc_stderr\": 0.025379524910778405,\n \"\
acc_norm\": 0.41534391534391535,\n \"acc_norm_stderr\": 0.025379524910778405\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.46825396825396826,\n\
\ \"acc_stderr\": 0.04463112720677171,\n \"acc_norm\": 0.46825396825396826,\n\
\ \"acc_norm_stderr\": 0.04463112720677171\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.37,\n \"acc_stderr\": 0.04852365870939099,\n \
\ \"acc_norm\": 0.37,\n \"acc_norm_stderr\": 0.04852365870939099\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7741935483870968,\n\
\ \"acc_stderr\": 0.023785577884181012,\n \"acc_norm\": 0.7741935483870968,\n\
\ \"acc_norm_stderr\": 0.023785577884181012\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.5270935960591133,\n \"acc_stderr\": 0.03512819077876106,\n\
\ \"acc_norm\": 0.5270935960591133,\n \"acc_norm_stderr\": 0.03512819077876106\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.71,\n \"acc_stderr\": 0.045604802157206845,\n \"acc_norm\"\
: 0.71,\n \"acc_norm_stderr\": 0.045604802157206845\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7878787878787878,\n \"acc_stderr\": 0.03192271569548301,\n\
\ \"acc_norm\": 0.7878787878787878,\n \"acc_norm_stderr\": 0.03192271569548301\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.7929292929292929,\n \"acc_stderr\": 0.02886977846026705,\n \"\
acc_norm\": 0.7929292929292929,\n \"acc_norm_stderr\": 0.02886977846026705\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.9015544041450777,\n \"acc_stderr\": 0.021500249576033463,\n\
\ \"acc_norm\": 0.9015544041450777,\n \"acc_norm_stderr\": 0.021500249576033463\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.658974358974359,\n \"acc_stderr\": 0.02403548967633508,\n \
\ \"acc_norm\": 0.658974358974359,\n \"acc_norm_stderr\": 0.02403548967633508\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.34814814814814815,\n \"acc_stderr\": 0.029045600290616255,\n \
\ \"acc_norm\": 0.34814814814814815,\n \"acc_norm_stderr\": 0.029045600290616255\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.6680672268907563,\n \"acc_stderr\": 0.03058869701378364,\n \
\ \"acc_norm\": 0.6680672268907563,\n \"acc_norm_stderr\": 0.03058869701378364\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.37748344370860926,\n \"acc_stderr\": 0.03958027231121569,\n \"\
acc_norm\": 0.37748344370860926,\n \"acc_norm_stderr\": 0.03958027231121569\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8256880733944955,\n \"acc_stderr\": 0.016265675632010347,\n \"\
acc_norm\": 0.8256880733944955,\n \"acc_norm_stderr\": 0.016265675632010347\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.5046296296296297,\n \"acc_stderr\": 0.03409825519163572,\n \"\
acc_norm\": 0.5046296296296297,\n \"acc_norm_stderr\": 0.03409825519163572\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.8137254901960784,\n \"acc_stderr\": 0.027325470966716312,\n \"\
acc_norm\": 0.8137254901960784,\n \"acc_norm_stderr\": 0.027325470966716312\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.8059071729957806,\n \"acc_stderr\": 0.02574490253229092,\n \
\ \"acc_norm\": 0.8059071729957806,\n \"acc_norm_stderr\": 0.02574490253229092\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6771300448430493,\n\
\ \"acc_stderr\": 0.031381476375754995,\n \"acc_norm\": 0.6771300448430493,\n\
\ \"acc_norm_stderr\": 0.031381476375754995\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.7938931297709924,\n \"acc_stderr\": 0.03547771004159464,\n\
\ \"acc_norm\": 0.7938931297709924,\n \"acc_norm_stderr\": 0.03547771004159464\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.7768595041322314,\n \"acc_stderr\": 0.03800754475228733,\n \"\
acc_norm\": 0.7768595041322314,\n \"acc_norm_stderr\": 0.03800754475228733\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.8240740740740741,\n\
\ \"acc_stderr\": 0.036809181416738807,\n \"acc_norm\": 0.8240740740740741,\n\
\ \"acc_norm_stderr\": 0.036809181416738807\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7914110429447853,\n \"acc_stderr\": 0.031921934489347235,\n\
\ \"acc_norm\": 0.7914110429447853,\n \"acc_norm_stderr\": 0.031921934489347235\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.44642857142857145,\n\
\ \"acc_stderr\": 0.04718471485219588,\n \"acc_norm\": 0.44642857142857145,\n\
\ \"acc_norm_stderr\": 0.04718471485219588\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7572815533980582,\n \"acc_stderr\": 0.04245022486384495,\n\
\ \"acc_norm\": 0.7572815533980582,\n \"acc_norm_stderr\": 0.04245022486384495\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8589743589743589,\n\
\ \"acc_stderr\": 0.02280138253459754,\n \"acc_norm\": 0.8589743589743589,\n\
\ \"acc_norm_stderr\": 0.02280138253459754\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.7,\n \"acc_stderr\": 0.046056618647183814,\n \
\ \"acc_norm\": 0.7,\n \"acc_norm_stderr\": 0.046056618647183814\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8122605363984674,\n\
\ \"acc_stderr\": 0.013964393769899126,\n \"acc_norm\": 0.8122605363984674,\n\
\ \"acc_norm_stderr\": 0.013964393769899126\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.7109826589595376,\n \"acc_stderr\": 0.02440517393578323,\n\
\ \"acc_norm\": 0.7109826589595376,\n \"acc_norm_stderr\": 0.02440517393578323\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.3486033519553073,\n\
\ \"acc_stderr\": 0.015937484656687033,\n \"acc_norm\": 0.3486033519553073,\n\
\ \"acc_norm_stderr\": 0.015937484656687033\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.7287581699346405,\n \"acc_stderr\": 0.025457756696667885,\n\
\ \"acc_norm\": 0.7287581699346405,\n \"acc_norm_stderr\": 0.025457756696667885\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7202572347266881,\n\
\ \"acc_stderr\": 0.025494259350694912,\n \"acc_norm\": 0.7202572347266881,\n\
\ \"acc_norm_stderr\": 0.025494259350694912\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.7376543209876543,\n \"acc_stderr\": 0.024477222856135114,\n\
\ \"acc_norm\": 0.7376543209876543,\n \"acc_norm_stderr\": 0.024477222856135114\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.4858156028368794,\n \"acc_stderr\": 0.02981549448368206,\n \
\ \"acc_norm\": 0.4858156028368794,\n \"acc_norm_stderr\": 0.02981549448368206\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.46284224250325945,\n\
\ \"acc_stderr\": 0.012734923579532067,\n \"acc_norm\": 0.46284224250325945,\n\
\ \"acc_norm_stderr\": 0.012734923579532067\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.6580882352941176,\n \"acc_stderr\": 0.028814722422254184,\n\
\ \"acc_norm\": 0.6580882352941176,\n \"acc_norm_stderr\": 0.028814722422254184\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.673202614379085,\n \"acc_stderr\": 0.01897542792050721,\n \
\ \"acc_norm\": 0.673202614379085,\n \"acc_norm_stderr\": 0.01897542792050721\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6636363636363637,\n\
\ \"acc_stderr\": 0.04525393596302506,\n \"acc_norm\": 0.6636363636363637,\n\
\ \"acc_norm_stderr\": 0.04525393596302506\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.7224489795918367,\n \"acc_stderr\": 0.028666857790274648,\n\
\ \"acc_norm\": 0.7224489795918367,\n \"acc_norm_stderr\": 0.028666857790274648\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8557213930348259,\n\
\ \"acc_stderr\": 0.024845753212306046,\n \"acc_norm\": 0.8557213930348259,\n\
\ \"acc_norm_stderr\": 0.024845753212306046\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.87,\n \"acc_stderr\": 0.033799766898963086,\n \
\ \"acc_norm\": 0.87,\n \"acc_norm_stderr\": 0.033799766898963086\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5481927710843374,\n\
\ \"acc_stderr\": 0.03874371556587953,\n \"acc_norm\": 0.5481927710843374,\n\
\ \"acc_norm_stderr\": 0.03874371556587953\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8304093567251462,\n \"acc_stderr\": 0.02878210810540171,\n\
\ \"acc_norm\": 0.8304093567251462,\n \"acc_norm_stderr\": 0.02878210810540171\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.3598531211750306,\n\
\ \"mc1_stderr\": 0.01680186046667715,\n \"mc2\": 0.5268971269435854,\n\
\ \"mc2_stderr\": 0.015057816486907058\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.8129439621152328,\n \"acc_stderr\": 0.010959716435242912\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.6110689916603488,\n \
\ \"acc_stderr\": 0.01342838248127422\n }\n}\n```"
repo_url: https://huggingface.co/FredrikBL/test-dare
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_03_29T21_19_13.154293
path:
- '**/details_harness|arc:challenge|25_2024-03-29T21-19-13.154293.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-03-29T21-19-13.154293.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_03_29T21_19_13.154293
path:
- '**/details_harness|gsm8k|5_2024-03-29T21-19-13.154293.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-03-29T21-19-13.154293.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_03_29T21_19_13.154293
path:
- '**/details_harness|hellaswag|10_2024-03-29T21-19-13.154293.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-03-29T21-19-13.154293.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_03_29T21_19_13.154293
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-29T21-19-13.154293.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-29T21-19-13.154293.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-29T21-19-13.154293.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-29T21-19-13.154293.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-29T21-19-13.154293.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-29T21-19-13.154293.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-29T21-19-13.154293.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-29T21-19-13.154293.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-29T21-19-13.154293.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-29T21-19-13.154293.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-29T21-19-13.154293.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-29T21-19-13.154293.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-29T21-19-13.154293.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-29T21-19-13.154293.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-29T21-19-13.154293.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-29T21-19-13.154293.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-29T21-19-13.154293.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-29T21-19-13.154293.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-29T21-19-13.154293.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-29T21-19-13.154293.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-29T21-19-13.154293.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-29T21-19-13.154293.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-29T21-19-13.154293.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-29T21-19-13.154293.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-29T21-19-13.154293.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-29T21-19-13.154293.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-29T21-19-13.154293.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-29T21-19-13.154293.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-29T21-19-13.154293.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-29T21-19-13.154293.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-29T21-19-13.154293.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-29T21-19-13.154293.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-29T21-19-13.154293.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-29T21-19-13.154293.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-29T21-19-13.154293.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-29T21-19-13.154293.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-29T21-19-13.154293.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-29T21-19-13.154293.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-29T21-19-13.154293.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-29T21-19-13.154293.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-29T21-19-13.154293.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-29T21-19-13.154293.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-29T21-19-13.154293.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-29T21-19-13.154293.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-29T21-19-13.154293.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-29T21-19-13.154293.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-29T21-19-13.154293.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-29T21-19-13.154293.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-29T21-19-13.154293.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-29T21-19-13.154293.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-29T21-19-13.154293.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-29T21-19-13.154293.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-29T21-19-13.154293.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-29T21-19-13.154293.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-29T21-19-13.154293.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-29T21-19-13.154293.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-29T21-19-13.154293.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-29T21-19-13.154293.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-29T21-19-13.154293.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-29T21-19-13.154293.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-29T21-19-13.154293.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-29T21-19-13.154293.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-29T21-19-13.154293.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-29T21-19-13.154293.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-29T21-19-13.154293.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-29T21-19-13.154293.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-29T21-19-13.154293.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-29T21-19-13.154293.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-29T21-19-13.154293.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-29T21-19-13.154293.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-29T21-19-13.154293.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-29T21-19-13.154293.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-29T21-19-13.154293.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-29T21-19-13.154293.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-29T21-19-13.154293.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-29T21-19-13.154293.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-29T21-19-13.154293.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-29T21-19-13.154293.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-29T21-19-13.154293.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-29T21-19-13.154293.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-29T21-19-13.154293.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-29T21-19-13.154293.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-29T21-19-13.154293.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-29T21-19-13.154293.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-29T21-19-13.154293.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-29T21-19-13.154293.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-29T21-19-13.154293.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-29T21-19-13.154293.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-29T21-19-13.154293.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-29T21-19-13.154293.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-29T21-19-13.154293.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-29T21-19-13.154293.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-29T21-19-13.154293.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-29T21-19-13.154293.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-29T21-19-13.154293.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-29T21-19-13.154293.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-29T21-19-13.154293.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-29T21-19-13.154293.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-29T21-19-13.154293.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-29T21-19-13.154293.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-29T21-19-13.154293.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-29T21-19-13.154293.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-29T21-19-13.154293.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-29T21-19-13.154293.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-29T21-19-13.154293.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-29T21-19-13.154293.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-29T21-19-13.154293.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-29T21-19-13.154293.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-29T21-19-13.154293.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-29T21-19-13.154293.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-29T21-19-13.154293.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-29T21-19-13.154293.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-29T21-19-13.154293.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-29T21-19-13.154293.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_03_29T21_19_13.154293
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-29T21-19-13.154293.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-29T21-19-13.154293.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_03_29T21_19_13.154293
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-29T21-19-13.154293.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-29T21-19-13.154293.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_03_29T21_19_13.154293
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-29T21-19-13.154293.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-29T21-19-13.154293.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_03_29T21_19_13.154293
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-29T21-19-13.154293.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-29T21-19-13.154293.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_03_29T21_19_13.154293
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-29T21-19-13.154293.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-29T21-19-13.154293.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_03_29T21_19_13.154293
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-29T21-19-13.154293.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-29T21-19-13.154293.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_03_29T21_19_13.154293
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-29T21-19-13.154293.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-29T21-19-13.154293.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_03_29T21_19_13.154293
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-29T21-19-13.154293.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-29T21-19-13.154293.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_03_29T21_19_13.154293
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-29T21-19-13.154293.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-29T21-19-13.154293.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_03_29T21_19_13.154293
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-29T21-19-13.154293.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-29T21-19-13.154293.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_03_29T21_19_13.154293
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-29T21-19-13.154293.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-29T21-19-13.154293.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_03_29T21_19_13.154293
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-29T21-19-13.154293.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-29T21-19-13.154293.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_03_29T21_19_13.154293
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-29T21-19-13.154293.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-29T21-19-13.154293.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_03_29T21_19_13.154293
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-29T21-19-13.154293.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-29T21-19-13.154293.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_03_29T21_19_13.154293
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-29T21-19-13.154293.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-29T21-19-13.154293.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_03_29T21_19_13.154293
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-29T21-19-13.154293.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-29T21-19-13.154293.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_03_29T21_19_13.154293
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-29T21-19-13.154293.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-29T21-19-13.154293.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_03_29T21_19_13.154293
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-29T21-19-13.154293.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-29T21-19-13.154293.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_03_29T21_19_13.154293
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-29T21-19-13.154293.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-29T21-19-13.154293.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_03_29T21_19_13.154293
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-29T21-19-13.154293.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-29T21-19-13.154293.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_03_29T21_19_13.154293
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-29T21-19-13.154293.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-29T21-19-13.154293.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_03_29T21_19_13.154293
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-29T21-19-13.154293.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-29T21-19-13.154293.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_03_29T21_19_13.154293
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-29T21-19-13.154293.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-29T21-19-13.154293.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_03_29T21_19_13.154293
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-29T21-19-13.154293.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-29T21-19-13.154293.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_03_29T21_19_13.154293
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-29T21-19-13.154293.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-29T21-19-13.154293.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_03_29T21_19_13.154293
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-29T21-19-13.154293.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-29T21-19-13.154293.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_03_29T21_19_13.154293
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-29T21-19-13.154293.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-29T21-19-13.154293.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_03_29T21_19_13.154293
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-29T21-19-13.154293.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-29T21-19-13.154293.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_03_29T21_19_13.154293
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-29T21-19-13.154293.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-29T21-19-13.154293.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_03_29T21_19_13.154293
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-29T21-19-13.154293.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-29T21-19-13.154293.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_03_29T21_19_13.154293
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-29T21-19-13.154293.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-29T21-19-13.154293.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_03_29T21_19_13.154293
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-29T21-19-13.154293.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-29T21-19-13.154293.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_03_29T21_19_13.154293
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-29T21-19-13.154293.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-29T21-19-13.154293.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_03_29T21_19_13.154293
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-29T21-19-13.154293.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-29T21-19-13.154293.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_03_29T21_19_13.154293
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-29T21-19-13.154293.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-29T21-19-13.154293.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_03_29T21_19_13.154293
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-29T21-19-13.154293.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-29T21-19-13.154293.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_03_29T21_19_13.154293
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-29T21-19-13.154293.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-29T21-19-13.154293.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_03_29T21_19_13.154293
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-29T21-19-13.154293.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-29T21-19-13.154293.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_03_29T21_19_13.154293
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-29T21-19-13.154293.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-29T21-19-13.154293.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_03_29T21_19_13.154293
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-29T21-19-13.154293.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-29T21-19-13.154293.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_03_29T21_19_13.154293
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-29T21-19-13.154293.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-29T21-19-13.154293.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_03_29T21_19_13.154293
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-29T21-19-13.154293.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-29T21-19-13.154293.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_03_29T21_19_13.154293
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-29T21-19-13.154293.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-29T21-19-13.154293.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_03_29T21_19_13.154293
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-29T21-19-13.154293.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-29T21-19-13.154293.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_03_29T21_19_13.154293
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-29T21-19-13.154293.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-29T21-19-13.154293.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_03_29T21_19_13.154293
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-29T21-19-13.154293.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-29T21-19-13.154293.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_03_29T21_19_13.154293
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-29T21-19-13.154293.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-29T21-19-13.154293.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_03_29T21_19_13.154293
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-29T21-19-13.154293.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-29T21-19-13.154293.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_03_29T21_19_13.154293
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-29T21-19-13.154293.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-29T21-19-13.154293.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_03_29T21_19_13.154293
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-29T21-19-13.154293.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-29T21-19-13.154293.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_03_29T21_19_13.154293
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-29T21-19-13.154293.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-29T21-19-13.154293.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_03_29T21_19_13.154293
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-29T21-19-13.154293.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-29T21-19-13.154293.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_03_29T21_19_13.154293
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-29T21-19-13.154293.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-29T21-19-13.154293.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_03_29T21_19_13.154293
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-29T21-19-13.154293.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-29T21-19-13.154293.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_03_29T21_19_13.154293
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-29T21-19-13.154293.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-29T21-19-13.154293.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_03_29T21_19_13.154293
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-29T21-19-13.154293.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-29T21-19-13.154293.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_03_29T21_19_13.154293
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-29T21-19-13.154293.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-29T21-19-13.154293.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_03_29T21_19_13.154293
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-29T21-19-13.154293.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-29T21-19-13.154293.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_03_29T21_19_13.154293
path:
- '**/details_harness|winogrande|5_2024-03-29T21-19-13.154293.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-03-29T21-19-13.154293.parquet'
- config_name: results
data_files:
- split: 2024_03_29T21_19_13.154293
path:
- results_2024-03-29T21-19-13.154293.parquet
- split: latest
path:
- results_2024-03-29T21-19-13.154293.parquet
---
# Dataset Card for Evaluation run of FredrikBL/test-dare
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [FredrikBL/test-dare](https://huggingface.co/FredrikBL/test-dare) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_FredrikBL__test-dare",
"harness_winogrande_5",
split="train")
```
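As noted above, each run's split is named after the run timestamp, with the `:` and `-` characters of the ISO timestamp replaced by `_` (e.g. `2024_03_29T21_19_13.154293`). A minimal helper, written here as an illustration and not part of the leaderboard tooling, can recover the original timestamp from such a split name:

```python
from datetime import datetime

def split_name_to_timestamp(split_name: str) -> datetime:
    # Split names encode the run timestamp with ':' and '-' replaced by '_',
    # e.g. "2024_03_29T21_19_13.154293" -> 2024-03-29 21:19:13.154293.
    return datetime.strptime(split_name, "%Y_%m_%dT%H_%M_%S.%f")

print(split_name_to_timestamp("2024_03_29T21_19_13.154293"))
# 2024-03-29 21:19:13.154293
```

This can be useful for sorting timestamped splits chronologically when a configuration accumulates several runs; the `latest` split is simply an alias for the most recent one.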
## Latest results
These are the [latest results from run 2024-03-29T21:19:13.154293](https://huggingface.co/datasets/open-llm-leaderboard/details_FredrikBL__test-dare/blob/main/results_2024-03-29T21-19-13.154293.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6460331293814567,
"acc_stderr": 0.03224922642756375,
"acc_norm": 0.6478142381132488,
"acc_norm_stderr": 0.032903872728238165,
"mc1": 0.3598531211750306,
"mc1_stderr": 0.01680186046667715,
"mc2": 0.5268971269435854,
"mc2_stderr": 0.015057816486907058
},
"harness|arc:challenge|25": {
"acc": 0.6049488054607508,
"acc_stderr": 0.014285898292938169,
"acc_norm": 0.6459044368600683,
"acc_norm_stderr": 0.01397545412275656
},
"harness|hellaswag|10": {
"acc": 0.6526588329018124,
"acc_stderr": 0.004751522127418455,
"acc_norm": 0.8487353116908982,
"acc_norm_stderr": 0.003575744098779938
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.38,
"acc_stderr": 0.048783173121456316,
"acc_norm": 0.38,
"acc_norm_stderr": 0.048783173121456316
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6148148148148148,
"acc_stderr": 0.04203921040156279,
"acc_norm": 0.6148148148148148,
"acc_norm_stderr": 0.04203921040156279
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6973684210526315,
"acc_stderr": 0.03738520676119668,
"acc_norm": 0.6973684210526315,
"acc_norm_stderr": 0.03738520676119668
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.59,
"acc_stderr": 0.049431107042371025,
"acc_norm": 0.59,
"acc_norm_stderr": 0.049431107042371025
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.7169811320754716,
"acc_stderr": 0.027724236492700918,
"acc_norm": 0.7169811320754716,
"acc_norm_stderr": 0.027724236492700918
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7361111111111112,
"acc_stderr": 0.03685651095897532,
"acc_norm": 0.7361111111111112,
"acc_norm_stderr": 0.03685651095897532
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.5,
"acc_stderr": 0.050251890762960605,
"acc_norm": 0.5,
"acc_norm_stderr": 0.050251890762960605
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.52,
"acc_stderr": 0.050211673156867795,
"acc_norm": 0.52,
"acc_norm_stderr": 0.050211673156867795
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.36,
"acc_stderr": 0.04824181513244218,
"acc_norm": 0.36,
"acc_norm_stderr": 0.04824181513244218
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6647398843930635,
"acc_stderr": 0.03599586301247077,
"acc_norm": 0.6647398843930635,
"acc_norm_stderr": 0.03599586301247077
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.4019607843137255,
"acc_stderr": 0.048786087144669955,
"acc_norm": 0.4019607843137255,
"acc_norm_stderr": 0.048786087144669955
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.76,
"acc_stderr": 0.04292346959909283,
"acc_norm": 0.76,
"acc_norm_stderr": 0.04292346959909283
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5404255319148936,
"acc_stderr": 0.03257901482099835,
"acc_norm": 0.5404255319148936,
"acc_norm_stderr": 0.03257901482099835
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.47368421052631576,
"acc_stderr": 0.046970851366478626,
"acc_norm": 0.47368421052631576,
"acc_norm_stderr": 0.046970851366478626
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5655172413793104,
"acc_stderr": 0.04130740879555497,
"acc_norm": 0.5655172413793104,
"acc_norm_stderr": 0.04130740879555497
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.41534391534391535,
"acc_stderr": 0.025379524910778405,
"acc_norm": 0.41534391534391535,
"acc_norm_stderr": 0.025379524910778405
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.46825396825396826,
"acc_stderr": 0.04463112720677171,
"acc_norm": 0.46825396825396826,
"acc_norm_stderr": 0.04463112720677171
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.37,
"acc_stderr": 0.04852365870939099,
"acc_norm": 0.37,
"acc_norm_stderr": 0.04852365870939099
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7741935483870968,
"acc_stderr": 0.023785577884181012,
"acc_norm": 0.7741935483870968,
"acc_norm_stderr": 0.023785577884181012
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5270935960591133,
"acc_stderr": 0.03512819077876106,
"acc_norm": 0.5270935960591133,
"acc_norm_stderr": 0.03512819077876106
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.71,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.71,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7878787878787878,
"acc_stderr": 0.03192271569548301,
"acc_norm": 0.7878787878787878,
"acc_norm_stderr": 0.03192271569548301
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7929292929292929,
"acc_stderr": 0.02886977846026705,
"acc_norm": 0.7929292929292929,
"acc_norm_stderr": 0.02886977846026705
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.9015544041450777,
"acc_stderr": 0.021500249576033463,
"acc_norm": 0.9015544041450777,
"acc_norm_stderr": 0.021500249576033463
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.658974358974359,
"acc_stderr": 0.02403548967633508,
"acc_norm": 0.658974358974359,
"acc_norm_stderr": 0.02403548967633508
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.34814814814814815,
"acc_stderr": 0.029045600290616255,
"acc_norm": 0.34814814814814815,
"acc_norm_stderr": 0.029045600290616255
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6680672268907563,
"acc_stderr": 0.03058869701378364,
"acc_norm": 0.6680672268907563,
"acc_norm_stderr": 0.03058869701378364
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.37748344370860926,
"acc_stderr": 0.03958027231121569,
"acc_norm": 0.37748344370860926,
"acc_norm_stderr": 0.03958027231121569
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8256880733944955,
"acc_stderr": 0.016265675632010347,
"acc_norm": 0.8256880733944955,
"acc_norm_stderr": 0.016265675632010347
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5046296296296297,
"acc_stderr": 0.03409825519163572,
"acc_norm": 0.5046296296296297,
"acc_norm_stderr": 0.03409825519163572
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8137254901960784,
"acc_stderr": 0.027325470966716312,
"acc_norm": 0.8137254901960784,
"acc_norm_stderr": 0.027325470966716312
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.8059071729957806,
"acc_stderr": 0.02574490253229092,
"acc_norm": 0.8059071729957806,
"acc_norm_stderr": 0.02574490253229092
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6771300448430493,
"acc_stderr": 0.031381476375754995,
"acc_norm": 0.6771300448430493,
"acc_norm_stderr": 0.031381476375754995
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7938931297709924,
"acc_stderr": 0.03547771004159464,
"acc_norm": 0.7938931297709924,
"acc_norm_stderr": 0.03547771004159464
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7768595041322314,
"acc_stderr": 0.03800754475228733,
"acc_norm": 0.7768595041322314,
"acc_norm_stderr": 0.03800754475228733
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.8240740740740741,
"acc_stderr": 0.036809181416738807,
"acc_norm": 0.8240740740740741,
"acc_norm_stderr": 0.036809181416738807
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7914110429447853,
"acc_stderr": 0.031921934489347235,
"acc_norm": 0.7914110429447853,
"acc_norm_stderr": 0.031921934489347235
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.44642857142857145,
"acc_stderr": 0.04718471485219588,
"acc_norm": 0.44642857142857145,
"acc_norm_stderr": 0.04718471485219588
},
"harness|hendrycksTest-management|5": {
"acc": 0.7572815533980582,
"acc_stderr": 0.04245022486384495,
"acc_norm": 0.7572815533980582,
"acc_norm_stderr": 0.04245022486384495
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8589743589743589,
"acc_stderr": 0.02280138253459754,
"acc_norm": 0.8589743589743589,
"acc_norm_stderr": 0.02280138253459754
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.7,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.7,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8122605363984674,
"acc_stderr": 0.013964393769899126,
"acc_norm": 0.8122605363984674,
"acc_norm_stderr": 0.013964393769899126
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7109826589595376,
"acc_stderr": 0.02440517393578323,
"acc_norm": 0.7109826589595376,
"acc_norm_stderr": 0.02440517393578323
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.3486033519553073,
"acc_stderr": 0.015937484656687033,
"acc_norm": 0.3486033519553073,
"acc_norm_stderr": 0.015937484656687033
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7287581699346405,
"acc_stderr": 0.025457756696667885,
"acc_norm": 0.7287581699346405,
"acc_norm_stderr": 0.025457756696667885
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7202572347266881,
"acc_stderr": 0.025494259350694912,
"acc_norm": 0.7202572347266881,
"acc_norm_stderr": 0.025494259350694912
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7376543209876543,
"acc_stderr": 0.024477222856135114,
"acc_norm": 0.7376543209876543,
"acc_norm_stderr": 0.024477222856135114
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.4858156028368794,
"acc_stderr": 0.02981549448368206,
"acc_norm": 0.4858156028368794,
"acc_norm_stderr": 0.02981549448368206
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.46284224250325945,
"acc_stderr": 0.012734923579532067,
"acc_norm": 0.46284224250325945,
"acc_norm_stderr": 0.012734923579532067
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6580882352941176,
"acc_stderr": 0.028814722422254184,
"acc_norm": 0.6580882352941176,
"acc_norm_stderr": 0.028814722422254184
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.673202614379085,
"acc_stderr": 0.01897542792050721,
"acc_norm": 0.673202614379085,
"acc_norm_stderr": 0.01897542792050721
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6636363636363637,
"acc_stderr": 0.04525393596302506,
"acc_norm": 0.6636363636363637,
"acc_norm_stderr": 0.04525393596302506
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7224489795918367,
"acc_stderr": 0.028666857790274648,
"acc_norm": 0.7224489795918367,
"acc_norm_stderr": 0.028666857790274648
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8557213930348259,
"acc_stderr": 0.024845753212306046,
"acc_norm": 0.8557213930348259,
"acc_norm_stderr": 0.024845753212306046
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.87,
"acc_stderr": 0.033799766898963086,
"acc_norm": 0.87,
"acc_norm_stderr": 0.033799766898963086
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5481927710843374,
"acc_stderr": 0.03874371556587953,
"acc_norm": 0.5481927710843374,
"acc_norm_stderr": 0.03874371556587953
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8304093567251462,
"acc_stderr": 0.02878210810540171,
"acc_norm": 0.8304093567251462,
"acc_norm_stderr": 0.02878210810540171
},
"harness|truthfulqa:mc|0": {
"mc1": 0.3598531211750306,
"mc1_stderr": 0.01680186046667715,
"mc2": 0.5268971269435854,
"mc2_stderr": 0.015057816486907058
},
"harness|winogrande|5": {
"acc": 0.8129439621152328,
"acc_stderr": 0.010959716435242912
},
"harness|gsm8k|5": {
"acc": 0.6110689916603488,
"acc_stderr": 0.01342838248127422
}
}
```
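The per-subtask scores above can be combined into a quick macro average. Below is a minimal, purely illustrative sketch using three of the `acc_norm` values reported above; the official leaderboard average covers all 57 MMLU subtasks, so this is not the leaderboard's number:

```python
# Macro-average acc_norm over a few of the MMLU subtasks reported above
# (values copied from the JSON results; illustrative subset only).
results = {
    "international_law": 0.7768595041322314,
    "jurisprudence": 0.8240740740740741,
    "machine_learning": 0.44642857142857145,
}

macro_avg = sum(results.values()) / len(results)
print(round(macro_avg, 4))  # → 0.6825
```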
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
EddyGiusepe/dataset-portuguese-aira-v2-Gemma-format | ---
license: apache-2.0
task_categories:
- question-answering
language:
- pt
pretty_name: dataset-portuguese-aira-v2-Gemma-format
tags:
- alignment
- instruction
- chat
size_categories:
- 10K<n<100K
---
<h1 align="center"><font color="red">Aira Dataset in the Gemma Model Format</font></h1>
# <font color="green">Dataset Summary</font>
This dataset contains a collection of single-turn conversations between an assistant and a user.
The conversations were generated from user interactions with already fine-tuned models (`ChatGPT`, `LLama 2`, `Open-Assistant`, etc.).
The dataset is available in Portuguese (an English version exists but has not been processed yet). Both the Portuguese
and the English versions can be downloaded from the repository of
[Nicholas Kluge Corrêa](https://huggingface.co/datasets/nicholasKluge/instruct-aira-dataset-v2).
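Converting each prompt/completion pair into Gemma's chat template can be sketched as follows. This is a minimal illustration, not this repository's actual conversion script; the turn markers follow Google's published Gemma chat format:

```python
def to_gemma_format(prompt: str, completion: str) -> str:
    """Wrap a single-turn conversation in Gemma's chat-template markers."""
    return (
        f"<start_of_turn>user\n{prompt}<end_of_turn>\n"
        f"<start_of_turn>model\n{completion}<end_of_turn>\n"
    )

# Hypothetical example pair, for illustration only.
print(to_gemma_format("Qual é a capital do Brasil?",
                      "A capital do Brasil é Brasília."))
```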
# <font color="green">Citation Information</font>
```latex
@misc{nicholas22aira,
doi = {10.5281/zenodo.6989727},
url = {https://github.com/Nkluge-correa/Aira},
author = {Nicholas Kluge Corrêa},
title = {Aira},
year = {2023},
publisher = {GitHub},
journal = {GitHub repository},
}
```
Thank God 🤗!
|
Multimodal-Fatima/Food101_train_embeddings | ---
dataset_info:
features:
- name: image
dtype: image
- name: id
dtype: int64
- name: vision_embeddings
sequence: float32
splits:
- name: openai_clip_vit_large_patch14
num_bytes: 4075664187.0
num_examples: 75750
download_size: 4082066204
dataset_size: 4075664187.0
---
# Dataset Card for "Food101_train_embeddings"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Mitsuki-Sakamoto/alpaca_farm-deberta-re-pref-64-fil_self_160m_bo16_2_mix_50_kl_0.1_prm_70m_thr_1.0_seed_1_t_1.0 | ---
dataset_info:
config_name: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1
features:
- name: instruction
dtype: string
- name: input
dtype: string
- name: output
dtype: string
- name: preference
dtype: int64
- name: output_1
dtype: string
- name: output_2
dtype: string
- name: reward_model_prompt_format
dtype: string
- name: gen_prompt_format
dtype: string
- name: gen_kwargs
struct:
- name: do_sample
dtype: bool
- name: max_new_tokens
dtype: int64
- name: pad_token_id
dtype: int64
- name: top_k
dtype: int64
- name: top_p
dtype: float64
- name: reward_1
dtype: float64
- name: reward_2
dtype: float64
- name: n_samples
dtype: int64
- name: reject_select
dtype: string
- name: index
dtype: int64
- name: prompt
dtype: string
- name: chosen
dtype: string
- name: rejected
dtype: string
- name: filtered_epoch
dtype: int64
- name: gen_reward
dtype: float64
- name: gen_response
dtype: string
splits:
- name: epoch_0
num_bytes: 43723341
num_examples: 18928
- name: epoch_1
num_bytes: 44402988
num_examples: 18928
- name: epoch_2
num_bytes: 44484278
num_examples: 18928
- name: epoch_3
num_bytes: 44534921
num_examples: 18928
- name: epoch_4
num_bytes: 44575476
num_examples: 18928
- name: epoch_5
num_bytes: 44571078
num_examples: 18928
- name: epoch_6
num_bytes: 44563832
num_examples: 18928
- name: epoch_7
num_bytes: 44549846
num_examples: 18928
- name: epoch_8
num_bytes: 44549217
num_examples: 18928
- name: epoch_9
num_bytes: 44538911
num_examples: 18928
- name: epoch_10
num_bytes: 44530903
num_examples: 18928
- name: epoch_11
num_bytes: 44533606
num_examples: 18928
- name: epoch_12
num_bytes: 44530661
num_examples: 18928
- name: epoch_13
num_bytes: 44528582
num_examples: 18928
- name: epoch_14
num_bytes: 44526423
num_examples: 18928
- name: epoch_15
num_bytes: 44529476
num_examples: 18928
- name: epoch_16
num_bytes: 44530427
num_examples: 18928
- name: epoch_17
num_bytes: 44526259
num_examples: 18928
- name: epoch_18
num_bytes: 44526277
num_examples: 18928
- name: epoch_19
num_bytes: 44528567
num_examples: 18928
- name: epoch_20
num_bytes: 44526263
num_examples: 18928
- name: epoch_21
num_bytes: 44525909
num_examples: 18928
- name: epoch_22
num_bytes: 44524434
num_examples: 18928
- name: epoch_23
num_bytes: 44523801
num_examples: 18928
- name: epoch_24
num_bytes: 44526188
num_examples: 18928
- name: epoch_25
num_bytes: 44527114
num_examples: 18928
- name: epoch_26
num_bytes: 44529386
num_examples: 18928
- name: epoch_27
num_bytes: 44528020
num_examples: 18928
- name: epoch_28
num_bytes: 44526830
num_examples: 18928
- name: epoch_29
num_bytes: 44529769
num_examples: 18928
download_size: 700608707
dataset_size: 1335052783
configs:
- config_name: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1
data_files:
- split: epoch_0
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_0-*
- split: epoch_1
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_1-*
- split: epoch_2
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_2-*
- split: epoch_3
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_3-*
- split: epoch_4
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_4-*
- split: epoch_5
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_5-*
- split: epoch_6
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_6-*
- split: epoch_7
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_7-*
- split: epoch_8
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_8-*
- split: epoch_9
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_9-*
- split: epoch_10
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_10-*
- split: epoch_11
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_11-*
- split: epoch_12
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_12-*
- split: epoch_13
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_13-*
- split: epoch_14
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_14-*
- split: epoch_15
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_15-*
- split: epoch_16
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_16-*
- split: epoch_17
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_17-*
- split: epoch_18
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_18-*
- split: epoch_19
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_19-*
- split: epoch_20
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_20-*
- split: epoch_21
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_21-*
- split: epoch_22
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_22-*
- split: epoch_23
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_23-*
- split: epoch_24
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_24-*
- split: epoch_25
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_25-*
- split: epoch_26
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_26-*
- split: epoch_27
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_27-*
- split: epoch_28
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_28-*
- split: epoch_29
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_29-*
---
|
autoevaluate/autoeval-eval-phpthinh__exampletx-constructive-7f6ba0-1708559812 | ---
type: predictions
tags:
- autotrain
- evaluation
datasets:
- phpthinh/exampletx
eval_info:
task: text_zero_shot_classification
model: bigscience/bloom-560m
metrics: []
dataset_name: phpthinh/exampletx
dataset_config: constructive
dataset_split: test
col_mapping:
text: text
classes: classes
target: target
---
# Dataset Card for AutoTrain Evaluator
This repository contains model predictions generated by [AutoTrain](https://huggingface.co/autotrain) for the following task and dataset:
* Task: Zero-Shot Text Classification
* Model: bigscience/bloom-560m
* Dataset: phpthinh/exampletx
* Config: constructive
* Split: test
To run new evaluation jobs, visit Hugging Face's [automatic model evaluator](https://huggingface.co/spaces/autoevaluate/model-evaluator).
## Contributions
Thanks to [@phpthinh](https://huggingface.co/phpthinh) for evaluating this model. |
euswam/swam | ---
license: artistic-2.0
---
|