datasetId | card |
|---|---|
CVasNLPExperiments/FGVC_Aircraft_test_google_flan_t5_xxl_mode_C_A_T_ns_3333 | ---
dataset_info:
  features:
  - name: id
    dtype: int64
  - name: prompt
    dtype: string
  - name: true_label
    dtype: string
  - name: prediction
    dtype: string
  splits:
  - name: fewshot_0_clip_tags_ViT_L_14_LLM_Description_gpt3_downstream_tasks_visual_genome_ViT_L_14_clip_tags_ViT_L_14_simple_specific_rices
    num_bytes: 1096294
    num_examples: 3333
  - name: fewshot_1_clip_tags_ViT_L_14_LLM_Description_gpt3_downstream_tasks_visual_genome_ViT_L_14_clip_tags_ViT_L_14_simple_specific_rices
    num_bytes: 2101864
    num_examples: 3333
  - name: fewshot_3_clip_tags_ViT_L_14_LLM_Description_gpt3_downstream_tasks_visual_genome_ViT_L_14_clip_tags_ViT_L_14_simple_specific_rices
    num_bytes: 4112966
    num_examples: 3333
  - name: fewshot_5_clip_tags_ViT_L_14_LLM_Description_gpt3_downstream_tasks_visual_genome_ViT_L_14_clip_tags_ViT_L_14_simple_specific_rices
    num_bytes: 6122793
    num_examples: 3333
  download_size: 2520731
  dataset_size: 13433917
---
# Dataset Card for "FGVC_Aircraft_test_google_flan_t5_xxl_mode_C_A_T_ns_3333"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
open-llm-leaderboard/details_migtissera__Tess-70B-v1.6 | ---
pretty_name: Evaluation run of migtissera/Tess-70B-v1.6
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [migtissera/Tess-70B-v1.6](https://huggingface.co/migtissera/Tess-70B-v1.6) on\
\ the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of\
\ the evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can\
\ be found as a specific split in each configuration, the split being named using\
\ the timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_migtissera__Tess-70B-v1.6\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-03-23T18:22:23.602404](https://huggingface.co/datasets/open-llm-leaderboard/details_migtissera__Tess-70B-v1.6/blob/main/results_2024-03-23T18-22-23.602404.json)\
\ (note that there might be results for other tasks in the repos if successive evals\
\ didn't cover the same tasks. You can find each one in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.7463705274541453,\n\
\ \"acc_stderr\": 0.028918835509549337,\n \"acc_norm\": 0.7490715932436255,\n\
\ \"acc_norm_stderr\": 0.029482890504294132,\n \"mc1\": 0.45532435740514077,\n\
\ \"mc1_stderr\": 0.017433490102538772,\n \"mc2\": 0.6379505279993883,\n\
\ \"mc2_stderr\": 0.014969097292080345\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.674061433447099,\n \"acc_stderr\": 0.013697432466693237,\n\
\ \"acc_norm\": 0.7133105802047781,\n \"acc_norm_stderr\": 0.013214986329274767\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6953794064927306,\n\
\ \"acc_stderr\": 0.004593059367676213,\n \"acc_norm\": 0.8706432981477793,\n\
\ \"acc_norm_stderr\": 0.0033490845685472588\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.47,\n \"acc_stderr\": 0.05016135580465919,\n \
\ \"acc_norm\": 0.47,\n \"acc_norm_stderr\": 0.05016135580465919\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6814814814814815,\n\
\ \"acc_stderr\": 0.040247784019771096,\n \"acc_norm\": 0.6814814814814815,\n\
\ \"acc_norm_stderr\": 0.040247784019771096\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.8355263157894737,\n \"acc_stderr\": 0.030167533468632723,\n\
\ \"acc_norm\": 0.8355263157894737,\n \"acc_norm_stderr\": 0.030167533468632723\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.76,\n\
\ \"acc_stderr\": 0.04292346959909283,\n \"acc_norm\": 0.76,\n \
\ \"acc_norm_stderr\": 0.04292346959909283\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.7886792452830189,\n \"acc_stderr\": 0.025125766484827845,\n\
\ \"acc_norm\": 0.7886792452830189,\n \"acc_norm_stderr\": 0.025125766484827845\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.8888888888888888,\n\
\ \"acc_stderr\": 0.02628055093284808,\n \"acc_norm\": 0.8888888888888888,\n\
\ \"acc_norm_stderr\": 0.02628055093284808\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.52,\n \"acc_stderr\": 0.050211673156867795,\n \
\ \"acc_norm\": 0.52,\n \"acc_norm_stderr\": 0.050211673156867795\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"\
acc\": 0.58,\n \"acc_stderr\": 0.049604496374885836,\n \"acc_norm\"\
: 0.58,\n \"acc_norm_stderr\": 0.049604496374885836\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.47,\n \"acc_stderr\": 0.05016135580465919,\n \
\ \"acc_norm\": 0.47,\n \"acc_norm_stderr\": 0.05016135580465919\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.7052023121387283,\n\
\ \"acc_stderr\": 0.03476599607516478,\n \"acc_norm\": 0.7052023121387283,\n\
\ \"acc_norm_stderr\": 0.03476599607516478\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.46078431372549017,\n \"acc_stderr\": 0.049598599663841815,\n\
\ \"acc_norm\": 0.46078431372549017,\n \"acc_norm_stderr\": 0.049598599663841815\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.81,\n \"acc_stderr\": 0.03942772444036624,\n \"acc_norm\": 0.81,\n\
\ \"acc_norm_stderr\": 0.03942772444036624\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.7361702127659574,\n \"acc_stderr\": 0.028809989854102963,\n\
\ \"acc_norm\": 0.7361702127659574,\n \"acc_norm_stderr\": 0.028809989854102963\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.5789473684210527,\n\
\ \"acc_stderr\": 0.046446020912223177,\n \"acc_norm\": 0.5789473684210527,\n\
\ \"acc_norm_stderr\": 0.046446020912223177\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.6896551724137931,\n \"acc_stderr\": 0.03855289616378949,\n\
\ \"acc_norm\": 0.6896551724137931,\n \"acc_norm_stderr\": 0.03855289616378949\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.5476190476190477,\n \"acc_stderr\": 0.025634258115554965,\n \"\
acc_norm\": 0.5476190476190477,\n \"acc_norm_stderr\": 0.025634258115554965\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.5476190476190477,\n\
\ \"acc_stderr\": 0.044518079590553275,\n \"acc_norm\": 0.5476190476190477,\n\
\ \"acc_norm_stderr\": 0.044518079590553275\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.5,\n \"acc_stderr\": 0.050251890762960605,\n \
\ \"acc_norm\": 0.5,\n \"acc_norm_stderr\": 0.050251890762960605\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.867741935483871,\n\
\ \"acc_stderr\": 0.01927201543484647,\n \"acc_norm\": 0.867741935483871,\n\
\ \"acc_norm_stderr\": 0.01927201543484647\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.5812807881773399,\n \"acc_stderr\": 0.03471192860518468,\n\
\ \"acc_norm\": 0.5812807881773399,\n \"acc_norm_stderr\": 0.03471192860518468\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.87,\n \"acc_stderr\": 0.03379976689896309,\n \"acc_norm\"\
: 0.87,\n \"acc_norm_stderr\": 0.03379976689896309\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.8545454545454545,\n \"acc_stderr\": 0.027530196355066584,\n\
\ \"acc_norm\": 0.8545454545454545,\n \"acc_norm_stderr\": 0.027530196355066584\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.8737373737373737,\n \"acc_stderr\": 0.023664359402880232,\n \"\
acc_norm\": 0.8737373737373737,\n \"acc_norm_stderr\": 0.023664359402880232\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.9430051813471503,\n \"acc_stderr\": 0.016731085293607558,\n\
\ \"acc_norm\": 0.9430051813471503,\n \"acc_norm_stderr\": 0.016731085293607558\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.7769230769230769,\n \"acc_stderr\": 0.021107730127244,\n \
\ \"acc_norm\": 0.7769230769230769,\n \"acc_norm_stderr\": 0.021107730127244\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.37777777777777777,\n \"acc_stderr\": 0.029560707392465718,\n \
\ \"acc_norm\": 0.37777777777777777,\n \"acc_norm_stderr\": 0.029560707392465718\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.8529411764705882,\n \"acc_stderr\": 0.02300545944667394,\n \
\ \"acc_norm\": 0.8529411764705882,\n \"acc_norm_stderr\": 0.02300545944667394\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.5231788079470199,\n \"acc_stderr\": 0.04078093859163085,\n \"\
acc_norm\": 0.5231788079470199,\n \"acc_norm_stderr\": 0.04078093859163085\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.9284403669724771,\n \"acc_stderr\": 0.01105125524781546,\n \"\
acc_norm\": 0.9284403669724771,\n \"acc_norm_stderr\": 0.01105125524781546\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.6388888888888888,\n \"acc_stderr\": 0.032757734861009996,\n \"\
acc_norm\": 0.6388888888888888,\n \"acc_norm_stderr\": 0.032757734861009996\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.9019607843137255,\n \"acc_stderr\": 0.020871118455552107,\n \"\
acc_norm\": 0.9019607843137255,\n \"acc_norm_stderr\": 0.020871118455552107\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.9029535864978903,\n \"acc_stderr\": 0.01926932302564026,\n \
\ \"acc_norm\": 0.9029535864978903,\n \"acc_norm_stderr\": 0.01926932302564026\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.8295964125560538,\n\
\ \"acc_stderr\": 0.025234593447136185,\n \"acc_norm\": 0.8295964125560538,\n\
\ \"acc_norm_stderr\": 0.025234593447136185\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.8396946564885496,\n \"acc_stderr\": 0.03217829420744631,\n\
\ \"acc_norm\": 0.8396946564885496,\n \"acc_norm_stderr\": 0.03217829420744631\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.9008264462809917,\n \"acc_stderr\": 0.02728524631275896,\n \"\
acc_norm\": 0.9008264462809917,\n \"acc_norm_stderr\": 0.02728524631275896\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.8703703703703703,\n\
\ \"acc_stderr\": 0.03247224389917948,\n \"acc_norm\": 0.8703703703703703,\n\
\ \"acc_norm_stderr\": 0.03247224389917948\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.8650306748466258,\n \"acc_stderr\": 0.026845765054553855,\n\
\ \"acc_norm\": 0.8650306748466258,\n \"acc_norm_stderr\": 0.026845765054553855\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.6696428571428571,\n\
\ \"acc_stderr\": 0.04464285714285713,\n \"acc_norm\": 0.6696428571428571,\n\
\ \"acc_norm_stderr\": 0.04464285714285713\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.8737864077669902,\n \"acc_stderr\": 0.032881802788086285,\n\
\ \"acc_norm\": 0.8737864077669902,\n \"acc_norm_stderr\": 0.032881802788086285\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.9358974358974359,\n\
\ \"acc_stderr\": 0.01604626163167314,\n \"acc_norm\": 0.9358974358974359,\n\
\ \"acc_norm_stderr\": 0.01604626163167314\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.78,\n \"acc_stderr\": 0.041633319989322626,\n \
\ \"acc_norm\": 0.78,\n \"acc_norm_stderr\": 0.041633319989322626\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8991060025542784,\n\
\ \"acc_stderr\": 0.01077047201488672,\n \"acc_norm\": 0.8991060025542784,\n\
\ \"acc_norm_stderr\": 0.01077047201488672\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.8179190751445087,\n \"acc_stderr\": 0.02077676110251297,\n\
\ \"acc_norm\": 0.8179190751445087,\n \"acc_norm_stderr\": 0.02077676110251297\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.7318435754189944,\n\
\ \"acc_stderr\": 0.014816119635317008,\n \"acc_norm\": 0.7318435754189944,\n\
\ \"acc_norm_stderr\": 0.014816119635317008\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.8300653594771242,\n \"acc_stderr\": 0.02150538312123138,\n\
\ \"acc_norm\": 0.8300653594771242,\n \"acc_norm_stderr\": 0.02150538312123138\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.8231511254019293,\n\
\ \"acc_stderr\": 0.021670058885510785,\n \"acc_norm\": 0.8231511254019293,\n\
\ \"acc_norm_stderr\": 0.021670058885510785\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.8518518518518519,\n \"acc_stderr\": 0.019766459563597252,\n\
\ \"acc_norm\": 0.8518518518518519,\n \"acc_norm_stderr\": 0.019766459563597252\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.574468085106383,\n \"acc_stderr\": 0.029494827600144363,\n \
\ \"acc_norm\": 0.574468085106383,\n \"acc_norm_stderr\": 0.029494827600144363\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.5795306388526728,\n\
\ \"acc_stderr\": 0.012607654553832705,\n \"acc_norm\": 0.5795306388526728,\n\
\ \"acc_norm_stderr\": 0.012607654553832705\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.8014705882352942,\n \"acc_stderr\": 0.024231013370541087,\n\
\ \"acc_norm\": 0.8014705882352942,\n \"acc_norm_stderr\": 0.024231013370541087\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.8169934640522876,\n \"acc_stderr\": 0.015643069911273344,\n \
\ \"acc_norm\": 0.8169934640522876,\n \"acc_norm_stderr\": 0.015643069911273344\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.7272727272727273,\n\
\ \"acc_stderr\": 0.04265792110940588,\n \"acc_norm\": 0.7272727272727273,\n\
\ \"acc_norm_stderr\": 0.04265792110940588\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.8448979591836735,\n \"acc_stderr\": 0.0231747988612186,\n\
\ \"acc_norm\": 0.8448979591836735,\n \"acc_norm_stderr\": 0.0231747988612186\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.9104477611940298,\n\
\ \"acc_stderr\": 0.02019067053502791,\n \"acc_norm\": 0.9104477611940298,\n\
\ \"acc_norm_stderr\": 0.02019067053502791\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.93,\n \"acc_stderr\": 0.0256432399976243,\n \
\ \"acc_norm\": 0.93,\n \"acc_norm_stderr\": 0.0256432399976243\n },\n\
\ \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5662650602409639,\n\
\ \"acc_stderr\": 0.03858158940685516,\n \"acc_norm\": 0.5662650602409639,\n\
\ \"acc_norm_stderr\": 0.03858158940685516\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8771929824561403,\n \"acc_stderr\": 0.02517298435015577,\n\
\ \"acc_norm\": 0.8771929824561403,\n \"acc_norm_stderr\": 0.02517298435015577\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.45532435740514077,\n\
\ \"mc1_stderr\": 0.017433490102538772,\n \"mc2\": 0.6379505279993883,\n\
\ \"mc2_stderr\": 0.014969097292080345\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.8397790055248618,\n \"acc_stderr\": 0.010309209498187474\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.7081122062168309,\n \
\ \"acc_stderr\": 0.012522795894420869\n }\n}\n```"
repo_url: https://huggingface.co/migtissera/Tess-70B-v1.6
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_03_23T18_22_23.602404
path:
- '**/details_harness|arc:challenge|25_2024-03-23T18-22-23.602404.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-03-23T18-22-23.602404.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_03_23T18_22_23.602404
path:
- '**/details_harness|gsm8k|5_2024-03-23T18-22-23.602404.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-03-23T18-22-23.602404.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_03_23T18_22_23.602404
path:
- '**/details_harness|hellaswag|10_2024-03-23T18-22-23.602404.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-03-23T18-22-23.602404.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_03_23T18_22_23.602404
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-23T18-22-23.602404.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-23T18-22-23.602404.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-23T18-22-23.602404.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-23T18-22-23.602404.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-23T18-22-23.602404.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-23T18-22-23.602404.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-23T18-22-23.602404.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-23T18-22-23.602404.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-23T18-22-23.602404.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-23T18-22-23.602404.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-23T18-22-23.602404.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-23T18-22-23.602404.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-23T18-22-23.602404.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-23T18-22-23.602404.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-23T18-22-23.602404.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-23T18-22-23.602404.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-23T18-22-23.602404.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-23T18-22-23.602404.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-23T18-22-23.602404.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-23T18-22-23.602404.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-23T18-22-23.602404.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-23T18-22-23.602404.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-23T18-22-23.602404.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-23T18-22-23.602404.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-23T18-22-23.602404.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-23T18-22-23.602404.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-23T18-22-23.602404.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-23T18-22-23.602404.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-23T18-22-23.602404.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-23T18-22-23.602404.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-23T18-22-23.602404.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-23T18-22-23.602404.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-23T18-22-23.602404.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-23T18-22-23.602404.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-23T18-22-23.602404.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-23T18-22-23.602404.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-23T18-22-23.602404.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-23T18-22-23.602404.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-23T18-22-23.602404.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-23T18-22-23.602404.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-23T18-22-23.602404.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-23T18-22-23.602404.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-23T18-22-23.602404.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-23T18-22-23.602404.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-23T18-22-23.602404.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-23T18-22-23.602404.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-23T18-22-23.602404.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-23T18-22-23.602404.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-23T18-22-23.602404.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-23T18-22-23.602404.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-23T18-22-23.602404.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-23T18-22-23.602404.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-23T18-22-23.602404.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-23T18-22-23.602404.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-23T18-22-23.602404.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-23T18-22-23.602404.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-23T18-22-23.602404.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-23T18-22-23.602404.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-23T18-22-23.602404.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-23T18-22-23.602404.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-23T18-22-23.602404.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-23T18-22-23.602404.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-23T18-22-23.602404.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-23T18-22-23.602404.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-23T18-22-23.602404.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-23T18-22-23.602404.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-23T18-22-23.602404.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-23T18-22-23.602404.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-23T18-22-23.602404.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-23T18-22-23.602404.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-23T18-22-23.602404.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-23T18-22-23.602404.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-23T18-22-23.602404.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-23T18-22-23.602404.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-23T18-22-23.602404.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-23T18-22-23.602404.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-23T18-22-23.602404.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-23T18-22-23.602404.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-23T18-22-23.602404.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-23T18-22-23.602404.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-23T18-22-23.602404.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-23T18-22-23.602404.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-23T18-22-23.602404.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-23T18-22-23.602404.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-23T18-22-23.602404.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-23T18-22-23.602404.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-23T18-22-23.602404.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-23T18-22-23.602404.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-23T18-22-23.602404.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-23T18-22-23.602404.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-23T18-22-23.602404.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-23T18-22-23.602404.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-23T18-22-23.602404.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-23T18-22-23.602404.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-23T18-22-23.602404.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-23T18-22-23.602404.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-23T18-22-23.602404.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-23T18-22-23.602404.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-23T18-22-23.602404.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-23T18-22-23.602404.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-23T18-22-23.602404.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-23T18-22-23.602404.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-23T18-22-23.602404.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-23T18-22-23.602404.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-23T18-22-23.602404.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-23T18-22-23.602404.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-23T18-22-23.602404.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-23T18-22-23.602404.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-23T18-22-23.602404.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-23T18-22-23.602404.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-23T18-22-23.602404.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-23T18-22-23.602404.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-23T18-22-23.602404.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-23T18-22-23.602404.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_03_23T18_22_23.602404
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-23T18-22-23.602404.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-23T18-22-23.602404.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_03_23T18_22_23.602404
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-23T18-22-23.602404.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-23T18-22-23.602404.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_03_23T18_22_23.602404
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-23T18-22-23.602404.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-23T18-22-23.602404.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_03_23T18_22_23.602404
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-23T18-22-23.602404.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-23T18-22-23.602404.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_03_23T18_22_23.602404
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-23T18-22-23.602404.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-23T18-22-23.602404.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_03_23T18_22_23.602404
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-23T18-22-23.602404.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-23T18-22-23.602404.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_03_23T18_22_23.602404
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-23T18-22-23.602404.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-23T18-22-23.602404.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_03_23T18_22_23.602404
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-23T18-22-23.602404.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-23T18-22-23.602404.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_03_23T18_22_23.602404
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-23T18-22-23.602404.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-23T18-22-23.602404.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_03_23T18_22_23.602404
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-23T18-22-23.602404.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-23T18-22-23.602404.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_03_23T18_22_23.602404
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-23T18-22-23.602404.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-23T18-22-23.602404.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_03_23T18_22_23.602404
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-23T18-22-23.602404.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-23T18-22-23.602404.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_03_23T18_22_23.602404
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-23T18-22-23.602404.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-23T18-22-23.602404.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_03_23T18_22_23.602404
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-23T18-22-23.602404.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-23T18-22-23.602404.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_03_23T18_22_23.602404
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-23T18-22-23.602404.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-23T18-22-23.602404.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_03_23T18_22_23.602404
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-23T18-22-23.602404.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-23T18-22-23.602404.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_03_23T18_22_23.602404
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-23T18-22-23.602404.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-23T18-22-23.602404.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_03_23T18_22_23.602404
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-23T18-22-23.602404.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-23T18-22-23.602404.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_03_23T18_22_23.602404
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-23T18-22-23.602404.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-23T18-22-23.602404.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_03_23T18_22_23.602404
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-23T18-22-23.602404.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-23T18-22-23.602404.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_03_23T18_22_23.602404
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-23T18-22-23.602404.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-23T18-22-23.602404.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_03_23T18_22_23.602404
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-23T18-22-23.602404.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-23T18-22-23.602404.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_03_23T18_22_23.602404
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-23T18-22-23.602404.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-23T18-22-23.602404.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_03_23T18_22_23.602404
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-23T18-22-23.602404.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-23T18-22-23.602404.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_03_23T18_22_23.602404
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-23T18-22-23.602404.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-23T18-22-23.602404.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_03_23T18_22_23.602404
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-23T18-22-23.602404.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-23T18-22-23.602404.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_03_23T18_22_23.602404
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-23T18-22-23.602404.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-23T18-22-23.602404.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_03_23T18_22_23.602404
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-23T18-22-23.602404.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-23T18-22-23.602404.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_03_23T18_22_23.602404
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-23T18-22-23.602404.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-23T18-22-23.602404.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_03_23T18_22_23.602404
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-23T18-22-23.602404.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-23T18-22-23.602404.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_03_23T18_22_23.602404
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-23T18-22-23.602404.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-23T18-22-23.602404.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_03_23T18_22_23.602404
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-23T18-22-23.602404.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-23T18-22-23.602404.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_03_23T18_22_23.602404
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-23T18-22-23.602404.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-23T18-22-23.602404.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_03_23T18_22_23.602404
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-23T18-22-23.602404.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-23T18-22-23.602404.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_03_23T18_22_23.602404
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-23T18-22-23.602404.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-23T18-22-23.602404.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_03_23T18_22_23.602404
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-23T18-22-23.602404.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-23T18-22-23.602404.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_03_23T18_22_23.602404
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-23T18-22-23.602404.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-23T18-22-23.602404.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_03_23T18_22_23.602404
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-23T18-22-23.602404.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-23T18-22-23.602404.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_03_23T18_22_23.602404
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-23T18-22-23.602404.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-23T18-22-23.602404.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_03_23T18_22_23.602404
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-23T18-22-23.602404.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-23T18-22-23.602404.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_03_23T18_22_23.602404
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-23T18-22-23.602404.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-23T18-22-23.602404.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_03_23T18_22_23.602404
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-23T18-22-23.602404.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-23T18-22-23.602404.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_03_23T18_22_23.602404
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-23T18-22-23.602404.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-23T18-22-23.602404.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_03_23T18_22_23.602404
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-23T18-22-23.602404.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-23T18-22-23.602404.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_03_23T18_22_23.602404
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-23T18-22-23.602404.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-23T18-22-23.602404.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_03_23T18_22_23.602404
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-23T18-22-23.602404.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-23T18-22-23.602404.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_03_23T18_22_23.602404
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-23T18-22-23.602404.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-23T18-22-23.602404.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_03_23T18_22_23.602404
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-23T18-22-23.602404.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-23T18-22-23.602404.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_03_23T18_22_23.602404
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-23T18-22-23.602404.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-23T18-22-23.602404.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_03_23T18_22_23.602404
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-23T18-22-23.602404.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-23T18-22-23.602404.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_03_23T18_22_23.602404
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-23T18-22-23.602404.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-23T18-22-23.602404.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_03_23T18_22_23.602404
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-23T18-22-23.602404.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-23T18-22-23.602404.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_03_23T18_22_23.602404
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-23T18-22-23.602404.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-23T18-22-23.602404.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_03_23T18_22_23.602404
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-23T18-22-23.602404.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-23T18-22-23.602404.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_03_23T18_22_23.602404
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-23T18-22-23.602404.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-23T18-22-23.602404.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_03_23T18_22_23.602404
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-23T18-22-23.602404.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-23T18-22-23.602404.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_03_23T18_22_23.602404
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-23T18-22-23.602404.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-23T18-22-23.602404.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_03_23T18_22_23.602404
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-23T18-22-23.602404.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-23T18-22-23.602404.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_03_23T18_22_23.602404
path:
- '**/details_harness|winogrande|5_2024-03-23T18-22-23.602404.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-03-23T18-22-23.602404.parquet'
- config_name: results
data_files:
- split: 2024_03_23T18_22_23.602404
path:
- results_2024-03-23T18-22-23.602404.parquet
- split: latest
path:
- results_2024-03-23T18-22-23.602404.parquet
---
# Dataset Card for Evaluation run of migtissera/Tess-70B-v1.6
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [migtissera/Tess-70B-v1.6](https://huggingface.co/migtissera/Tess-70B-v1.6) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_migtissera__Tess-70B-v1.6",
"harness_winogrande_5",
split="train")
```
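The aggregated results load the same way; a minimal sketch (the `results` config name and the `latest` split are both listed in the YAML header above, and any other config name from that list can be substituted):
```python
from datasets import load_dataset

# Sketch: the "results" config holds the aggregated metrics; the "latest"
# split always points to the most recent run (here 2024-03-23T18:22:23.602404).
results = load_dataset("open-llm-leaderboard/details_migtissera__Tess-70B-v1.6",
                       "results",
                       split="latest")
```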
## Latest results
These are the [latest results from run 2024-03-23T18:22:23.602404](https://huggingface.co/datasets/open-llm-leaderboard/details_migtissera__Tess-70B-v1.6/blob/main/results_2024-03-23T18-22-23.602404.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each one in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.7463705274541453,
"acc_stderr": 0.028918835509549337,
"acc_norm": 0.7490715932436255,
"acc_norm_stderr": 0.029482890504294132,
"mc1": 0.45532435740514077,
"mc1_stderr": 0.017433490102538772,
"mc2": 0.6379505279993883,
"mc2_stderr": 0.014969097292080345
},
"harness|arc:challenge|25": {
"acc": 0.674061433447099,
"acc_stderr": 0.013697432466693237,
"acc_norm": 0.7133105802047781,
"acc_norm_stderr": 0.013214986329274767
},
"harness|hellaswag|10": {
"acc": 0.6953794064927306,
"acc_stderr": 0.004593059367676213,
"acc_norm": 0.8706432981477793,
"acc_norm_stderr": 0.0033490845685472588
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.47,
"acc_stderr": 0.05016135580465919,
"acc_norm": 0.47,
"acc_norm_stderr": 0.05016135580465919
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6814814814814815,
"acc_stderr": 0.040247784019771096,
"acc_norm": 0.6814814814814815,
"acc_norm_stderr": 0.040247784019771096
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.8355263157894737,
"acc_stderr": 0.030167533468632723,
"acc_norm": 0.8355263157894737,
"acc_norm_stderr": 0.030167533468632723
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.76,
"acc_stderr": 0.04292346959909283,
"acc_norm": 0.76,
"acc_norm_stderr": 0.04292346959909283
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.7886792452830189,
"acc_stderr": 0.025125766484827845,
"acc_norm": 0.7886792452830189,
"acc_norm_stderr": 0.025125766484827845
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.8888888888888888,
"acc_stderr": 0.02628055093284808,
"acc_norm": 0.8888888888888888,
"acc_norm_stderr": 0.02628055093284808
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.52,
"acc_stderr": 0.050211673156867795,
"acc_norm": 0.52,
"acc_norm_stderr": 0.050211673156867795
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.58,
"acc_stderr": 0.049604496374885836,
"acc_norm": 0.58,
"acc_norm_stderr": 0.049604496374885836
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.47,
"acc_stderr": 0.05016135580465919,
"acc_norm": 0.47,
"acc_norm_stderr": 0.05016135580465919
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.7052023121387283,
"acc_stderr": 0.03476599607516478,
"acc_norm": 0.7052023121387283,
"acc_norm_stderr": 0.03476599607516478
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.46078431372549017,
"acc_stderr": 0.049598599663841815,
"acc_norm": 0.46078431372549017,
"acc_norm_stderr": 0.049598599663841815
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.81,
"acc_stderr": 0.03942772444036624,
"acc_norm": 0.81,
"acc_norm_stderr": 0.03942772444036624
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.7361702127659574,
"acc_stderr": 0.028809989854102963,
"acc_norm": 0.7361702127659574,
"acc_norm_stderr": 0.028809989854102963
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.5789473684210527,
"acc_stderr": 0.046446020912223177,
"acc_norm": 0.5789473684210527,
"acc_norm_stderr": 0.046446020912223177
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.6896551724137931,
"acc_stderr": 0.03855289616378949,
"acc_norm": 0.6896551724137931,
"acc_norm_stderr": 0.03855289616378949
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.5476190476190477,
"acc_stderr": 0.025634258115554965,
"acc_norm": 0.5476190476190477,
"acc_norm_stderr": 0.025634258115554965
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.5476190476190477,
"acc_stderr": 0.044518079590553275,
"acc_norm": 0.5476190476190477,
"acc_norm_stderr": 0.044518079590553275
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.5,
"acc_stderr": 0.050251890762960605,
"acc_norm": 0.5,
"acc_norm_stderr": 0.050251890762960605
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.867741935483871,
"acc_stderr": 0.01927201543484647,
"acc_norm": 0.867741935483871,
"acc_norm_stderr": 0.01927201543484647
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5812807881773399,
"acc_stderr": 0.03471192860518468,
"acc_norm": 0.5812807881773399,
"acc_norm_stderr": 0.03471192860518468
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.87,
"acc_stderr": 0.03379976689896309,
"acc_norm": 0.87,
"acc_norm_stderr": 0.03379976689896309
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.8545454545454545,
"acc_stderr": 0.027530196355066584,
"acc_norm": 0.8545454545454545,
"acc_norm_stderr": 0.027530196355066584
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.8737373737373737,
"acc_stderr": 0.023664359402880232,
"acc_norm": 0.8737373737373737,
"acc_norm_stderr": 0.023664359402880232
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.9430051813471503,
"acc_stderr": 0.016731085293607558,
"acc_norm": 0.9430051813471503,
"acc_norm_stderr": 0.016731085293607558
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.7769230769230769,
"acc_stderr": 0.021107730127244,
"acc_norm": 0.7769230769230769,
"acc_norm_stderr": 0.021107730127244
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.37777777777777777,
"acc_stderr": 0.029560707392465718,
"acc_norm": 0.37777777777777777,
"acc_norm_stderr": 0.029560707392465718
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.8529411764705882,
"acc_stderr": 0.02300545944667394,
"acc_norm": 0.8529411764705882,
"acc_norm_stderr": 0.02300545944667394
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.5231788079470199,
"acc_stderr": 0.04078093859163085,
"acc_norm": 0.5231788079470199,
"acc_norm_stderr": 0.04078093859163085
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.9284403669724771,
"acc_stderr": 0.01105125524781546,
"acc_norm": 0.9284403669724771,
"acc_norm_stderr": 0.01105125524781546
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.6388888888888888,
"acc_stderr": 0.032757734861009996,
"acc_norm": 0.6388888888888888,
"acc_norm_stderr": 0.032757734861009996
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.9019607843137255,
"acc_stderr": 0.020871118455552107,
"acc_norm": 0.9019607843137255,
"acc_norm_stderr": 0.020871118455552107
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.9029535864978903,
"acc_stderr": 0.01926932302564026,
"acc_norm": 0.9029535864978903,
"acc_norm_stderr": 0.01926932302564026
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.8295964125560538,
"acc_stderr": 0.025234593447136185,
"acc_norm": 0.8295964125560538,
"acc_norm_stderr": 0.025234593447136185
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.8396946564885496,
"acc_stderr": 0.03217829420744631,
"acc_norm": 0.8396946564885496,
"acc_norm_stderr": 0.03217829420744631
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.9008264462809917,
"acc_stderr": 0.02728524631275896,
"acc_norm": 0.9008264462809917,
"acc_norm_stderr": 0.02728524631275896
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.8703703703703703,
"acc_stderr": 0.03247224389917948,
"acc_norm": 0.8703703703703703,
"acc_norm_stderr": 0.03247224389917948
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.8650306748466258,
"acc_stderr": 0.026845765054553855,
"acc_norm": 0.8650306748466258,
"acc_norm_stderr": 0.026845765054553855
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.6696428571428571,
"acc_stderr": 0.04464285714285713,
"acc_norm": 0.6696428571428571,
"acc_norm_stderr": 0.04464285714285713
},
"harness|hendrycksTest-management|5": {
"acc": 0.8737864077669902,
"acc_stderr": 0.032881802788086285,
"acc_norm": 0.8737864077669902,
"acc_norm_stderr": 0.032881802788086285
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.9358974358974359,
"acc_stderr": 0.01604626163167314,
"acc_norm": 0.9358974358974359,
"acc_norm_stderr": 0.01604626163167314
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.78,
"acc_stderr": 0.041633319989322626,
"acc_norm": 0.78,
"acc_norm_stderr": 0.041633319989322626
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8991060025542784,
"acc_stderr": 0.01077047201488672,
"acc_norm": 0.8991060025542784,
"acc_norm_stderr": 0.01077047201488672
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.8179190751445087,
"acc_stderr": 0.02077676110251297,
"acc_norm": 0.8179190751445087,
"acc_norm_stderr": 0.02077676110251297
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.7318435754189944,
"acc_stderr": 0.014816119635317008,
"acc_norm": 0.7318435754189944,
"acc_norm_stderr": 0.014816119635317008
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.8300653594771242,
"acc_stderr": 0.02150538312123138,
"acc_norm": 0.8300653594771242,
"acc_norm_stderr": 0.02150538312123138
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.8231511254019293,
"acc_stderr": 0.021670058885510785,
"acc_norm": 0.8231511254019293,
"acc_norm_stderr": 0.021670058885510785
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.8518518518518519,
"acc_stderr": 0.019766459563597252,
"acc_norm": 0.8518518518518519,
"acc_norm_stderr": 0.019766459563597252
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.574468085106383,
"acc_stderr": 0.029494827600144363,
"acc_norm": 0.574468085106383,
"acc_norm_stderr": 0.029494827600144363
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.5795306388526728,
"acc_stderr": 0.012607654553832705,
"acc_norm": 0.5795306388526728,
"acc_norm_stderr": 0.012607654553832705
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.8014705882352942,
"acc_stderr": 0.024231013370541087,
"acc_norm": 0.8014705882352942,
"acc_norm_stderr": 0.024231013370541087
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.8169934640522876,
"acc_stderr": 0.015643069911273344,
"acc_norm": 0.8169934640522876,
"acc_norm_stderr": 0.015643069911273344
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.7272727272727273,
"acc_stderr": 0.04265792110940588,
"acc_norm": 0.7272727272727273,
"acc_norm_stderr": 0.04265792110940588
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.8448979591836735,
"acc_stderr": 0.0231747988612186,
"acc_norm": 0.8448979591836735,
"acc_norm_stderr": 0.0231747988612186
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.9104477611940298,
"acc_stderr": 0.02019067053502791,
"acc_norm": 0.9104477611940298,
"acc_norm_stderr": 0.02019067053502791
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.93,
"acc_stderr": 0.0256432399976243,
"acc_norm": 0.93,
"acc_norm_stderr": 0.0256432399976243
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5662650602409639,
"acc_stderr": 0.03858158940685516,
"acc_norm": 0.5662650602409639,
"acc_norm_stderr": 0.03858158940685516
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8771929824561403,
"acc_stderr": 0.02517298435015577,
"acc_norm": 0.8771929824561403,
"acc_norm_stderr": 0.02517298435015577
},
"harness|truthfulqa:mc|0": {
"mc1": 0.45532435740514077,
"mc1_stderr": 0.017433490102538772,
"mc2": 0.6379505279993883,
"mc2_stderr": 0.014969097292080345
},
"harness|winogrande|5": {
"acc": 0.8397790055248618,
"acc_stderr": 0.010309209498187474
},
"harness|gsm8k|5": {
"acc": 0.7081122062168309,
"acc_stderr": 0.012522795894420869
}
}
```
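If you would rather work with the raw JSON printed above than with the parquet splits, here is a minimal sketch for pulling out individual scores. It assumes the dict above has been saved locally as `results.json`; the file name is illustrative and not part of this card:
```python
import json

# Load the results dict printed above (saved locally as results.json,
# an assumed file name for this sketch).
with open("results.json") as f:
    results = json.load(f)

# Per-task entries follow the "harness|<task>|<n_shot>" naming shown above.
print(results["harness|gsm8k|5"]["acc"])        # 0.7081...
print(results["harness|winogrande|5"]["acc"])   # 0.8397...
print(results["all"]["acc_norm"])               # aggregate normalized accuracy
```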
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases, and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
open-llm-leaderboard/details_Radiantloom__radintloom-mistral-7b-fusion-dpo | ---
pretty_name: Evaluation run of Radiantloom/radintloom-mistral-7b-fusion-dpo
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [Radiantloom/radintloom-mistral-7b-fusion-dpo](https://huggingface.co/Radiantloom/radintloom-mistral-7b-fusion-dpo)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Radiantloom__radintloom-mistral-7b-fusion-dpo\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-02-20T15:46:38.260754](https://huggingface.co/datasets/open-llm-leaderboard/details_Radiantloom__radintloom-mistral-7b-fusion-dpo/blob/main/results_2024-02-20T15-46-38.260754.json) (note\
\ that there might be results for other tasks in the repo if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6257828467992991,\n\
\ \"acc_stderr\": 0.032145124565345747,\n \"acc_norm\": 0.6375843550222254,\n\
\ \"acc_norm_stderr\": 0.032991322495938086,\n \"mc1\": 0.3463892288861689,\n\
\ \"mc1_stderr\": 0.01665699710912514,\n \"mc2\": 0.5113860397966288,\n\
\ \"mc2_stderr\": 0.015291606116990751\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.5836177474402731,\n \"acc_stderr\": 0.014405618279436176,\n\
\ \"acc_norm\": 0.6313993174061433,\n \"acc_norm_stderr\": 0.014097810678042201\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6365265883290181,\n\
\ \"acc_stderr\": 0.004800164434233245,\n \"acc_norm\": 0.8367855008962358,\n\
\ \"acc_norm_stderr\": 0.0036880598312390225\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.3,\n \"acc_stderr\": 0.04605661864718381,\n \
\ \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.04605661864718381\n },\n\
\ \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.5777777777777777,\n\
\ \"acc_stderr\": 0.04266763404099582,\n \"acc_norm\": 0.5777777777777777,\n\
\ \"acc_norm_stderr\": 0.04266763404099582\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.6578947368421053,\n \"acc_stderr\": 0.03860731599316092,\n\
\ \"acc_norm\": 0.6578947368421053,\n \"acc_norm_stderr\": 0.03860731599316092\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.63,\n\
\ \"acc_stderr\": 0.048523658709391,\n \"acc_norm\": 0.63,\n \
\ \"acc_norm_stderr\": 0.048523658709391\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.7169811320754716,\n \"acc_stderr\": 0.027724236492700918,\n\
\ \"acc_norm\": 0.7169811320754716,\n \"acc_norm_stderr\": 0.027724236492700918\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7291666666666666,\n\
\ \"acc_stderr\": 0.03716177437566017,\n \"acc_norm\": 0.7291666666666666,\n\
\ \"acc_norm_stderr\": 0.03716177437566017\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.48,\n \"acc_stderr\": 0.050211673156867795,\n \
\ \"acc_norm\": 0.48,\n \"acc_norm_stderr\": 0.050211673156867795\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"\
acc\": 0.52,\n \"acc_stderr\": 0.050211673156867795,\n \"acc_norm\"\
: 0.52,\n \"acc_norm_stderr\": 0.050211673156867795\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.29,\n \"acc_stderr\": 0.04560480215720683,\n \
\ \"acc_norm\": 0.29,\n \"acc_norm_stderr\": 0.04560480215720683\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6358381502890174,\n\
\ \"acc_stderr\": 0.03669072477416906,\n \"acc_norm\": 0.6358381502890174,\n\
\ \"acc_norm_stderr\": 0.03669072477416906\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.3627450980392157,\n \"acc_stderr\": 0.047840607041056527,\n\
\ \"acc_norm\": 0.3627450980392157,\n \"acc_norm_stderr\": 0.047840607041056527\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.79,\n \"acc_stderr\": 0.04093601807403326,\n \"acc_norm\": 0.79,\n\
\ \"acc_norm_stderr\": 0.04093601807403326\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.5617021276595745,\n \"acc_stderr\": 0.03243618636108101,\n\
\ \"acc_norm\": 0.5617021276595745,\n \"acc_norm_stderr\": 0.03243618636108101\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.4824561403508772,\n\
\ \"acc_stderr\": 0.04700708033551038,\n \"acc_norm\": 0.4824561403508772,\n\
\ \"acc_norm_stderr\": 0.04700708033551038\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5103448275862069,\n \"acc_stderr\": 0.04165774775728763,\n\
\ \"acc_norm\": 0.5103448275862069,\n \"acc_norm_stderr\": 0.04165774775728763\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.43386243386243384,\n \"acc_stderr\": 0.025525034382474894,\n \"\
acc_norm\": 0.43386243386243384,\n \"acc_norm_stderr\": 0.025525034382474894\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.4523809523809524,\n\
\ \"acc_stderr\": 0.044518079590553275,\n \"acc_norm\": 0.4523809523809524,\n\
\ \"acc_norm_stderr\": 0.044518079590553275\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.37,\n \"acc_stderr\": 0.048523658709391,\n \
\ \"acc_norm\": 0.37,\n \"acc_norm_stderr\": 0.048523658709391\n },\n\
\ \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7935483870967742,\n\
\ \"acc_stderr\": 0.02302589961718871,\n \"acc_norm\": 0.7935483870967742,\n\
\ \"acc_norm_stderr\": 0.02302589961718871\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.5024630541871922,\n \"acc_stderr\": 0.035179450386910616,\n\
\ \"acc_norm\": 0.5024630541871922,\n \"acc_norm_stderr\": 0.035179450386910616\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.71,\n \"acc_stderr\": 0.045604802157206845,\n \"acc_norm\"\
: 0.71,\n \"acc_norm_stderr\": 0.045604802157206845\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7878787878787878,\n \"acc_stderr\": 0.031922715695483016,\n\
\ \"acc_norm\": 0.7878787878787878,\n \"acc_norm_stderr\": 0.031922715695483016\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.797979797979798,\n \"acc_stderr\": 0.028606204289229872,\n \"\
acc_norm\": 0.797979797979798,\n \"acc_norm_stderr\": 0.028606204289229872\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.8963730569948186,\n \"acc_stderr\": 0.02199531196364424,\n\
\ \"acc_norm\": 0.8963730569948186,\n \"acc_norm_stderr\": 0.02199531196364424\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.658974358974359,\n \"acc_stderr\": 0.02403548967633508,\n \
\ \"acc_norm\": 0.658974358974359,\n \"acc_norm_stderr\": 0.02403548967633508\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.3296296296296296,\n \"acc_stderr\": 0.02866120111652457,\n \
\ \"acc_norm\": 0.3296296296296296,\n \"acc_norm_stderr\": 0.02866120111652457\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.6932773109243697,\n \"acc_stderr\": 0.02995382389188703,\n \
\ \"acc_norm\": 0.6932773109243697,\n \"acc_norm_stderr\": 0.02995382389188703\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.36423841059602646,\n \"acc_stderr\": 0.03929111781242742,\n \"\
acc_norm\": 0.36423841059602646,\n \"acc_norm_stderr\": 0.03929111781242742\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8403669724770643,\n \"acc_stderr\": 0.015703498348461773,\n \"\
acc_norm\": 0.8403669724770643,\n \"acc_norm_stderr\": 0.015703498348461773\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.5509259259259259,\n \"acc_stderr\": 0.03392238405321617,\n \"\
acc_norm\": 0.5509259259259259,\n \"acc_norm_stderr\": 0.03392238405321617\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.8235294117647058,\n \"acc_stderr\": 0.026756401538078962,\n \"\
acc_norm\": 0.8235294117647058,\n \"acc_norm_stderr\": 0.026756401538078962\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.7763713080168776,\n \"acc_stderr\": 0.027123298205229966,\n \
\ \"acc_norm\": 0.7763713080168776,\n \"acc_norm_stderr\": 0.027123298205229966\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6771300448430493,\n\
\ \"acc_stderr\": 0.03138147637575499,\n \"acc_norm\": 0.6771300448430493,\n\
\ \"acc_norm_stderr\": 0.03138147637575499\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.7862595419847328,\n \"acc_stderr\": 0.0359546161177469,\n\
\ \"acc_norm\": 0.7862595419847328,\n \"acc_norm_stderr\": 0.0359546161177469\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.7768595041322314,\n \"acc_stderr\": 0.03800754475228732,\n \"\
acc_norm\": 0.7768595041322314,\n \"acc_norm_stderr\": 0.03800754475228732\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7592592592592593,\n\
\ \"acc_stderr\": 0.0413311944024384,\n \"acc_norm\": 0.7592592592592593,\n\
\ \"acc_norm_stderr\": 0.0413311944024384\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7177914110429447,\n \"acc_stderr\": 0.03536117886664742,\n\
\ \"acc_norm\": 0.7177914110429447,\n \"acc_norm_stderr\": 0.03536117886664742\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.45535714285714285,\n\
\ \"acc_stderr\": 0.047268355537191,\n \"acc_norm\": 0.45535714285714285,\n\
\ \"acc_norm_stderr\": 0.047268355537191\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.8058252427184466,\n \"acc_stderr\": 0.03916667762822584,\n\
\ \"acc_norm\": 0.8058252427184466,\n \"acc_norm_stderr\": 0.03916667762822584\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8675213675213675,\n\
\ \"acc_stderr\": 0.02220930907316562,\n \"acc_norm\": 0.8675213675213675,\n\
\ \"acc_norm_stderr\": 0.02220930907316562\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.67,\n \"acc_stderr\": 0.04725815626252609,\n \
\ \"acc_norm\": 0.67,\n \"acc_norm_stderr\": 0.04725815626252609\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8122605363984674,\n\
\ \"acc_stderr\": 0.01396439376989914,\n \"acc_norm\": 0.8122605363984674,\n\
\ \"acc_norm_stderr\": 0.01396439376989914\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.7196531791907514,\n \"acc_stderr\": 0.02418242749657761,\n\
\ \"acc_norm\": 0.7196531791907514,\n \"acc_norm_stderr\": 0.02418242749657761\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.3217877094972067,\n\
\ \"acc_stderr\": 0.015624236160792573,\n \"acc_norm\": 0.3217877094972067,\n\
\ \"acc_norm_stderr\": 0.015624236160792573\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.696078431372549,\n \"acc_stderr\": 0.02633661346904663,\n\
\ \"acc_norm\": 0.696078431372549,\n \"acc_norm_stderr\": 0.02633661346904663\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.707395498392283,\n\
\ \"acc_stderr\": 0.02583989833487798,\n \"acc_norm\": 0.707395498392283,\n\
\ \"acc_norm_stderr\": 0.02583989833487798\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.7160493827160493,\n \"acc_stderr\": 0.025089478523765137,\n\
\ \"acc_norm\": 0.7160493827160493,\n \"acc_norm_stderr\": 0.025089478523765137\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.46808510638297873,\n \"acc_stderr\": 0.029766675075873866,\n \
\ \"acc_norm\": 0.46808510638297873,\n \"acc_norm_stderr\": 0.029766675075873866\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4758800521512386,\n\
\ \"acc_stderr\": 0.012755368722863931,\n \"acc_norm\": 0.4758800521512386,\n\
\ \"acc_norm_stderr\": 0.012755368722863931\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.6691176470588235,\n \"acc_stderr\": 0.02858270975389845,\n\
\ \"acc_norm\": 0.6691176470588235,\n \"acc_norm_stderr\": 0.02858270975389845\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6486928104575164,\n \"acc_stderr\": 0.019312676065786558,\n \
\ \"acc_norm\": 0.6486928104575164,\n \"acc_norm_stderr\": 0.019312676065786558\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6272727272727273,\n\
\ \"acc_stderr\": 0.04631381319425465,\n \"acc_norm\": 0.6272727272727273,\n\
\ \"acc_norm_stderr\": 0.04631381319425465\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.7387755102040816,\n \"acc_stderr\": 0.028123429335142773,\n\
\ \"acc_norm\": 0.7387755102040816,\n \"acc_norm_stderr\": 0.028123429335142773\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8407960199004975,\n\
\ \"acc_stderr\": 0.02587064676616913,\n \"acc_norm\": 0.8407960199004975,\n\
\ \"acc_norm_stderr\": 0.02587064676616913\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.84,\n \"acc_stderr\": 0.03684529491774709,\n \
\ \"acc_norm\": 0.84,\n \"acc_norm_stderr\": 0.03684529491774709\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5060240963855421,\n\
\ \"acc_stderr\": 0.03892212195333045,\n \"acc_norm\": 0.5060240963855421,\n\
\ \"acc_norm_stderr\": 0.03892212195333045\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8187134502923976,\n \"acc_stderr\": 0.02954774168764004,\n\
\ \"acc_norm\": 0.8187134502923976,\n \"acc_norm_stderr\": 0.02954774168764004\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.3463892288861689,\n\
\ \"mc1_stderr\": 0.01665699710912514,\n \"mc2\": 0.5113860397966288,\n\
\ \"mc2_stderr\": 0.015291606116990751\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7995264404104183,\n \"acc_stderr\": 0.011251958281205086\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.0037907505686125853,\n \
\ \"acc_stderr\": 0.0016927007401501776\n }\n}\n```"
repo_url: https://huggingface.co/Radiantloom/radintloom-mistral-7b-fusion-dpo
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_02_20T15_46_38.260754
path:
- '**/details_harness|arc:challenge|25_2024-02-20T15-46-38.260754.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-02-20T15-46-38.260754.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_02_20T15_46_38.260754
path:
- '**/details_harness|gsm8k|5_2024-02-20T15-46-38.260754.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-02-20T15-46-38.260754.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_02_20T15_46_38.260754
path:
- '**/details_harness|hellaswag|10_2024-02-20T15-46-38.260754.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-02-20T15-46-38.260754.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_02_20T15_46_38.260754
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-20T15-46-38.260754.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-20T15-46-38.260754.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-20T15-46-38.260754.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-20T15-46-38.260754.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-20T15-46-38.260754.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-20T15-46-38.260754.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-20T15-46-38.260754.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-20T15-46-38.260754.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-20T15-46-38.260754.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-20T15-46-38.260754.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-20T15-46-38.260754.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-20T15-46-38.260754.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-20T15-46-38.260754.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-20T15-46-38.260754.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-20T15-46-38.260754.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-20T15-46-38.260754.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-20T15-46-38.260754.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-20T15-46-38.260754.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-20T15-46-38.260754.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-20T15-46-38.260754.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-20T15-46-38.260754.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-20T15-46-38.260754.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-20T15-46-38.260754.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-20T15-46-38.260754.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-20T15-46-38.260754.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-20T15-46-38.260754.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-20T15-46-38.260754.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-20T15-46-38.260754.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-20T15-46-38.260754.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-20T15-46-38.260754.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-20T15-46-38.260754.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-20T15-46-38.260754.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-20T15-46-38.260754.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-20T15-46-38.260754.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-02-20T15-46-38.260754.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-20T15-46-38.260754.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-20T15-46-38.260754.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-20T15-46-38.260754.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-02-20T15-46-38.260754.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-02-20T15-46-38.260754.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-20T15-46-38.260754.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-20T15-46-38.260754.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-20T15-46-38.260754.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-20T15-46-38.260754.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-20T15-46-38.260754.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-20T15-46-38.260754.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-20T15-46-38.260754.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-20T15-46-38.260754.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-20T15-46-38.260754.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-20T15-46-38.260754.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-20T15-46-38.260754.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-20T15-46-38.260754.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-20T15-46-38.260754.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-02-20T15-46-38.260754.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-20T15-46-38.260754.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-02-20T15-46-38.260754.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-20T15-46-38.260754.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-20T15-46-38.260754.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-20T15-46-38.260754.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-20T15-46-38.260754.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-20T15-46-38.260754.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-20T15-46-38.260754.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-20T15-46-38.260754.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-20T15-46-38.260754.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-20T15-46-38.260754.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-20T15-46-38.260754.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-20T15-46-38.260754.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-20T15-46-38.260754.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-20T15-46-38.260754.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-20T15-46-38.260754.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-20T15-46-38.260754.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-20T15-46-38.260754.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-20T15-46-38.260754.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-20T15-46-38.260754.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-20T15-46-38.260754.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-20T15-46-38.260754.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-20T15-46-38.260754.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-20T15-46-38.260754.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-20T15-46-38.260754.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-20T15-46-38.260754.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-20T15-46-38.260754.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-20T15-46-38.260754.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-20T15-46-38.260754.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-20T15-46-38.260754.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-20T15-46-38.260754.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-20T15-46-38.260754.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-20T15-46-38.260754.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-20T15-46-38.260754.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-20T15-46-38.260754.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-20T15-46-38.260754.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-20T15-46-38.260754.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-02-20T15-46-38.260754.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-20T15-46-38.260754.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-20T15-46-38.260754.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-20T15-46-38.260754.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-02-20T15-46-38.260754.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-02-20T15-46-38.260754.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-20T15-46-38.260754.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-20T15-46-38.260754.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-20T15-46-38.260754.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-20T15-46-38.260754.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-20T15-46-38.260754.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-20T15-46-38.260754.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-20T15-46-38.260754.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-20T15-46-38.260754.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-20T15-46-38.260754.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-20T15-46-38.260754.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-20T15-46-38.260754.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-20T15-46-38.260754.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-20T15-46-38.260754.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-02-20T15-46-38.260754.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-20T15-46-38.260754.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-02-20T15-46-38.260754.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-20T15-46-38.260754.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_02_20T15_46_38.260754
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-20T15-46-38.260754.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-20T15-46-38.260754.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_02_20T15_46_38.260754
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-20T15-46-38.260754.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-20T15-46-38.260754.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_02_20T15_46_38.260754
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-20T15-46-38.260754.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-20T15-46-38.260754.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_02_20T15_46_38.260754
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-20T15-46-38.260754.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-20T15-46-38.260754.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_02_20T15_46_38.260754
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-20T15-46-38.260754.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-20T15-46-38.260754.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_02_20T15_46_38.260754
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-20T15-46-38.260754.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-20T15-46-38.260754.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_02_20T15_46_38.260754
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-20T15-46-38.260754.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-20T15-46-38.260754.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_02_20T15_46_38.260754
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-20T15-46-38.260754.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-20T15-46-38.260754.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_02_20T15_46_38.260754
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-20T15-46-38.260754.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-20T15-46-38.260754.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_02_20T15_46_38.260754
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-20T15-46-38.260754.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-20T15-46-38.260754.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_02_20T15_46_38.260754
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-20T15-46-38.260754.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-20T15-46-38.260754.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_02_20T15_46_38.260754
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-20T15-46-38.260754.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-20T15-46-38.260754.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_02_20T15_46_38.260754
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-20T15-46-38.260754.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-20T15-46-38.260754.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_02_20T15_46_38.260754
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-20T15-46-38.260754.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-20T15-46-38.260754.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_02_20T15_46_38.260754
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-20T15-46-38.260754.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-20T15-46-38.260754.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_02_20T15_46_38.260754
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-20T15-46-38.260754.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-20T15-46-38.260754.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_02_20T15_46_38.260754
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-20T15-46-38.260754.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-20T15-46-38.260754.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_02_20T15_46_38.260754
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-20T15-46-38.260754.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-20T15-46-38.260754.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_02_20T15_46_38.260754
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-20T15-46-38.260754.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-20T15-46-38.260754.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_02_20T15_46_38.260754
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-20T15-46-38.260754.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-20T15-46-38.260754.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_02_20T15_46_38.260754
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-20T15-46-38.260754.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-20T15-46-38.260754.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_02_20T15_46_38.260754
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-20T15-46-38.260754.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-20T15-46-38.260754.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_02_20T15_46_38.260754
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-20T15-46-38.260754.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-20T15-46-38.260754.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_02_20T15_46_38.260754
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-20T15-46-38.260754.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-20T15-46-38.260754.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_02_20T15_46_38.260754
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-20T15-46-38.260754.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-20T15-46-38.260754.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_02_20T15_46_38.260754
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-20T15-46-38.260754.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-20T15-46-38.260754.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_02_20T15_46_38.260754
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-20T15-46-38.260754.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-20T15-46-38.260754.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_02_20T15_46_38.260754
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-20T15-46-38.260754.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-20T15-46-38.260754.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_02_20T15_46_38.260754
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-20T15-46-38.260754.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-20T15-46-38.260754.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_02_20T15_46_38.260754
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-20T15-46-38.260754.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-20T15-46-38.260754.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_02_20T15_46_38.260754
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-20T15-46-38.260754.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-20T15-46-38.260754.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_02_20T15_46_38.260754
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-20T15-46-38.260754.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-20T15-46-38.260754.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_02_20T15_46_38.260754
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-20T15-46-38.260754.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-20T15-46-38.260754.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_02_20T15_46_38.260754
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-20T15-46-38.260754.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-20T15-46-38.260754.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_02_20T15_46_38.260754
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-02-20T15-46-38.260754.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-02-20T15-46-38.260754.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_02_20T15_46_38.260754
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-20T15-46-38.260754.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-20T15-46-38.260754.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_02_20T15_46_38.260754
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-20T15-46-38.260754.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-20T15-46-38.260754.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_02_20T15_46_38.260754
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-20T15-46-38.260754.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-20T15-46-38.260754.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_02_20T15_46_38.260754
path:
- '**/details_harness|hendrycksTest-management|5_2024-02-20T15-46-38.260754.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-02-20T15-46-38.260754.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_02_20T15_46_38.260754
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-02-20T15-46-38.260754.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-02-20T15-46-38.260754.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_02_20T15_46_38.260754
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-20T15-46-38.260754.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-20T15-46-38.260754.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_02_20T15_46_38.260754
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-20T15-46-38.260754.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-20T15-46-38.260754.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_02_20T15_46_38.260754
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-20T15-46-38.260754.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-20T15-46-38.260754.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_02_20T15_46_38.260754
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-20T15-46-38.260754.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-20T15-46-38.260754.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_02_20T15_46_38.260754
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-20T15-46-38.260754.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-20T15-46-38.260754.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_02_20T15_46_38.260754
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-20T15-46-38.260754.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-20T15-46-38.260754.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_02_20T15_46_38.260754
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-20T15-46-38.260754.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-20T15-46-38.260754.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_02_20T15_46_38.260754
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-20T15-46-38.260754.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-20T15-46-38.260754.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_02_20T15_46_38.260754
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-20T15-46-38.260754.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-20T15-46-38.260754.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_02_20T15_46_38.260754
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-20T15-46-38.260754.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-20T15-46-38.260754.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_02_20T15_46_38.260754
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-20T15-46-38.260754.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-20T15-46-38.260754.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_02_20T15_46_38.260754
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-20T15-46-38.260754.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-20T15-46-38.260754.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_02_20T15_46_38.260754
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-20T15-46-38.260754.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-20T15-46-38.260754.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_02_20T15_46_38.260754
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-02-20T15-46-38.260754.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-02-20T15-46-38.260754.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_02_20T15_46_38.260754
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-20T15-46-38.260754.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-20T15-46-38.260754.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_02_20T15_46_38.260754
path:
- '**/details_harness|hendrycksTest-virology|5_2024-02-20T15-46-38.260754.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-02-20T15-46-38.260754.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_02_20T15_46_38.260754
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-20T15-46-38.260754.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-20T15-46-38.260754.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_02_20T15_46_38.260754
path:
- '**/details_harness|truthfulqa:mc|0_2024-02-20T15-46-38.260754.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-02-20T15-46-38.260754.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_02_20T15_46_38.260754
path:
- '**/details_harness|winogrande|5_2024-02-20T15-46-38.260754.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-02-20T15-46-38.260754.parquet'
- config_name: results
data_files:
- split: 2024_02_20T15_46_38.260754
path:
- results_2024-02-20T15-46-38.260754.parquet
- split: latest
path:
- results_2024-02-20T15-46-38.260754.parquet
---
# Dataset Card for Evaluation run of Radiantloom/radintloom-mistral-7b-fusion-dpo
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [Radiantloom/radintloom-mistral-7b-fusion-dpo](https://huggingface.co/Radiantloom/radintloom-mistral-7b-fusion-dpo) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_Radiantloom__radintloom-mistral-7b-fusion-dpo",
"harness_winogrande_5",
split="train")
```
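The aggregated metrics can be loaded the same way from the "results" configuration; a minimal sketch (the config and split names come from the `configs` section above):
```python
from datasets import load_dataset

# The "results" config stores the aggregated run metrics; the "latest" split
# always points to the most recent run.
results = load_dataset(
    "open-llm-leaderboard/details_Radiantloom__radintloom-mistral-7b-fusion-dpo",
    "results",
    split="latest",
)
```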
## Latest results
These are the [latest results from run 2024-02-20T15:46:38.260754](https://huggingface.co/datasets/open-llm-leaderboard/details_Radiantloom__radintloom-mistral-7b-fusion-dpo/blob/main/results_2024-02-20T15-46-38.260754.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6257828467992991,
"acc_stderr": 0.032145124565345747,
"acc_norm": 0.6375843550222254,
"acc_norm_stderr": 0.032991322495938086,
"mc1": 0.3463892288861689,
"mc1_stderr": 0.01665699710912514,
"mc2": 0.5113860397966288,
"mc2_stderr": 0.015291606116990751
},
"harness|arc:challenge|25": {
"acc": 0.5836177474402731,
"acc_stderr": 0.014405618279436176,
"acc_norm": 0.6313993174061433,
"acc_norm_stderr": 0.014097810678042201
},
"harness|hellaswag|10": {
"acc": 0.6365265883290181,
"acc_stderr": 0.004800164434233245,
"acc_norm": 0.8367855008962358,
"acc_norm_stderr": 0.0036880598312390225
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.3,
"acc_stderr": 0.04605661864718381,
"acc_norm": 0.3,
"acc_norm_stderr": 0.04605661864718381
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.5777777777777777,
"acc_stderr": 0.04266763404099582,
"acc_norm": 0.5777777777777777,
"acc_norm_stderr": 0.04266763404099582
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6578947368421053,
"acc_stderr": 0.03860731599316092,
"acc_norm": 0.6578947368421053,
"acc_norm_stderr": 0.03860731599316092
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.63,
"acc_stderr": 0.048523658709391,
"acc_norm": 0.63,
"acc_norm_stderr": 0.048523658709391
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.7169811320754716,
"acc_stderr": 0.027724236492700918,
"acc_norm": 0.7169811320754716,
"acc_norm_stderr": 0.027724236492700918
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7291666666666666,
"acc_stderr": 0.03716177437566017,
"acc_norm": 0.7291666666666666,
"acc_norm_stderr": 0.03716177437566017
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.48,
"acc_stderr": 0.050211673156867795,
"acc_norm": 0.48,
"acc_norm_stderr": 0.050211673156867795
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.52,
"acc_stderr": 0.050211673156867795,
"acc_norm": 0.52,
"acc_norm_stderr": 0.050211673156867795
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.29,
"acc_stderr": 0.04560480215720683,
"acc_norm": 0.29,
"acc_norm_stderr": 0.04560480215720683
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6358381502890174,
"acc_stderr": 0.03669072477416906,
"acc_norm": 0.6358381502890174,
"acc_norm_stderr": 0.03669072477416906
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.3627450980392157,
"acc_stderr": 0.047840607041056527,
"acc_norm": 0.3627450980392157,
"acc_norm_stderr": 0.047840607041056527
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.79,
"acc_stderr": 0.04093601807403326,
"acc_norm": 0.79,
"acc_norm_stderr": 0.04093601807403326
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5617021276595745,
"acc_stderr": 0.03243618636108101,
"acc_norm": 0.5617021276595745,
"acc_norm_stderr": 0.03243618636108101
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.4824561403508772,
"acc_stderr": 0.04700708033551038,
"acc_norm": 0.4824561403508772,
"acc_norm_stderr": 0.04700708033551038
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5103448275862069,
"acc_stderr": 0.04165774775728763,
"acc_norm": 0.5103448275862069,
"acc_norm_stderr": 0.04165774775728763
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.43386243386243384,
"acc_stderr": 0.025525034382474894,
"acc_norm": 0.43386243386243384,
"acc_norm_stderr": 0.025525034382474894
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.4523809523809524,
"acc_stderr": 0.044518079590553275,
"acc_norm": 0.4523809523809524,
"acc_norm_stderr": 0.044518079590553275
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.37,
"acc_stderr": 0.048523658709391,
"acc_norm": 0.37,
"acc_norm_stderr": 0.048523658709391
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7935483870967742,
"acc_stderr": 0.02302589961718871,
"acc_norm": 0.7935483870967742,
"acc_norm_stderr": 0.02302589961718871
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5024630541871922,
"acc_stderr": 0.035179450386910616,
"acc_norm": 0.5024630541871922,
"acc_norm_stderr": 0.035179450386910616
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.71,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.71,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7878787878787878,
"acc_stderr": 0.031922715695483016,
"acc_norm": 0.7878787878787878,
"acc_norm_stderr": 0.031922715695483016
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.797979797979798,
"acc_stderr": 0.028606204289229872,
"acc_norm": 0.797979797979798,
"acc_norm_stderr": 0.028606204289229872
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8963730569948186,
"acc_stderr": 0.02199531196364424,
"acc_norm": 0.8963730569948186,
"acc_norm_stderr": 0.02199531196364424
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.658974358974359,
"acc_stderr": 0.02403548967633508,
"acc_norm": 0.658974358974359,
"acc_norm_stderr": 0.02403548967633508
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.3296296296296296,
"acc_stderr": 0.02866120111652457,
"acc_norm": 0.3296296296296296,
"acc_norm_stderr": 0.02866120111652457
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6932773109243697,
"acc_stderr": 0.02995382389188703,
"acc_norm": 0.6932773109243697,
"acc_norm_stderr": 0.02995382389188703
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.36423841059602646,
"acc_stderr": 0.03929111781242742,
"acc_norm": 0.36423841059602646,
"acc_norm_stderr": 0.03929111781242742
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8403669724770643,
"acc_stderr": 0.015703498348461773,
"acc_norm": 0.8403669724770643,
"acc_norm_stderr": 0.015703498348461773
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5509259259259259,
"acc_stderr": 0.03392238405321617,
"acc_norm": 0.5509259259259259,
"acc_norm_stderr": 0.03392238405321617
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8235294117647058,
"acc_stderr": 0.026756401538078962,
"acc_norm": 0.8235294117647058,
"acc_norm_stderr": 0.026756401538078962
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7763713080168776,
"acc_stderr": 0.027123298205229966,
"acc_norm": 0.7763713080168776,
"acc_norm_stderr": 0.027123298205229966
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6771300448430493,
"acc_stderr": 0.03138147637575499,
"acc_norm": 0.6771300448430493,
"acc_norm_stderr": 0.03138147637575499
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7862595419847328,
"acc_stderr": 0.0359546161177469,
"acc_norm": 0.7862595419847328,
"acc_norm_stderr": 0.0359546161177469
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7768595041322314,
"acc_stderr": 0.03800754475228732,
"acc_norm": 0.7768595041322314,
"acc_norm_stderr": 0.03800754475228732
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7592592592592593,
"acc_stderr": 0.0413311944024384,
"acc_norm": 0.7592592592592593,
"acc_norm_stderr": 0.0413311944024384
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7177914110429447,
"acc_stderr": 0.03536117886664742,
"acc_norm": 0.7177914110429447,
"acc_norm_stderr": 0.03536117886664742
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.45535714285714285,
"acc_stderr": 0.047268355537191,
"acc_norm": 0.45535714285714285,
"acc_norm_stderr": 0.047268355537191
},
"harness|hendrycksTest-management|5": {
"acc": 0.8058252427184466,
"acc_stderr": 0.03916667762822584,
"acc_norm": 0.8058252427184466,
"acc_norm_stderr": 0.03916667762822584
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8675213675213675,
"acc_stderr": 0.02220930907316562,
"acc_norm": 0.8675213675213675,
"acc_norm_stderr": 0.02220930907316562
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.67,
"acc_stderr": 0.04725815626252609,
"acc_norm": 0.67,
"acc_norm_stderr": 0.04725815626252609
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8122605363984674,
"acc_stderr": 0.01396439376989914,
"acc_norm": 0.8122605363984674,
"acc_norm_stderr": 0.01396439376989914
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7196531791907514,
"acc_stderr": 0.02418242749657761,
"acc_norm": 0.7196531791907514,
"acc_norm_stderr": 0.02418242749657761
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.3217877094972067,
"acc_stderr": 0.015624236160792573,
"acc_norm": 0.3217877094972067,
"acc_norm_stderr": 0.015624236160792573
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.696078431372549,
"acc_stderr": 0.02633661346904663,
"acc_norm": 0.696078431372549,
"acc_norm_stderr": 0.02633661346904663
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.707395498392283,
"acc_stderr": 0.02583989833487798,
"acc_norm": 0.707395498392283,
"acc_norm_stderr": 0.02583989833487798
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7160493827160493,
"acc_stderr": 0.025089478523765137,
"acc_norm": 0.7160493827160493,
"acc_norm_stderr": 0.025089478523765137
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.46808510638297873,
"acc_stderr": 0.029766675075873866,
"acc_norm": 0.46808510638297873,
"acc_norm_stderr": 0.029766675075873866
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4758800521512386,
"acc_stderr": 0.012755368722863931,
"acc_norm": 0.4758800521512386,
"acc_norm_stderr": 0.012755368722863931
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6691176470588235,
"acc_stderr": 0.02858270975389845,
"acc_norm": 0.6691176470588235,
"acc_norm_stderr": 0.02858270975389845
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6486928104575164,
"acc_stderr": 0.019312676065786558,
"acc_norm": 0.6486928104575164,
"acc_norm_stderr": 0.019312676065786558
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6272727272727273,
"acc_stderr": 0.04631381319425465,
"acc_norm": 0.6272727272727273,
"acc_norm_stderr": 0.04631381319425465
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7387755102040816,
"acc_stderr": 0.028123429335142773,
"acc_norm": 0.7387755102040816,
"acc_norm_stderr": 0.028123429335142773
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8407960199004975,
"acc_stderr": 0.02587064676616913,
"acc_norm": 0.8407960199004975,
"acc_norm_stderr": 0.02587064676616913
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.84,
"acc_stderr": 0.03684529491774709,
"acc_norm": 0.84,
"acc_norm_stderr": 0.03684529491774709
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5060240963855421,
"acc_stderr": 0.03892212195333045,
"acc_norm": 0.5060240963855421,
"acc_norm_stderr": 0.03892212195333045
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8187134502923976,
"acc_stderr": 0.02954774168764004,
"acc_norm": 0.8187134502923976,
"acc_norm_stderr": 0.02954774168764004
},
"harness|truthfulqa:mc|0": {
"mc1": 0.3463892288861689,
"mc1_stderr": 0.01665699710912514,
"mc2": 0.5113860397966288,
"mc2_stderr": 0.015291606116990751
},
"harness|winogrande|5": {
"acc": 0.7995264404104183,
"acc_stderr": 0.011251958281205086
},
"harness|gsm8k|5": {
"acc": 0.0037907505686125853,
"acc_stderr": 0.0016927007401501776
}
}
```
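The per-task blocks above each report `acc`/`acc_norm` with a standard error; a quick way to summarize the hendrycksTest (MMLU) portion is an unweighted average over its subtasks. A minimal sketch in plain Python (the two values shown are copied from the JSON above; the remaining per-subtask entries are elided, and this is only a sanity-check average, not necessarily the leaderboard's exact aggregation):
```python
# Recompute a macro (unweighted) average accuracy over the
# hendrycksTest (MMLU) subtasks from a dict shaped like the JSON above.
results = {
    "harness|hendrycksTest-abstract_algebra|5": {"acc": 0.3},
    "harness|hendrycksTest-anatomy|5": {"acc": 0.5777777777777777},
    # ... remaining per-subtask entries from the JSON above
}

mmlu_accs = [v["acc"] for k, v in results.items() if k.startswith("harness|hendrycksTest-")]
print(f"MMLU macro-average acc over {len(mmlu_accs)} subtasks: {sum(mmlu_accs) / len(mmlu_accs):.4f}")
```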
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases, and limitations of the dataset. More information is needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
muyoungko/koreanvoice | ---
license: apache-2.0
---
|
Csplk/Testascii | ---
license: cc-by-nc-4.0
task_categories:
- text-generation
tags:
- art
--- |
truongpdd/laion-2b-vietnamese-subset | ---
dataset_info:
features:
- name: SAMPLE_ID
dtype: int64
- name: URL
dtype: string
- name: TEXT
dtype: string
- name: HEIGHT
dtype: int32
- name: WIDTH
dtype: int32
- name: LICENSE
dtype: string
- name: LANGUAGE
dtype: string
- name: NSFW
dtype: string
- name: similarity
dtype: float64
splits:
- name: train
num_bytes: 10669843542.009588
num_examples: 48169285
download_size: 7285732213
dataset_size: 10669843542.009588
---
# Dataset Card for "laion-2b-vietnamese-subset"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
autoevaluate/autoeval-staging-eval-glue-mrpc-e15d1b-14665994 | ---
type: predictions
tags:
- autotrain
- evaluation
datasets:
- glue
eval_info:
task: natural_language_inference
model: sgugger/glue-mrpc
metrics: []
dataset_name: glue
dataset_config: mrpc
dataset_split: validation
col_mapping:
text1: sentence1
text2: sentence2
target: label
---
# Dataset Card for AutoTrain Evaluator
This repository contains model predictions generated by [AutoTrain](https://huggingface.co/autotrain) for the following task and dataset:
* Task: Natural Language Inference
* Model: sgugger/glue-mrpc
* Dataset: glue
* Config: mrpc
* Split: validation
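The `col_mapping` in the metadata ties the evaluator's generic fields to the GLUE/MRPC columns. A small sketch of fetching the exact split this evaluation ran on and applying that mapping (glue/mrpc is public; the predictions repo's own layout is not documented here):
```python
from datasets import load_dataset

# Load the evaluation split and rename columns per col_mapping
# (sentence1 -> text1, sentence2 -> text2, label -> target).
mrpc = load_dataset("glue", "mrpc", split="validation")
ex = mrpc[0]
print({"text1": ex["sentence1"], "text2": ex["sentence2"], "target": ex["label"]})
```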
To run new evaluation jobs, visit Hugging Face's [automatic model evaluator](https://huggingface.co/spaces/autoevaluate/model-evaluator).
## Contributions
Thanks to [@lewtun](https://huggingface.co/lewtun) for evaluating this model. |
arieg/bw_spec_cls_80_42 | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
dataset_info:
features:
- name: image
dtype: image
- name: label
dtype:
class_label:
names:
'0': '94635'
'1': '94638'
'2': '95189'
'3': '95231'
'4': '95248'
'5': '95249'
'6': '95250'
'7': '95251'
'8': '95308'
'9': '95309'
'10': '95310'
'11': '95452'
'12': '95506'
'13': '95564'
'14': '95722'
'15': '95724'
'16': '95725'
'17': '95726'
'18': '95727'
'19': '95908'
'20': '95910'
'21': '95911'
'22': '95912'
'23': '95914'
'24': '95915'
'25': '96166'
'26': '96167'
'27': '96168'
'28': '96169'
'29': '96399'
'30': '96400'
'31': '96401'
'32': '96402'
'33': '96403'
'34': '96408'
'35': '96627'
'36': '96657'
'37': '96675'
'38': '96678'
'39': '96692'
'40': '96693'
'41': '96694'
'42': '96695'
'43': '96696'
'44': '96697'
'45': '96698'
'46': '96699'
'47': '96718'
'48': '96726'
'49': '96728'
'50': '96729'
'51': '96730'
'52': '96731'
'53': '96898'
'54': '96900'
'55': '96901'
'56': '96902'
'57': '96935'
'58': '96936'
'59': '96944'
'60': '96945'
'61': '96946'
'62': '97037'
'63': '97041'
'64': '97043'
'65': '97211'
'66': '97215'
'67': '97216'
'68': '97279'
'69': '97283'
'70': '97285'
'71': '97373'
'72': '97374'
'73': '97393'
'74': '97404'
'75': '97406'
'76': '97407'
'77': '97424'
'78': '97540'
'79': '97542'
splits:
- name: train
num_bytes: 88630908.8
num_examples: 1600
- name: test
num_bytes: 21994535.0
num_examples: 400
download_size: 110458426
dataset_size: 110625443.8
---
# Dataset Card for "bw_spec_cls_80_42"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
hycin/spoken-hpg-incident | ---
license: cc-by-nc-2.0
---
|
alvarobartt/evol-instruct-from-ultrafeedback | ---
dataset_info:
features:
- name: source
dtype: string
- name: instruction
dtype: string
- name: models
sequence: string
- name: completions
list:
- name: annotations
struct:
- name: helpfulness
struct:
- name: Rating
dtype: string
- name: Rationale
dtype: string
- name: Rationale For Rating
dtype: string
- name: Type
sequence: string
- name: honesty
struct:
- name: Rating
dtype: string
- name: Rationale
dtype: string
- name: instruction_following
struct:
- name: Rating
dtype: string
- name: Rationale
dtype: string
- name: truthfulness
struct:
- name: Rating
dtype: string
- name: Rationale
dtype: string
- name: Rationale For Rating
dtype: string
- name: Type
sequence: string
- name: critique
dtype: string
- name: custom_system_prompt
dtype: string
- name: fine-grained_score
dtype: float64
- name: model
dtype: string
- name: overall_score
dtype: float64
- name: principle
dtype: string
- name: response
dtype: string
- name: correct_answers
sequence: string
- name: incorrect_answers
sequence: string
- name: vector
sequence: float32
splits:
- name: train
num_bytes: 169455285
num_examples: 10000
download_size: 82231702
dataset_size: 169455285
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
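Each row pairs an instruction with a list of per-model completions carrying fine-grained annotations, per the schema above. A minimal sketch of walking that nested structure, assuming this card's repo id:
```python
from datasets import load_dataset

# Each completion carries a model name, scores, and structured annotations.
ds = load_dataset("alvarobartt/evol-instruct-from-ultrafeedback", split="train")
row = ds[0]
print(row["instruction"][:80])
for comp in row["completions"]:
    print(comp["model"], comp["fine-grained_score"], comp["overall_score"])
```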
|
Shefton/TestTitanicdata | ---
license: cc-by-nc-nd-3.0
---
|
DZN222/olui | ---
license: openrail
---
|
yuvalkirstain/PickaPic-ft-pairs | ---
dataset_info:
features:
- name: url_bad
dtype: string
- name: url_good
dtype: string
- name: good_jpg
dtype: binary
- name: caption
dtype: string
- name: user_id
dtype: int64
- name: has_label
dtype: bool
- name: bad_jpg
dtype: binary
- name: __index_level_0__
dtype: int64
splits:
- name: train
num_bytes: 6044785615
num_examples: 27208
- name: validation
num_bytes: 292430239
num_examples: 1335
- name: test
num_bytes: 318053633
num_examples: 1410
- name: validation_unique
num_bytes: 54146831
num_examples: 250
- name: test_unique
num_bytes: 54986693
num_examples: 250
download_size: 6690503991
dataset_size: 6764403011
---
# Dataset Card for "PickaPic-ft-pairs"
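The `good_jpg`/`bad_jpg` columns store raw JPEG bytes (dtype `binary`), so decoding goes through PIL. A minimal sketch, assuming this card's repo id and that the binary columns decode as bytes:
```python
import io

from datasets import load_dataset
from PIL import Image

# validation_unique is the smallest split (250 rows), so it loads quickly.
ds = load_dataset("yuvalkirstain/PickaPic-ft-pairs", split="validation_unique")
row = ds[0]
good = Image.open(io.BytesIO(row["good_jpg"]))
print(row["caption"], good.size)
```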
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
KJChen/tech2proc | ---
license: mit
task_categories:
- text2text-generation
language:
- en
tags:
- security
--- |
open-llm-leaderboard/details_lodrick-the-lafted__Hermes-Instruct-7B-100K | ---
pretty_name: Evaluation run of lodrick-the-lafted/Hermes-Instruct-7B-100K
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [lodrick-the-lafted/Hermes-Instruct-7B-100K](https://huggingface.co/lodrick-the-lafted/Hermes-Instruct-7B-100K)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of\
\ the evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can\
\ be found as a specific split in each configuration, the split being named using\
\ the timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_lodrick-the-lafted__Hermes-Instruct-7B-100K\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-02-20T08:08:38.814549](https://huggingface.co/datasets/open-llm-leaderboard/details_lodrick-the-lafted__Hermes-Instruct-7B-100K/blob/main/results_2024-02-20T08-08-38.814549.json) (note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6090324026750107,\n\
\ \"acc_stderr\": 0.03306627928563301,\n \"acc_norm\": 0.6132702932279503,\n\
\ \"acc_norm_stderr\": 0.0337325206267565,\n \"mc1\": 0.4614443084455324,\n\
\ \"mc1_stderr\": 0.017451384104637455,\n \"mc2\": 0.6362212933287348,\n\
\ \"mc2_stderr\": 0.015296863707374602\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.5648464163822525,\n \"acc_stderr\": 0.014487986197186038,\n\
\ \"acc_norm\": 0.6151877133105802,\n \"acc_norm_stderr\": 0.014218371065251105\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6383190599482175,\n\
\ \"acc_stderr\": 0.004795051037917733,\n \"acc_norm\": 0.8284206333399721,\n\
\ \"acc_norm_stderr\": 0.0037624392841951065\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695236,\n \
\ \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695236\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.5407407407407407,\n\
\ \"acc_stderr\": 0.04304979692464242,\n \"acc_norm\": 0.5407407407407407,\n\
\ \"acc_norm_stderr\": 0.04304979692464242\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.6447368421052632,\n \"acc_stderr\": 0.03894734487013317,\n\
\ \"acc_norm\": 0.6447368421052632,\n \"acc_norm_stderr\": 0.03894734487013317\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.57,\n\
\ \"acc_stderr\": 0.04975698519562428,\n \"acc_norm\": 0.57,\n \
\ \"acc_norm_stderr\": 0.04975698519562428\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.6679245283018868,\n \"acc_stderr\": 0.02898545565233439,\n\
\ \"acc_norm\": 0.6679245283018868,\n \"acc_norm_stderr\": 0.02898545565233439\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.6736111111111112,\n\
\ \"acc_stderr\": 0.03921067198982266,\n \"acc_norm\": 0.6736111111111112,\n\
\ \"acc_norm_stderr\": 0.03921067198982266\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.45,\n \"acc_stderr\": 0.049999999999999996,\n \
\ \"acc_norm\": 0.45,\n \"acc_norm_stderr\": 0.049999999999999996\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"\
acc\": 0.56,\n \"acc_stderr\": 0.049888765156985884,\n \"acc_norm\"\
: 0.56,\n \"acc_norm_stderr\": 0.049888765156985884\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695235,\n \
\ \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695235\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.5606936416184971,\n\
\ \"acc_stderr\": 0.037842719328874674,\n \"acc_norm\": 0.5606936416184971,\n\
\ \"acc_norm_stderr\": 0.037842719328874674\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.45098039215686275,\n \"acc_stderr\": 0.04951218252396264,\n\
\ \"acc_norm\": 0.45098039215686275,\n \"acc_norm_stderr\": 0.04951218252396264\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.75,\n \"acc_stderr\": 0.04351941398892446,\n \"acc_norm\": 0.75,\n\
\ \"acc_norm_stderr\": 0.04351941398892446\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.5446808510638298,\n \"acc_stderr\": 0.032555253593403555,\n\
\ \"acc_norm\": 0.5446808510638298,\n \"acc_norm_stderr\": 0.032555253593403555\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.40350877192982454,\n\
\ \"acc_stderr\": 0.046151869625837026,\n \"acc_norm\": 0.40350877192982454,\n\
\ \"acc_norm_stderr\": 0.046151869625837026\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.593103448275862,\n \"acc_stderr\": 0.04093793981266237,\n\
\ \"acc_norm\": 0.593103448275862,\n \"acc_norm_stderr\": 0.04093793981266237\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.3994708994708995,\n \"acc_stderr\": 0.025225450284067877,\n \"\
acc_norm\": 0.3994708994708995,\n \"acc_norm_stderr\": 0.025225450284067877\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.4365079365079365,\n\
\ \"acc_stderr\": 0.04435932892851466,\n \"acc_norm\": 0.4365079365079365,\n\
\ \"acc_norm_stderr\": 0.04435932892851466\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695235,\n \
\ \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695235\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.6774193548387096,\n\
\ \"acc_stderr\": 0.026593084516572274,\n \"acc_norm\": 0.6774193548387096,\n\
\ \"acc_norm_stderr\": 0.026593084516572274\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.4827586206896552,\n \"acc_stderr\": 0.035158955511656986,\n\
\ \"acc_norm\": 0.4827586206896552,\n \"acc_norm_stderr\": 0.035158955511656986\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.66,\n \"acc_stderr\": 0.04760952285695237,\n \"acc_norm\"\
: 0.66,\n \"acc_norm_stderr\": 0.04760952285695237\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7151515151515152,\n \"acc_stderr\": 0.03524390844511781,\n\
\ \"acc_norm\": 0.7151515151515152,\n \"acc_norm_stderr\": 0.03524390844511781\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.7777777777777778,\n \"acc_stderr\": 0.029620227874790486,\n \"\
acc_norm\": 0.7777777777777778,\n \"acc_norm_stderr\": 0.029620227874790486\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.8393782383419689,\n \"acc_stderr\": 0.026499057701397457,\n\
\ \"acc_norm\": 0.8393782383419689,\n \"acc_norm_stderr\": 0.026499057701397457\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.5717948717948718,\n \"acc_stderr\": 0.02508830145469483,\n \
\ \"acc_norm\": 0.5717948717948718,\n \"acc_norm_stderr\": 0.02508830145469483\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.3111111111111111,\n \"acc_stderr\": 0.028226446749683512,\n \
\ \"acc_norm\": 0.3111111111111111,\n \"acc_norm_stderr\": 0.028226446749683512\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.6176470588235294,\n \"acc_stderr\": 0.031566630992154156,\n\
\ \"acc_norm\": 0.6176470588235294,\n \"acc_norm_stderr\": 0.031566630992154156\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.3509933774834437,\n \"acc_stderr\": 0.038969819642573754,\n \"\
acc_norm\": 0.3509933774834437,\n \"acc_norm_stderr\": 0.038969819642573754\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8036697247706422,\n \"acc_stderr\": 0.017030719339154343,\n \"\
acc_norm\": 0.8036697247706422,\n \"acc_norm_stderr\": 0.017030719339154343\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.4537037037037037,\n \"acc_stderr\": 0.03395322726375797,\n \"\
acc_norm\": 0.4537037037037037,\n \"acc_norm_stderr\": 0.03395322726375797\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.7941176470588235,\n \"acc_stderr\": 0.028379449451588667,\n \"\
acc_norm\": 0.7941176470588235,\n \"acc_norm_stderr\": 0.028379449451588667\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.759493670886076,\n \"acc_stderr\": 0.02782078198114969,\n \
\ \"acc_norm\": 0.759493670886076,\n \"acc_norm_stderr\": 0.02782078198114969\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6233183856502242,\n\
\ \"acc_stderr\": 0.032521134899291884,\n \"acc_norm\": 0.6233183856502242,\n\
\ \"acc_norm_stderr\": 0.032521134899291884\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.732824427480916,\n \"acc_stderr\": 0.038808483010823944,\n\
\ \"acc_norm\": 0.732824427480916,\n \"acc_norm_stderr\": 0.038808483010823944\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.7768595041322314,\n \"acc_stderr\": 0.03800754475228732,\n \"\
acc_norm\": 0.7768595041322314,\n \"acc_norm_stderr\": 0.03800754475228732\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7222222222222222,\n\
\ \"acc_stderr\": 0.04330043749650742,\n \"acc_norm\": 0.7222222222222222,\n\
\ \"acc_norm_stderr\": 0.04330043749650742\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.6993865030674846,\n \"acc_stderr\": 0.03602511318806771,\n\
\ \"acc_norm\": 0.6993865030674846,\n \"acc_norm_stderr\": 0.03602511318806771\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.45535714285714285,\n\
\ \"acc_stderr\": 0.047268355537191,\n \"acc_norm\": 0.45535714285714285,\n\
\ \"acc_norm_stderr\": 0.047268355537191\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7378640776699029,\n \"acc_stderr\": 0.04354631077260594,\n\
\ \"acc_norm\": 0.7378640776699029,\n \"acc_norm_stderr\": 0.04354631077260594\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8803418803418803,\n\
\ \"acc_stderr\": 0.02126271940040698,\n \"acc_norm\": 0.8803418803418803,\n\
\ \"acc_norm_stderr\": 0.02126271940040698\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.74,\n \"acc_stderr\": 0.0440844002276808,\n \
\ \"acc_norm\": 0.74,\n \"acc_norm_stderr\": 0.0440844002276808\n },\n\
\ \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.776500638569604,\n\
\ \"acc_stderr\": 0.01489723522945071,\n \"acc_norm\": 0.776500638569604,\n\
\ \"acc_norm_stderr\": 0.01489723522945071\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.6589595375722543,\n \"acc_stderr\": 0.025522474632121612,\n\
\ \"acc_norm\": 0.6589595375722543,\n \"acc_norm_stderr\": 0.025522474632121612\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.3005586592178771,\n\
\ \"acc_stderr\": 0.015334566806251154,\n \"acc_norm\": 0.3005586592178771,\n\
\ \"acc_norm_stderr\": 0.015334566806251154\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.6830065359477124,\n \"acc_stderr\": 0.026643278474508755,\n\
\ \"acc_norm\": 0.6830065359477124,\n \"acc_norm_stderr\": 0.026643278474508755\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6655948553054662,\n\
\ \"acc_stderr\": 0.026795422327893937,\n \"acc_norm\": 0.6655948553054662,\n\
\ \"acc_norm_stderr\": 0.026795422327893937\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.6697530864197531,\n \"acc_stderr\": 0.026168298456732846,\n\
\ \"acc_norm\": 0.6697530864197531,\n \"acc_norm_stderr\": 0.026168298456732846\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.4787234042553192,\n \"acc_stderr\": 0.029800481645628693,\n \
\ \"acc_norm\": 0.4787234042553192,\n \"acc_norm_stderr\": 0.029800481645628693\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.43415906127770537,\n\
\ \"acc_stderr\": 0.01265903323706725,\n \"acc_norm\": 0.43415906127770537,\n\
\ \"acc_norm_stderr\": 0.01265903323706725\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.6470588235294118,\n \"acc_stderr\": 0.029029422815681404,\n\
\ \"acc_norm\": 0.6470588235294118,\n \"acc_norm_stderr\": 0.029029422815681404\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6127450980392157,\n \"acc_stderr\": 0.01970687580408564,\n \
\ \"acc_norm\": 0.6127450980392157,\n \"acc_norm_stderr\": 0.01970687580408564\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6909090909090909,\n\
\ \"acc_stderr\": 0.044262946482000985,\n \"acc_norm\": 0.6909090909090909,\n\
\ \"acc_norm_stderr\": 0.044262946482000985\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.6857142857142857,\n \"acc_stderr\": 0.029719329422417468,\n\
\ \"acc_norm\": 0.6857142857142857,\n \"acc_norm_stderr\": 0.029719329422417468\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8159203980099502,\n\
\ \"acc_stderr\": 0.027403859410786848,\n \"acc_norm\": 0.8159203980099502,\n\
\ \"acc_norm_stderr\": 0.027403859410786848\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.84,\n \"acc_stderr\": 0.03684529491774708,\n \
\ \"acc_norm\": 0.84,\n \"acc_norm_stderr\": 0.03684529491774708\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5060240963855421,\n\
\ \"acc_stderr\": 0.03892212195333045,\n \"acc_norm\": 0.5060240963855421,\n\
\ \"acc_norm_stderr\": 0.03892212195333045\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8245614035087719,\n \"acc_stderr\": 0.029170885500727665,\n\
\ \"acc_norm\": 0.8245614035087719,\n \"acc_norm_stderr\": 0.029170885500727665\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.4614443084455324,\n\
\ \"mc1_stderr\": 0.017451384104637455,\n \"mc2\": 0.6362212933287348,\n\
\ \"mc2_stderr\": 0.015296863707374602\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7687450670876085,\n \"acc_stderr\": 0.011850040124850508\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.4397270659590599,\n \
\ \"acc_stderr\": 0.013672052434471574\n }\n}\n```"
repo_url: https://huggingface.co/lodrick-the-lafted/Hermes-Instruct-7B-100K
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_02_20T08_08_38.814549
path:
- '**/details_harness|arc:challenge|25_2024-02-20T08-08-38.814549.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-02-20T08-08-38.814549.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_02_20T08_08_38.814549
path:
- '**/details_harness|gsm8k|5_2024-02-20T08-08-38.814549.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-02-20T08-08-38.814549.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_02_20T08_08_38.814549
path:
- '**/details_harness|hellaswag|10_2024-02-20T08-08-38.814549.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-02-20T08-08-38.814549.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_02_20T08_08_38.814549
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-20T08-08-38.814549.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-20T08-08-38.814549.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-20T08-08-38.814549.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-20T08-08-38.814549.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-20T08-08-38.814549.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-20T08-08-38.814549.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-20T08-08-38.814549.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-20T08-08-38.814549.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-20T08-08-38.814549.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-20T08-08-38.814549.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-20T08-08-38.814549.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-20T08-08-38.814549.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-20T08-08-38.814549.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-20T08-08-38.814549.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-20T08-08-38.814549.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-20T08-08-38.814549.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-20T08-08-38.814549.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-20T08-08-38.814549.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-20T08-08-38.814549.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-20T08-08-38.814549.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-20T08-08-38.814549.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-20T08-08-38.814549.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-20T08-08-38.814549.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-20T08-08-38.814549.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-20T08-08-38.814549.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-20T08-08-38.814549.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-20T08-08-38.814549.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-20T08-08-38.814549.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-20T08-08-38.814549.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-20T08-08-38.814549.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-20T08-08-38.814549.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-20T08-08-38.814549.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-20T08-08-38.814549.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-20T08-08-38.814549.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-02-20T08-08-38.814549.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-20T08-08-38.814549.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-20T08-08-38.814549.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-20T08-08-38.814549.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-02-20T08-08-38.814549.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-02-20T08-08-38.814549.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-20T08-08-38.814549.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-20T08-08-38.814549.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-20T08-08-38.814549.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-20T08-08-38.814549.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-20T08-08-38.814549.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-20T08-08-38.814549.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-20T08-08-38.814549.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-20T08-08-38.814549.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-20T08-08-38.814549.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-20T08-08-38.814549.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-20T08-08-38.814549.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-20T08-08-38.814549.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-20T08-08-38.814549.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-02-20T08-08-38.814549.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-20T08-08-38.814549.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-02-20T08-08-38.814549.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-20T08-08-38.814549.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-20T08-08-38.814549.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-20T08-08-38.814549.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-20T08-08-38.814549.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-20T08-08-38.814549.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-20T08-08-38.814549.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-20T08-08-38.814549.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-20T08-08-38.814549.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-20T08-08-38.814549.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-20T08-08-38.814549.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-20T08-08-38.814549.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-20T08-08-38.814549.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-20T08-08-38.814549.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-20T08-08-38.814549.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-20T08-08-38.814549.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-20T08-08-38.814549.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-20T08-08-38.814549.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-20T08-08-38.814549.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-20T08-08-38.814549.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-20T08-08-38.814549.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-20T08-08-38.814549.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-20T08-08-38.814549.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-20T08-08-38.814549.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-20T08-08-38.814549.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-20T08-08-38.814549.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-20T08-08-38.814549.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-20T08-08-38.814549.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-20T08-08-38.814549.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-20T08-08-38.814549.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-20T08-08-38.814549.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-20T08-08-38.814549.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-20T08-08-38.814549.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-20T08-08-38.814549.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-20T08-08-38.814549.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-20T08-08-38.814549.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-02-20T08-08-38.814549.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-20T08-08-38.814549.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-20T08-08-38.814549.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-20T08-08-38.814549.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-02-20T08-08-38.814549.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-02-20T08-08-38.814549.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-20T08-08-38.814549.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-20T08-08-38.814549.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-20T08-08-38.814549.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-20T08-08-38.814549.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-20T08-08-38.814549.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-20T08-08-38.814549.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-20T08-08-38.814549.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-20T08-08-38.814549.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-20T08-08-38.814549.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-20T08-08-38.814549.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-20T08-08-38.814549.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-20T08-08-38.814549.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-20T08-08-38.814549.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-02-20T08-08-38.814549.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-20T08-08-38.814549.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-02-20T08-08-38.814549.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-20T08-08-38.814549.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_02_20T08_08_38.814549
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-20T08-08-38.814549.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-20T08-08-38.814549.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_02_20T08_08_38.814549
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-20T08-08-38.814549.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-20T08-08-38.814549.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_02_20T08_08_38.814549
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-20T08-08-38.814549.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-20T08-08-38.814549.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_02_20T08_08_38.814549
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-20T08-08-38.814549.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-20T08-08-38.814549.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_02_20T08_08_38.814549
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-20T08-08-38.814549.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-20T08-08-38.814549.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_02_20T08_08_38.814549
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-20T08-08-38.814549.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-20T08-08-38.814549.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_02_20T08_08_38.814549
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-20T08-08-38.814549.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-20T08-08-38.814549.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_02_20T08_08_38.814549
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-20T08-08-38.814549.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-20T08-08-38.814549.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_02_20T08_08_38.814549
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-20T08-08-38.814549.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-20T08-08-38.814549.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_02_20T08_08_38.814549
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-20T08-08-38.814549.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-20T08-08-38.814549.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_02_20T08_08_38.814549
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-20T08-08-38.814549.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-20T08-08-38.814549.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_02_20T08_08_38.814549
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-20T08-08-38.814549.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-20T08-08-38.814549.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_02_20T08_08_38.814549
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-20T08-08-38.814549.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-20T08-08-38.814549.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_02_20T08_08_38.814549
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-20T08-08-38.814549.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-20T08-08-38.814549.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_02_20T08_08_38.814549
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-20T08-08-38.814549.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-20T08-08-38.814549.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_02_20T08_08_38.814549
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-20T08-08-38.814549.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-20T08-08-38.814549.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_02_20T08_08_38.814549
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-20T08-08-38.814549.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-20T08-08-38.814549.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_02_20T08_08_38.814549
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-20T08-08-38.814549.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-20T08-08-38.814549.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_02_20T08_08_38.814549
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-20T08-08-38.814549.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-20T08-08-38.814549.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_02_20T08_08_38.814549
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-20T08-08-38.814549.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-20T08-08-38.814549.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_02_20T08_08_38.814549
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-20T08-08-38.814549.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-20T08-08-38.814549.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_02_20T08_08_38.814549
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-20T08-08-38.814549.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-20T08-08-38.814549.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_02_20T08_08_38.814549
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-20T08-08-38.814549.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-20T08-08-38.814549.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_02_20T08_08_38.814549
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-20T08-08-38.814549.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-20T08-08-38.814549.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_02_20T08_08_38.814549
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-20T08-08-38.814549.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-20T08-08-38.814549.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_02_20T08_08_38.814549
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-20T08-08-38.814549.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-20T08-08-38.814549.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_02_20T08_08_38.814549
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-20T08-08-38.814549.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-20T08-08-38.814549.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_02_20T08_08_38.814549
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-20T08-08-38.814549.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-20T08-08-38.814549.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_02_20T08_08_38.814549
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-20T08-08-38.814549.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-20T08-08-38.814549.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_02_20T08_08_38.814549
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-20T08-08-38.814549.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-20T08-08-38.814549.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_02_20T08_08_38.814549
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-20T08-08-38.814549.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-20T08-08-38.814549.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_02_20T08_08_38.814549
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-20T08-08-38.814549.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-20T08-08-38.814549.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_02_20T08_08_38.814549
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-20T08-08-38.814549.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-20T08-08-38.814549.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_02_20T08_08_38.814549
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-20T08-08-38.814549.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-20T08-08-38.814549.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_02_20T08_08_38.814549
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-02-20T08-08-38.814549.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-02-20T08-08-38.814549.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_02_20T08_08_38.814549
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-20T08-08-38.814549.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-20T08-08-38.814549.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_02_20T08_08_38.814549
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-20T08-08-38.814549.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-20T08-08-38.814549.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_02_20T08_08_38.814549
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-20T08-08-38.814549.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-20T08-08-38.814549.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_02_20T08_08_38.814549
path:
- '**/details_harness|hendrycksTest-management|5_2024-02-20T08-08-38.814549.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-02-20T08-08-38.814549.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_02_20T08_08_38.814549
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-02-20T08-08-38.814549.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-02-20T08-08-38.814549.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_02_20T08_08_38.814549
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-20T08-08-38.814549.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-20T08-08-38.814549.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_02_20T08_08_38.814549
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-20T08-08-38.814549.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-20T08-08-38.814549.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_02_20T08_08_38.814549
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-20T08-08-38.814549.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-20T08-08-38.814549.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_02_20T08_08_38.814549
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-20T08-08-38.814549.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-20T08-08-38.814549.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_02_20T08_08_38.814549
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-20T08-08-38.814549.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-20T08-08-38.814549.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_02_20T08_08_38.814549
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-20T08-08-38.814549.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-20T08-08-38.814549.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_02_20T08_08_38.814549
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-20T08-08-38.814549.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-20T08-08-38.814549.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_02_20T08_08_38.814549
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-20T08-08-38.814549.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-20T08-08-38.814549.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_02_20T08_08_38.814549
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-20T08-08-38.814549.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-20T08-08-38.814549.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_02_20T08_08_38.814549
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-20T08-08-38.814549.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-20T08-08-38.814549.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_02_20T08_08_38.814549
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-20T08-08-38.814549.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-20T08-08-38.814549.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_02_20T08_08_38.814549
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-20T08-08-38.814549.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-20T08-08-38.814549.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_02_20T08_08_38.814549
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-20T08-08-38.814549.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-20T08-08-38.814549.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_02_20T08_08_38.814549
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-02-20T08-08-38.814549.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-02-20T08-08-38.814549.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_02_20T08_08_38.814549
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-20T08-08-38.814549.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-20T08-08-38.814549.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_02_20T08_08_38.814549
path:
- '**/details_harness|hendrycksTest-virology|5_2024-02-20T08-08-38.814549.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-02-20T08-08-38.814549.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_02_20T08_08_38.814549
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-20T08-08-38.814549.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-20T08-08-38.814549.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_02_20T08_08_38.814549
path:
- '**/details_harness|truthfulqa:mc|0_2024-02-20T08-08-38.814549.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-02-20T08-08-38.814549.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_02_20T08_08_38.814549
path:
- '**/details_harness|winogrande|5_2024-02-20T08-08-38.814549.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-02-20T08-08-38.814549.parquet'
- config_name: results
data_files:
- split: 2024_02_20T08_08_38.814549
path:
- results_2024-02-20T08-08-38.814549.parquet
- split: latest
path:
- results_2024-02-20T08-08-38.814549.parquet
---
# Dataset Card for Evaluation run of lodrick-the-lafted/Hermes-Instruct-7B-100K
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [lodrick-the-lafted/Hermes-Instruct-7B-100K](https://huggingface.co/lodrick-the-lafted/Hermes-Instruct-7B-100K) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_lodrick-the-lafted__Hermes-Instruct-7B-100K",
"harness_winogrande_5",
split="train")
```
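The aggregated metrics can be loaded the same way from the "results" configuration; a minimal sketch using the "latest" split, which always points to the most recent run:
```python
from datasets import load_dataset

# "latest" is an alias for the split named after the most recent evaluation timestamp
results = load_dataset("open-llm-leaderboard/details_lodrick-the-lafted__Hermes-Instruct-7B-100K",
	"results",
	split="latest")
```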
## Latest results
These are the [latest results from run 2024-02-20T08:08:38.814549](https://huggingface.co/datasets/open-llm-leaderboard/details_lodrick-the-lafted__Hermes-Instruct-7B-100K/blob/main/results_2024-02-20T08-08-38.814549.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6090324026750107,
"acc_stderr": 0.03306627928563301,
"acc_norm": 0.6132702932279503,
"acc_norm_stderr": 0.0337325206267565,
"mc1": 0.4614443084455324,
"mc1_stderr": 0.017451384104637455,
"mc2": 0.6362212933287348,
"mc2_stderr": 0.015296863707374602
},
"harness|arc:challenge|25": {
"acc": 0.5648464163822525,
"acc_stderr": 0.014487986197186038,
"acc_norm": 0.6151877133105802,
"acc_norm_stderr": 0.014218371065251105
},
"harness|hellaswag|10": {
"acc": 0.6383190599482175,
"acc_stderr": 0.004795051037917733,
"acc_norm": 0.8284206333399721,
"acc_norm_stderr": 0.0037624392841951065
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695236,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695236
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.5407407407407407,
"acc_stderr": 0.04304979692464242,
"acc_norm": 0.5407407407407407,
"acc_norm_stderr": 0.04304979692464242
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6447368421052632,
"acc_stderr": 0.03894734487013317,
"acc_norm": 0.6447368421052632,
"acc_norm_stderr": 0.03894734487013317
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.57,
"acc_stderr": 0.04975698519562428,
"acc_norm": 0.57,
"acc_norm_stderr": 0.04975698519562428
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6679245283018868,
"acc_stderr": 0.02898545565233439,
"acc_norm": 0.6679245283018868,
"acc_norm_stderr": 0.02898545565233439
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.6736111111111112,
"acc_stderr": 0.03921067198982266,
"acc_norm": 0.6736111111111112,
"acc_norm_stderr": 0.03921067198982266
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.45,
"acc_stderr": 0.049999999999999996,
"acc_norm": 0.45,
"acc_norm_stderr": 0.049999999999999996
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.56,
"acc_stderr": 0.049888765156985884,
"acc_norm": 0.56,
"acc_norm_stderr": 0.049888765156985884
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695235,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695235
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.5606936416184971,
"acc_stderr": 0.037842719328874674,
"acc_norm": 0.5606936416184971,
"acc_norm_stderr": 0.037842719328874674
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.45098039215686275,
"acc_stderr": 0.04951218252396264,
"acc_norm": 0.45098039215686275,
"acc_norm_stderr": 0.04951218252396264
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.75,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.75,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5446808510638298,
"acc_stderr": 0.032555253593403555,
"acc_norm": 0.5446808510638298,
"acc_norm_stderr": 0.032555253593403555
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.40350877192982454,
"acc_stderr": 0.046151869625837026,
"acc_norm": 0.40350877192982454,
"acc_norm_stderr": 0.046151869625837026
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.593103448275862,
"acc_stderr": 0.04093793981266237,
"acc_norm": 0.593103448275862,
"acc_norm_stderr": 0.04093793981266237
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.3994708994708995,
"acc_stderr": 0.025225450284067877,
"acc_norm": 0.3994708994708995,
"acc_norm_stderr": 0.025225450284067877
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.4365079365079365,
"acc_stderr": 0.04435932892851466,
"acc_norm": 0.4365079365079365,
"acc_norm_stderr": 0.04435932892851466
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695235,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695235
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.6774193548387096,
"acc_stderr": 0.026593084516572274,
"acc_norm": 0.6774193548387096,
"acc_norm_stderr": 0.026593084516572274
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.4827586206896552,
"acc_stderr": 0.035158955511656986,
"acc_norm": 0.4827586206896552,
"acc_norm_stderr": 0.035158955511656986
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.66,
"acc_stderr": 0.04760952285695237,
"acc_norm": 0.66,
"acc_norm_stderr": 0.04760952285695237
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7151515151515152,
"acc_stderr": 0.03524390844511781,
"acc_norm": 0.7151515151515152,
"acc_norm_stderr": 0.03524390844511781
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7777777777777778,
"acc_stderr": 0.029620227874790486,
"acc_norm": 0.7777777777777778,
"acc_norm_stderr": 0.029620227874790486
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8393782383419689,
"acc_stderr": 0.026499057701397457,
"acc_norm": 0.8393782383419689,
"acc_norm_stderr": 0.026499057701397457
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.5717948717948718,
"acc_stderr": 0.02508830145469483,
"acc_norm": 0.5717948717948718,
"acc_norm_stderr": 0.02508830145469483
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.3111111111111111,
"acc_stderr": 0.028226446749683512,
"acc_norm": 0.3111111111111111,
"acc_norm_stderr": 0.028226446749683512
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6176470588235294,
"acc_stderr": 0.031566630992154156,
"acc_norm": 0.6176470588235294,
"acc_norm_stderr": 0.031566630992154156
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.3509933774834437,
"acc_stderr": 0.038969819642573754,
"acc_norm": 0.3509933774834437,
"acc_norm_stderr": 0.038969819642573754
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8036697247706422,
"acc_stderr": 0.017030719339154343,
"acc_norm": 0.8036697247706422,
"acc_norm_stderr": 0.017030719339154343
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.4537037037037037,
"acc_stderr": 0.03395322726375797,
"acc_norm": 0.4537037037037037,
"acc_norm_stderr": 0.03395322726375797
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7941176470588235,
"acc_stderr": 0.028379449451588667,
"acc_norm": 0.7941176470588235,
"acc_norm_stderr": 0.028379449451588667
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.759493670886076,
"acc_stderr": 0.02782078198114969,
"acc_norm": 0.759493670886076,
"acc_norm_stderr": 0.02782078198114969
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6233183856502242,
"acc_stderr": 0.032521134899291884,
"acc_norm": 0.6233183856502242,
"acc_norm_stderr": 0.032521134899291884
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.732824427480916,
"acc_stderr": 0.038808483010823944,
"acc_norm": 0.732824427480916,
"acc_norm_stderr": 0.038808483010823944
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7768595041322314,
"acc_stderr": 0.03800754475228732,
"acc_norm": 0.7768595041322314,
"acc_norm_stderr": 0.03800754475228732
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7222222222222222,
"acc_stderr": 0.04330043749650742,
"acc_norm": 0.7222222222222222,
"acc_norm_stderr": 0.04330043749650742
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.6993865030674846,
"acc_stderr": 0.03602511318806771,
"acc_norm": 0.6993865030674846,
"acc_norm_stderr": 0.03602511318806771
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.45535714285714285,
"acc_stderr": 0.047268355537191,
"acc_norm": 0.45535714285714285,
"acc_norm_stderr": 0.047268355537191
},
"harness|hendrycksTest-management|5": {
"acc": 0.7378640776699029,
"acc_stderr": 0.04354631077260594,
"acc_norm": 0.7378640776699029,
"acc_norm_stderr": 0.04354631077260594
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8803418803418803,
"acc_stderr": 0.02126271940040698,
"acc_norm": 0.8803418803418803,
"acc_norm_stderr": 0.02126271940040698
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.74,
"acc_stderr": 0.0440844002276808,
"acc_norm": 0.74,
"acc_norm_stderr": 0.0440844002276808
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.776500638569604,
"acc_stderr": 0.01489723522945071,
"acc_norm": 0.776500638569604,
"acc_norm_stderr": 0.01489723522945071
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.6589595375722543,
"acc_stderr": 0.025522474632121612,
"acc_norm": 0.6589595375722543,
"acc_norm_stderr": 0.025522474632121612
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.3005586592178771,
"acc_stderr": 0.015334566806251154,
"acc_norm": 0.3005586592178771,
"acc_norm_stderr": 0.015334566806251154
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.6830065359477124,
"acc_stderr": 0.026643278474508755,
"acc_norm": 0.6830065359477124,
"acc_norm_stderr": 0.026643278474508755
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6655948553054662,
"acc_stderr": 0.026795422327893937,
"acc_norm": 0.6655948553054662,
"acc_norm_stderr": 0.026795422327893937
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.6697530864197531,
"acc_stderr": 0.026168298456732846,
"acc_norm": 0.6697530864197531,
"acc_norm_stderr": 0.026168298456732846
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.4787234042553192,
"acc_stderr": 0.029800481645628693,
"acc_norm": 0.4787234042553192,
"acc_norm_stderr": 0.029800481645628693
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.43415906127770537,
"acc_stderr": 0.01265903323706725,
"acc_norm": 0.43415906127770537,
"acc_norm_stderr": 0.01265903323706725
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6470588235294118,
"acc_stderr": 0.029029422815681404,
"acc_norm": 0.6470588235294118,
"acc_norm_stderr": 0.029029422815681404
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6127450980392157,
"acc_stderr": 0.01970687580408564,
"acc_norm": 0.6127450980392157,
"acc_norm_stderr": 0.01970687580408564
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6909090909090909,
"acc_stderr": 0.044262946482000985,
"acc_norm": 0.6909090909090909,
"acc_norm_stderr": 0.044262946482000985
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.6857142857142857,
"acc_stderr": 0.029719329422417468,
"acc_norm": 0.6857142857142857,
"acc_norm_stderr": 0.029719329422417468
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8159203980099502,
"acc_stderr": 0.027403859410786848,
"acc_norm": 0.8159203980099502,
"acc_norm_stderr": 0.027403859410786848
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.84,
"acc_stderr": 0.03684529491774708,
"acc_norm": 0.84,
"acc_norm_stderr": 0.03684529491774708
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5060240963855421,
"acc_stderr": 0.03892212195333045,
"acc_norm": 0.5060240963855421,
"acc_norm_stderr": 0.03892212195333045
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8245614035087719,
"acc_stderr": 0.029170885500727665,
"acc_norm": 0.8245614035087719,
"acc_norm_stderr": 0.029170885500727665
},
"harness|truthfulqa:mc|0": {
"mc1": 0.4614443084455324,
"mc1_stderr": 0.017451384104637455,
"mc2": 0.6362212933287348,
"mc2_stderr": 0.015296863707374602
},
"harness|winogrande|5": {
"acc": 0.7687450670876085,
"acc_stderr": 0.011850040124850508
},
"harness|gsm8k|5": {
"acc": 0.4397270659590599,
"acc_stderr": 0.013672052434471574
}
}
```
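The per-task entries above can also be aggregated programmatically; a minimal sketch, assuming the dictionary shown above has been saved locally as `results.json` (the filename is illustrative):
```python
import json

with open("results.json") as f:  # the per-task dictionary shown above
    results = json.load(f)

# average acc_norm over the "harness|hendrycksTest-*" (MMLU) subtasks
mmlu = [v["acc_norm"] for k, v in results.items() if k.startswith("harness|hendrycksTest")]
print(f"MMLU average acc_norm: {sum(mmlu) / len(mmlu):.4f}")
```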
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
amin-nejad/idrid-disease-grading | ---
license: cc-by-4.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
dataset_info:
features:
- name: image
dtype: image
- name: label
dtype:
class_label:
names:
'0': a_noDR
'1': b_mildDR
'2': c_moderateDR
'3': d_severeDR
'4': e_proDR
splits:
- name: train
num_bytes: 166058061
num_examples: 413
- name: test
num_bytes: 46195500
num_examples: 103
download_size: 203477506
dataset_size: 212253561
task_categories:
- image-classification
language:
- en
tags:
- medical
pretty_name: IDRiD Disease Grading
size_categories:
- n<1K
---
# Indian Diabetic Retinopathy Image Dataset (IDRiD)
This dataset is the disease grading portion of IDRiD.
The original source of the dataset is here: https://ieee-dataport.org/open-access/indian-diabetic-retinopathy-image-dataset-idrid
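The train and test splits defined above can be loaded with the `datasets` library; a minimal sketch, assuming a standard Hugging Face setup:
```python
from datasets import load_dataset

# loads the "train" (413 images) and "test" (103 images) splits
ds = load_dataset("amin-nejad/idrid-disease-grading")

example = ds["train"][0]
print(example["image"], example["label"])  # PIL image and class label 0-4 (a_noDR ... e_proDR)
```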
|
kobe1987/DLLM2TM | ---
license: cc-by-4.0
task_categories:
- token-classification
size_categories:
- 1K<n<10K
---
### Overview
This dataset is for the paper "DISTILLING LARGE LANGUAGE MODELS INTO TINY MODELS FOR NAMED ENTITY RECOGNITION" (https://arxiv.org/abs/2402.09282).
In the files directory, there are 7 files. A brief description follows:
### Introduction
#### Output_of_LLM.xlsx
We use GPT-4 to annotate named entities for the CONLL and BBC data. Specifically, we used standard prompting and CoT prompting strategies to do it. The original data, the ground truth (CONLL only), GPT's tagging results, and the reasoning process for CoT are listed in this file.
#### experiment_setting_evaluation_result.xlsx
There are 4 sheets in it. The first one is the experiment arrangement (190 lines in total), including the numbers of distilled and original data for the mixing strategies and the recorded performance. The rest contain the evaluation performance in phases 2 and 3.
#### Data_for_training_and_evaluating.xlsx
It contains the data used for training and evaluation in the paper, including the distilled CONLL data originating from CONLL2003, the combined CONLL and BBC distilled data, the original data from the CONLL training set, and the CONLL test set. The 4 sheets provide the data bases for training and testing in phases 2 and 3.
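Since these are multi-sheet Excel workbooks, they can be inspected with pandas; a minimal sketch (the filename is from the description above; exact sheet titles are an assumption):
```python
import pandas as pd

# load every sheet of the workbook into a dict of DataFrames
# (the four sheets correspond to the training/testing data bases for phases 2 and 3)
sheets = pd.read_excel("Data_for_training_and_evaluating.xlsx", sheet_name=None)
for name, df in sheets.items():
    print(name, df.shape)
```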
#### Some Jupyter Notebooks
Code for the paper in the form of Jupyter notebooks, covering the LLM annotation in phase 1, the training and evaluation of distilled and original data in phases 2 and 3, and the mixing strategies mentioned in the paper.
#### weight_decay_curves.pdf
The decay curves of w_0 (the sampling ratio of distilled data) for the different mixing strategies.
open-llm-leaderboard/details_LordNoah__spin_gpt2_medium_alpaca_e2 | ---
pretty_name: Evaluation run of LordNoah/spin_gpt2_medium_alpaca_e2
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [LordNoah/spin_gpt2_medium_alpaca_e2](https://huggingface.co/LordNoah/spin_gpt2_medium_alpaca_e2)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configuration, each one coresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_LordNoah__spin_gpt2_medium_alpaca_e2\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-02-18T11:33:21.183475](https://huggingface.co/datasets/open-llm-leaderboard/details_LordNoah__spin_gpt2_medium_alpaca_e2/blob/main/results_2024-02-18T11-33-21.183475.json)(note\
\ that their might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.27025146447288184,\n\
\ \"acc_stderr\": 0.031184084314290456,\n \"acc_norm\": 0.27228177218315647,\n\
\ \"acc_norm_stderr\": 0.03200244558578517,\n \"mc1\": 0.2423500611995104,\n\
\ \"mc1_stderr\": 0.015000674373570345,\n \"mc2\": 0.4151892085266857,\n\
\ \"mc2_stderr\": 0.01437033424307639\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.23122866894197952,\n \"acc_stderr\": 0.012320858834772278,\n\
\ \"acc_norm\": 0.28071672354948807,\n \"acc_norm_stderr\": 0.013131238126975583\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.3308105954989046,\n\
\ \"acc_stderr\": 0.004695434103958509,\n \"acc_norm\": 0.3988249352718582,\n\
\ \"acc_norm_stderr\": 0.004886559008754979\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.25,\n \"acc_stderr\": 0.04351941398892446,\n \
\ \"acc_norm\": 0.25,\n \"acc_norm_stderr\": 0.04351941398892446\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.28888888888888886,\n\
\ \"acc_stderr\": 0.0391545063041425,\n \"acc_norm\": 0.28888888888888886,\n\
\ \"acc_norm_stderr\": 0.0391545063041425\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.3223684210526316,\n \"acc_stderr\": 0.03803510248351585,\n\
\ \"acc_norm\": 0.3223684210526316,\n \"acc_norm_stderr\": 0.03803510248351585\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.25,\n\
\ \"acc_stderr\": 0.04351941398892446,\n \"acc_norm\": 0.25,\n \
\ \"acc_norm_stderr\": 0.04351941398892446\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.2943396226415094,\n \"acc_stderr\": 0.028049186315695248,\n\
\ \"acc_norm\": 0.2943396226415094,\n \"acc_norm_stderr\": 0.028049186315695248\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.2847222222222222,\n\
\ \"acc_stderr\": 0.03773809990686935,\n \"acc_norm\": 0.2847222222222222,\n\
\ \"acc_norm_stderr\": 0.03773809990686935\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.19,\n \"acc_stderr\": 0.03942772444036624,\n \
\ \"acc_norm\": 0.19,\n \"acc_norm_stderr\": 0.03942772444036624\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.32,\n \"acc_stderr\": 0.046882617226215034,\n \"acc_norm\": 0.32,\n\
\ \"acc_norm_stderr\": 0.046882617226215034\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.24,\n \"acc_stderr\": 0.04292346959909284,\n \
\ \"acc_norm\": 0.24,\n \"acc_norm_stderr\": 0.04292346959909284\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.24855491329479767,\n\
\ \"acc_stderr\": 0.03295304696818317,\n \"acc_norm\": 0.24855491329479767,\n\
\ \"acc_norm_stderr\": 0.03295304696818317\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.17647058823529413,\n \"acc_stderr\": 0.0379328118530781,\n\
\ \"acc_norm\": 0.17647058823529413,\n \"acc_norm_stderr\": 0.0379328118530781\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\": 0.31,\n\
\ \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.20425531914893616,\n \"acc_stderr\": 0.026355158413349424,\n\
\ \"acc_norm\": 0.20425531914893616,\n \"acc_norm_stderr\": 0.026355158413349424\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.2719298245614035,\n\
\ \"acc_stderr\": 0.04185774424022056,\n \"acc_norm\": 0.2719298245614035,\n\
\ \"acc_norm_stderr\": 0.04185774424022056\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.32413793103448274,\n \"acc_stderr\": 0.03900432069185554,\n\
\ \"acc_norm\": 0.32413793103448274,\n \"acc_norm_stderr\": 0.03900432069185554\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.24603174603174602,\n \"acc_stderr\": 0.022182037202948365,\n \"\
acc_norm\": 0.24603174603174602,\n \"acc_norm_stderr\": 0.022182037202948365\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.19047619047619047,\n\
\ \"acc_stderr\": 0.03512207412302054,\n \"acc_norm\": 0.19047619047619047,\n\
\ \"acc_norm_stderr\": 0.03512207412302054\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.36,\n \"acc_stderr\": 0.04824181513244218,\n \
\ \"acc_norm\": 0.36,\n \"acc_norm_stderr\": 0.04824181513244218\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.25483870967741934,\n\
\ \"acc_stderr\": 0.02479011845933221,\n \"acc_norm\": 0.25483870967741934,\n\
\ \"acc_norm_stderr\": 0.02479011845933221\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.2955665024630542,\n \"acc_stderr\": 0.032104944337514575,\n\
\ \"acc_norm\": 0.2955665024630542,\n \"acc_norm_stderr\": 0.032104944337514575\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.24,\n \"acc_stderr\": 0.042923469599092816,\n \"acc_norm\"\
: 0.24,\n \"acc_norm_stderr\": 0.042923469599092816\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.24848484848484848,\n \"acc_stderr\": 0.03374402644139405,\n\
\ \"acc_norm\": 0.24848484848484848,\n \"acc_norm_stderr\": 0.03374402644139405\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.3838383838383838,\n \"acc_stderr\": 0.03464881675016338,\n \"\
acc_norm\": 0.3838383838383838,\n \"acc_norm_stderr\": 0.03464881675016338\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.36787564766839376,\n \"acc_stderr\": 0.03480175668466036,\n\
\ \"acc_norm\": 0.36787564766839376,\n \"acc_norm_stderr\": 0.03480175668466036\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.32051282051282054,\n \"acc_stderr\": 0.023661296393964273,\n\
\ \"acc_norm\": 0.32051282051282054,\n \"acc_norm_stderr\": 0.023661296393964273\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.2777777777777778,\n \"acc_stderr\": 0.027309140588230186,\n \
\ \"acc_norm\": 0.2777777777777778,\n \"acc_norm_stderr\": 0.027309140588230186\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.20588235294117646,\n \"acc_stderr\": 0.026265024608275886,\n\
\ \"acc_norm\": 0.20588235294117646,\n \"acc_norm_stderr\": 0.026265024608275886\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.3509933774834437,\n \"acc_stderr\": 0.03896981964257375,\n \"\
acc_norm\": 0.3509933774834437,\n \"acc_norm_stderr\": 0.03896981964257375\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.3486238532110092,\n \"acc_stderr\": 0.020431254090714328,\n \"\
acc_norm\": 0.3486238532110092,\n \"acc_norm_stderr\": 0.020431254090714328\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.39814814814814814,\n \"acc_stderr\": 0.03338473403207401,\n \"\
acc_norm\": 0.39814814814814814,\n \"acc_norm_stderr\": 0.03338473403207401\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.22549019607843138,\n \"acc_stderr\": 0.02933116229425174,\n \"\
acc_norm\": 0.22549019607843138,\n \"acc_norm_stderr\": 0.02933116229425174\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.25316455696202533,\n \"acc_stderr\": 0.028304657943035303,\n \
\ \"acc_norm\": 0.25316455696202533,\n \"acc_norm_stderr\": 0.028304657943035303\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.1210762331838565,\n\
\ \"acc_stderr\": 0.021894174113185737,\n \"acc_norm\": 0.1210762331838565,\n\
\ \"acc_norm_stderr\": 0.021894174113185737\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.2824427480916031,\n \"acc_stderr\": 0.03948406125768361,\n\
\ \"acc_norm\": 0.2824427480916031,\n \"acc_norm_stderr\": 0.03948406125768361\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.1487603305785124,\n \"acc_stderr\": 0.03248470083807194,\n \"\
acc_norm\": 0.1487603305785124,\n \"acc_norm_stderr\": 0.03248470083807194\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.25,\n\
\ \"acc_stderr\": 0.04186091791394607,\n \"acc_norm\": 0.25,\n \
\ \"acc_norm_stderr\": 0.04186091791394607\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.3006134969325153,\n \"acc_stderr\": 0.03602511318806771,\n\
\ \"acc_norm\": 0.3006134969325153,\n \"acc_norm_stderr\": 0.03602511318806771\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.16964285714285715,\n\
\ \"acc_stderr\": 0.0356236785009539,\n \"acc_norm\": 0.16964285714285715,\n\
\ \"acc_norm_stderr\": 0.0356236785009539\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.3786407766990291,\n \"acc_stderr\": 0.04802694698258972,\n\
\ \"acc_norm\": 0.3786407766990291,\n \"acc_norm_stderr\": 0.04802694698258972\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.23504273504273504,\n\
\ \"acc_stderr\": 0.027778835904935437,\n \"acc_norm\": 0.23504273504273504,\n\
\ \"acc_norm_stderr\": 0.027778835904935437\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.22,\n \"acc_stderr\": 0.041633319989322695,\n \
\ \"acc_norm\": 0.22,\n \"acc_norm_stderr\": 0.041633319989322695\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.21839080459770116,\n\
\ \"acc_stderr\": 0.0147743583199345,\n \"acc_norm\": 0.21839080459770116,\n\
\ \"acc_norm_stderr\": 0.0147743583199345\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.28034682080924855,\n \"acc_stderr\": 0.02418242749657761,\n\
\ \"acc_norm\": 0.28034682080924855,\n \"acc_norm_stderr\": 0.02418242749657761\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.2424581005586592,\n\
\ \"acc_stderr\": 0.014333522059217889,\n \"acc_norm\": 0.2424581005586592,\n\
\ \"acc_norm_stderr\": 0.014333522059217889\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.27450980392156865,\n \"acc_stderr\": 0.025553169991826524,\n\
\ \"acc_norm\": 0.27450980392156865,\n \"acc_norm_stderr\": 0.025553169991826524\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.2990353697749196,\n\
\ \"acc_stderr\": 0.026003301117885135,\n \"acc_norm\": 0.2990353697749196,\n\
\ \"acc_norm_stderr\": 0.026003301117885135\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.22530864197530864,\n \"acc_stderr\": 0.023246202647819746,\n\
\ \"acc_norm\": 0.22530864197530864,\n \"acc_norm_stderr\": 0.023246202647819746\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.2553191489361702,\n \"acc_stderr\": 0.026011992930902,\n \
\ \"acc_norm\": 0.2553191489361702,\n \"acc_norm_stderr\": 0.026011992930902\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.26140808344198174,\n\
\ \"acc_stderr\": 0.01122252816977131,\n \"acc_norm\": 0.26140808344198174,\n\
\ \"acc_norm_stderr\": 0.01122252816977131\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.41544117647058826,\n \"acc_stderr\": 0.029935342707877743,\n\
\ \"acc_norm\": 0.41544117647058826,\n \"acc_norm_stderr\": 0.029935342707877743\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.25163398692810457,\n \"acc_stderr\": 0.01755581809132228,\n \
\ \"acc_norm\": 0.25163398692810457,\n \"acc_norm_stderr\": 0.01755581809132228\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.2,\n\
\ \"acc_stderr\": 0.03831305140884603,\n \"acc_norm\": 0.2,\n \
\ \"acc_norm_stderr\": 0.03831305140884603\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.363265306122449,\n \"acc_stderr\": 0.030789051139030806,\n\
\ \"acc_norm\": 0.363265306122449,\n \"acc_norm_stderr\": 0.030789051139030806\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.24875621890547264,\n\
\ \"acc_stderr\": 0.030567675938916707,\n \"acc_norm\": 0.24875621890547264,\n\
\ \"acc_norm_stderr\": 0.030567675938916707\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.29,\n \"acc_stderr\": 0.045604802157206845,\n \
\ \"acc_norm\": 0.29,\n \"acc_norm_stderr\": 0.045604802157206845\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.2289156626506024,\n\
\ \"acc_stderr\": 0.03270745277352477,\n \"acc_norm\": 0.2289156626506024,\n\
\ \"acc_norm_stderr\": 0.03270745277352477\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.2807017543859649,\n \"acc_stderr\": 0.034462962170884265,\n\
\ \"acc_norm\": 0.2807017543859649,\n \"acc_norm_stderr\": 0.034462962170884265\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.2423500611995104,\n\
\ \"mc1_stderr\": 0.015000674373570345,\n \"mc2\": 0.4151892085266857,\n\
\ \"mc2_stderr\": 0.01437033424307639\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.5367008681925809,\n \"acc_stderr\": 0.014014578458843258\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.001516300227445034,\n \
\ \"acc_stderr\": 0.001071779348549268\n }\n}\n```"
repo_url: https://huggingface.co/LordNoah/spin_gpt2_medium_alpaca_e2
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_02_18T11_33_21.183475
path:
- '**/details_harness|arc:challenge|25_2024-02-18T11-33-21.183475.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-02-18T11-33-21.183475.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_02_18T11_33_21.183475
path:
- '**/details_harness|gsm8k|5_2024-02-18T11-33-21.183475.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-02-18T11-33-21.183475.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_02_18T11_33_21.183475
path:
- '**/details_harness|hellaswag|10_2024-02-18T11-33-21.183475.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-02-18T11-33-21.183475.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_02_18T11_33_21.183475
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-18T11-33-21.183475.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-18T11-33-21.183475.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-18T11-33-21.183475.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-18T11-33-21.183475.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-18T11-33-21.183475.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-18T11-33-21.183475.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-18T11-33-21.183475.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-18T11-33-21.183475.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-18T11-33-21.183475.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-18T11-33-21.183475.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-18T11-33-21.183475.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-18T11-33-21.183475.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-18T11-33-21.183475.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-18T11-33-21.183475.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-18T11-33-21.183475.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-18T11-33-21.183475.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-18T11-33-21.183475.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-18T11-33-21.183475.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-18T11-33-21.183475.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-18T11-33-21.183475.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-18T11-33-21.183475.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-18T11-33-21.183475.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-18T11-33-21.183475.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-18T11-33-21.183475.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-18T11-33-21.183475.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-18T11-33-21.183475.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-18T11-33-21.183475.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-18T11-33-21.183475.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-18T11-33-21.183475.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-18T11-33-21.183475.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-18T11-33-21.183475.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-18T11-33-21.183475.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-18T11-33-21.183475.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-18T11-33-21.183475.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-02-18T11-33-21.183475.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-18T11-33-21.183475.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-18T11-33-21.183475.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-18T11-33-21.183475.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-02-18T11-33-21.183475.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-02-18T11-33-21.183475.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-18T11-33-21.183475.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-18T11-33-21.183475.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-18T11-33-21.183475.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-18T11-33-21.183475.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-18T11-33-21.183475.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-18T11-33-21.183475.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-18T11-33-21.183475.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-18T11-33-21.183475.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-18T11-33-21.183475.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-18T11-33-21.183475.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-18T11-33-21.183475.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-18T11-33-21.183475.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-18T11-33-21.183475.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-02-18T11-33-21.183475.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-18T11-33-21.183475.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-02-18T11-33-21.183475.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-18T11-33-21.183475.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-18T11-33-21.183475.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-18T11-33-21.183475.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-18T11-33-21.183475.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-18T11-33-21.183475.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-18T11-33-21.183475.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-18T11-33-21.183475.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-18T11-33-21.183475.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-18T11-33-21.183475.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-18T11-33-21.183475.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-18T11-33-21.183475.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-18T11-33-21.183475.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-18T11-33-21.183475.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-18T11-33-21.183475.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-18T11-33-21.183475.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-18T11-33-21.183475.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-18T11-33-21.183475.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-18T11-33-21.183475.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-18T11-33-21.183475.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-18T11-33-21.183475.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-18T11-33-21.183475.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-18T11-33-21.183475.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-18T11-33-21.183475.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-18T11-33-21.183475.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-18T11-33-21.183475.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-18T11-33-21.183475.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-18T11-33-21.183475.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-18T11-33-21.183475.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-18T11-33-21.183475.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-18T11-33-21.183475.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-18T11-33-21.183475.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-18T11-33-21.183475.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-18T11-33-21.183475.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-18T11-33-21.183475.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-18T11-33-21.183475.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-02-18T11-33-21.183475.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-18T11-33-21.183475.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-18T11-33-21.183475.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-18T11-33-21.183475.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-02-18T11-33-21.183475.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-02-18T11-33-21.183475.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-18T11-33-21.183475.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-18T11-33-21.183475.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-18T11-33-21.183475.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-18T11-33-21.183475.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-18T11-33-21.183475.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-18T11-33-21.183475.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-18T11-33-21.183475.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-18T11-33-21.183475.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-18T11-33-21.183475.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-18T11-33-21.183475.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-18T11-33-21.183475.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-18T11-33-21.183475.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-18T11-33-21.183475.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-02-18T11-33-21.183475.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-18T11-33-21.183475.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-02-18T11-33-21.183475.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-18T11-33-21.183475.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_02_18T11_33_21.183475
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-18T11-33-21.183475.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-18T11-33-21.183475.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_02_18T11_33_21.183475
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-18T11-33-21.183475.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-18T11-33-21.183475.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_02_18T11_33_21.183475
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-18T11-33-21.183475.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-18T11-33-21.183475.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_02_18T11_33_21.183475
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-18T11-33-21.183475.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-18T11-33-21.183475.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_02_18T11_33_21.183475
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-18T11-33-21.183475.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-18T11-33-21.183475.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_02_18T11_33_21.183475
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-18T11-33-21.183475.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-18T11-33-21.183475.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_02_18T11_33_21.183475
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-18T11-33-21.183475.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-18T11-33-21.183475.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_02_18T11_33_21.183475
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-18T11-33-21.183475.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-18T11-33-21.183475.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_02_18T11_33_21.183475
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-18T11-33-21.183475.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-18T11-33-21.183475.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_02_18T11_33_21.183475
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-18T11-33-21.183475.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-18T11-33-21.183475.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_02_18T11_33_21.183475
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-18T11-33-21.183475.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-18T11-33-21.183475.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_02_18T11_33_21.183475
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-18T11-33-21.183475.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-18T11-33-21.183475.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_02_18T11_33_21.183475
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-18T11-33-21.183475.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-18T11-33-21.183475.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_02_18T11_33_21.183475
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-18T11-33-21.183475.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-18T11-33-21.183475.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_02_18T11_33_21.183475
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-18T11-33-21.183475.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-18T11-33-21.183475.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_02_18T11_33_21.183475
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-18T11-33-21.183475.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-18T11-33-21.183475.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_02_18T11_33_21.183475
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-18T11-33-21.183475.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-18T11-33-21.183475.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_02_18T11_33_21.183475
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-18T11-33-21.183475.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-18T11-33-21.183475.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_02_18T11_33_21.183475
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-18T11-33-21.183475.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-18T11-33-21.183475.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_02_18T11_33_21.183475
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-18T11-33-21.183475.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-18T11-33-21.183475.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_02_18T11_33_21.183475
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-18T11-33-21.183475.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-18T11-33-21.183475.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_02_18T11_33_21.183475
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-18T11-33-21.183475.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-18T11-33-21.183475.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_02_18T11_33_21.183475
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-18T11-33-21.183475.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-18T11-33-21.183475.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_02_18T11_33_21.183475
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-18T11-33-21.183475.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-18T11-33-21.183475.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_02_18T11_33_21.183475
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-18T11-33-21.183475.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-18T11-33-21.183475.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_02_18T11_33_21.183475
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-18T11-33-21.183475.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-18T11-33-21.183475.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_02_18T11_33_21.183475
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-18T11-33-21.183475.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-18T11-33-21.183475.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_02_18T11_33_21.183475
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-18T11-33-21.183475.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-18T11-33-21.183475.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_02_18T11_33_21.183475
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-18T11-33-21.183475.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-18T11-33-21.183475.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_02_18T11_33_21.183475
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-18T11-33-21.183475.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-18T11-33-21.183475.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_02_18T11_33_21.183475
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-18T11-33-21.183475.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-18T11-33-21.183475.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_02_18T11_33_21.183475
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-18T11-33-21.183475.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-18T11-33-21.183475.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_02_18T11_33_21.183475
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-18T11-33-21.183475.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-18T11-33-21.183475.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_02_18T11_33_21.183475
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-18T11-33-21.183475.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-18T11-33-21.183475.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_02_18T11_33_21.183475
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-02-18T11-33-21.183475.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-02-18T11-33-21.183475.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_02_18T11_33_21.183475
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-18T11-33-21.183475.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-18T11-33-21.183475.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_02_18T11_33_21.183475
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-18T11-33-21.183475.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-18T11-33-21.183475.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_02_18T11_33_21.183475
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-18T11-33-21.183475.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-18T11-33-21.183475.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_02_18T11_33_21.183475
path:
- '**/details_harness|hendrycksTest-management|5_2024-02-18T11-33-21.183475.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-02-18T11-33-21.183475.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_02_18T11_33_21.183475
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-02-18T11-33-21.183475.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-02-18T11-33-21.183475.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_02_18T11_33_21.183475
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-18T11-33-21.183475.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-18T11-33-21.183475.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_02_18T11_33_21.183475
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-18T11-33-21.183475.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-18T11-33-21.183475.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_02_18T11_33_21.183475
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-18T11-33-21.183475.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-18T11-33-21.183475.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_02_18T11_33_21.183475
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-18T11-33-21.183475.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-18T11-33-21.183475.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_02_18T11_33_21.183475
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-18T11-33-21.183475.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-18T11-33-21.183475.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_02_18T11_33_21.183475
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-18T11-33-21.183475.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-18T11-33-21.183475.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_02_18T11_33_21.183475
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-18T11-33-21.183475.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-18T11-33-21.183475.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_02_18T11_33_21.183475
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-18T11-33-21.183475.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-18T11-33-21.183475.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_02_18T11_33_21.183475
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-18T11-33-21.183475.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-18T11-33-21.183475.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_02_18T11_33_21.183475
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-18T11-33-21.183475.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-18T11-33-21.183475.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_02_18T11_33_21.183475
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-18T11-33-21.183475.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-18T11-33-21.183475.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_02_18T11_33_21.183475
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-18T11-33-21.183475.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-18T11-33-21.183475.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_02_18T11_33_21.183475
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-18T11-33-21.183475.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-18T11-33-21.183475.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_02_18T11_33_21.183475
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-02-18T11-33-21.183475.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-02-18T11-33-21.183475.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_02_18T11_33_21.183475
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-18T11-33-21.183475.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-18T11-33-21.183475.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_02_18T11_33_21.183475
path:
- '**/details_harness|hendrycksTest-virology|5_2024-02-18T11-33-21.183475.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-02-18T11-33-21.183475.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_02_18T11_33_21.183475
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-18T11-33-21.183475.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-18T11-33-21.183475.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_02_18T11_33_21.183475
path:
- '**/details_harness|truthfulqa:mc|0_2024-02-18T11-33-21.183475.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-02-18T11-33-21.183475.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_02_18T11_33_21.183475
path:
- '**/details_harness|winogrande|5_2024-02-18T11-33-21.183475.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-02-18T11-33-21.183475.parquet'
- config_name: results
data_files:
- split: 2024_02_18T11_33_21.183475
path:
- results_2024-02-18T11-33-21.183475.parquet
- split: latest
path:
- results_2024-02-18T11-33-21.183475.parquet
---
# Dataset Card for Evaluation run of LordNoah/spin_gpt2_medium_alpaca_e2
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [LordNoah/spin_gpt2_medium_alpaca_e2](https://huggingface.co/LordNoah/spin_gpt2_medium_alpaca_e2) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_LordNoah__spin_gpt2_medium_alpaca_e2",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2024-02-18T11:33:21.183475](https://huggingface.co/datasets/open-llm-leaderboard/details_LordNoah__spin_gpt2_medium_alpaca_e2/blob/main/results_2024-02-18T11-33-21.183475.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.27025146447288184,
"acc_stderr": 0.031184084314290456,
"acc_norm": 0.27228177218315647,
"acc_norm_stderr": 0.03200244558578517,
"mc1": 0.2423500611995104,
"mc1_stderr": 0.015000674373570345,
"mc2": 0.4151892085266857,
"mc2_stderr": 0.01437033424307639
},
"harness|arc:challenge|25": {
"acc": 0.23122866894197952,
"acc_stderr": 0.012320858834772278,
"acc_norm": 0.28071672354948807,
"acc_norm_stderr": 0.013131238126975583
},
"harness|hellaswag|10": {
"acc": 0.3308105954989046,
"acc_stderr": 0.004695434103958509,
"acc_norm": 0.3988249352718582,
"acc_norm_stderr": 0.004886559008754979
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.25,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.25,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.28888888888888886,
"acc_stderr": 0.0391545063041425,
"acc_norm": 0.28888888888888886,
"acc_norm_stderr": 0.0391545063041425
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.3223684210526316,
"acc_stderr": 0.03803510248351585,
"acc_norm": 0.3223684210526316,
"acc_norm_stderr": 0.03803510248351585
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.25,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.25,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.2943396226415094,
"acc_stderr": 0.028049186315695248,
"acc_norm": 0.2943396226415094,
"acc_norm_stderr": 0.028049186315695248
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.2847222222222222,
"acc_stderr": 0.03773809990686935,
"acc_norm": 0.2847222222222222,
"acc_norm_stderr": 0.03773809990686935
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.19,
"acc_stderr": 0.03942772444036624,
"acc_norm": 0.19,
"acc_norm_stderr": 0.03942772444036624
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.32,
"acc_stderr": 0.046882617226215034,
"acc_norm": 0.32,
"acc_norm_stderr": 0.046882617226215034
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.24,
"acc_stderr": 0.04292346959909284,
"acc_norm": 0.24,
"acc_norm_stderr": 0.04292346959909284
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.24855491329479767,
"acc_stderr": 0.03295304696818317,
"acc_norm": 0.24855491329479767,
"acc_norm_stderr": 0.03295304696818317
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.17647058823529413,
"acc_stderr": 0.0379328118530781,
"acc_norm": 0.17647058823529413,
"acc_norm_stderr": 0.0379328118530781
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.20425531914893616,
"acc_stderr": 0.026355158413349424,
"acc_norm": 0.20425531914893616,
"acc_norm_stderr": 0.026355158413349424
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.2719298245614035,
"acc_stderr": 0.04185774424022056,
"acc_norm": 0.2719298245614035,
"acc_norm_stderr": 0.04185774424022056
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.32413793103448274,
"acc_stderr": 0.03900432069185554,
"acc_norm": 0.32413793103448274,
"acc_norm_stderr": 0.03900432069185554
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.24603174603174602,
"acc_stderr": 0.022182037202948365,
"acc_norm": 0.24603174603174602,
"acc_norm_stderr": 0.022182037202948365
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.19047619047619047,
"acc_stderr": 0.03512207412302054,
"acc_norm": 0.19047619047619047,
"acc_norm_stderr": 0.03512207412302054
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.36,
"acc_stderr": 0.04824181513244218,
"acc_norm": 0.36,
"acc_norm_stderr": 0.04824181513244218
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.25483870967741934,
"acc_stderr": 0.02479011845933221,
"acc_norm": 0.25483870967741934,
"acc_norm_stderr": 0.02479011845933221
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.2955665024630542,
"acc_stderr": 0.032104944337514575,
"acc_norm": 0.2955665024630542,
"acc_norm_stderr": 0.032104944337514575
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.24,
"acc_stderr": 0.042923469599092816,
"acc_norm": 0.24,
"acc_norm_stderr": 0.042923469599092816
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.24848484848484848,
"acc_stderr": 0.03374402644139405,
"acc_norm": 0.24848484848484848,
"acc_norm_stderr": 0.03374402644139405
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.3838383838383838,
"acc_stderr": 0.03464881675016338,
"acc_norm": 0.3838383838383838,
"acc_norm_stderr": 0.03464881675016338
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.36787564766839376,
"acc_stderr": 0.03480175668466036,
"acc_norm": 0.36787564766839376,
"acc_norm_stderr": 0.03480175668466036
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.32051282051282054,
"acc_stderr": 0.023661296393964273,
"acc_norm": 0.32051282051282054,
"acc_norm_stderr": 0.023661296393964273
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.2777777777777778,
"acc_stderr": 0.027309140588230186,
"acc_norm": 0.2777777777777778,
"acc_norm_stderr": 0.027309140588230186
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.20588235294117646,
"acc_stderr": 0.026265024608275886,
"acc_norm": 0.20588235294117646,
"acc_norm_stderr": 0.026265024608275886
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.3509933774834437,
"acc_stderr": 0.03896981964257375,
"acc_norm": 0.3509933774834437,
"acc_norm_stderr": 0.03896981964257375
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.3486238532110092,
"acc_stderr": 0.020431254090714328,
"acc_norm": 0.3486238532110092,
"acc_norm_stderr": 0.020431254090714328
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.39814814814814814,
"acc_stderr": 0.03338473403207401,
"acc_norm": 0.39814814814814814,
"acc_norm_stderr": 0.03338473403207401
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.22549019607843138,
"acc_stderr": 0.02933116229425174,
"acc_norm": 0.22549019607843138,
"acc_norm_stderr": 0.02933116229425174
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.25316455696202533,
"acc_stderr": 0.028304657943035303,
"acc_norm": 0.25316455696202533,
"acc_norm_stderr": 0.028304657943035303
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.1210762331838565,
"acc_stderr": 0.021894174113185737,
"acc_norm": 0.1210762331838565,
"acc_norm_stderr": 0.021894174113185737
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.2824427480916031,
"acc_stderr": 0.03948406125768361,
"acc_norm": 0.2824427480916031,
"acc_norm_stderr": 0.03948406125768361
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.1487603305785124,
"acc_stderr": 0.03248470083807194,
"acc_norm": 0.1487603305785124,
"acc_norm_stderr": 0.03248470083807194
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.25,
"acc_stderr": 0.04186091791394607,
"acc_norm": 0.25,
"acc_norm_stderr": 0.04186091791394607
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.3006134969325153,
"acc_stderr": 0.03602511318806771,
"acc_norm": 0.3006134969325153,
"acc_norm_stderr": 0.03602511318806771
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.16964285714285715,
"acc_stderr": 0.0356236785009539,
"acc_norm": 0.16964285714285715,
"acc_norm_stderr": 0.0356236785009539
},
"harness|hendrycksTest-management|5": {
"acc": 0.3786407766990291,
"acc_stderr": 0.04802694698258972,
"acc_norm": 0.3786407766990291,
"acc_norm_stderr": 0.04802694698258972
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.23504273504273504,
"acc_stderr": 0.027778835904935437,
"acc_norm": 0.23504273504273504,
"acc_norm_stderr": 0.027778835904935437
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.22,
"acc_stderr": 0.041633319989322695,
"acc_norm": 0.22,
"acc_norm_stderr": 0.041633319989322695
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.21839080459770116,
"acc_stderr": 0.0147743583199345,
"acc_norm": 0.21839080459770116,
"acc_norm_stderr": 0.0147743583199345
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.28034682080924855,
"acc_stderr": 0.02418242749657761,
"acc_norm": 0.28034682080924855,
"acc_norm_stderr": 0.02418242749657761
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.2424581005586592,
"acc_stderr": 0.014333522059217889,
"acc_norm": 0.2424581005586592,
"acc_norm_stderr": 0.014333522059217889
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.27450980392156865,
"acc_stderr": 0.025553169991826524,
"acc_norm": 0.27450980392156865,
"acc_norm_stderr": 0.025553169991826524
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.2990353697749196,
"acc_stderr": 0.026003301117885135,
"acc_norm": 0.2990353697749196,
"acc_norm_stderr": 0.026003301117885135
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.22530864197530864,
"acc_stderr": 0.023246202647819746,
"acc_norm": 0.22530864197530864,
"acc_norm_stderr": 0.023246202647819746
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.2553191489361702,
"acc_stderr": 0.026011992930902,
"acc_norm": 0.2553191489361702,
"acc_norm_stderr": 0.026011992930902
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.26140808344198174,
"acc_stderr": 0.01122252816977131,
"acc_norm": 0.26140808344198174,
"acc_norm_stderr": 0.01122252816977131
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.41544117647058826,
"acc_stderr": 0.029935342707877743,
"acc_norm": 0.41544117647058826,
"acc_norm_stderr": 0.029935342707877743
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.25163398692810457,
"acc_stderr": 0.01755581809132228,
"acc_norm": 0.25163398692810457,
"acc_norm_stderr": 0.01755581809132228
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.2,
"acc_stderr": 0.03831305140884603,
"acc_norm": 0.2,
"acc_norm_stderr": 0.03831305140884603
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.363265306122449,
"acc_stderr": 0.030789051139030806,
"acc_norm": 0.363265306122449,
"acc_norm_stderr": 0.030789051139030806
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.24875621890547264,
"acc_stderr": 0.030567675938916707,
"acc_norm": 0.24875621890547264,
"acc_norm_stderr": 0.030567675938916707
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.29,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.29,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-virology|5": {
"acc": 0.2289156626506024,
"acc_stderr": 0.03270745277352477,
"acc_norm": 0.2289156626506024,
"acc_norm_stderr": 0.03270745277352477
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.2807017543859649,
"acc_stderr": 0.034462962170884265,
"acc_norm": 0.2807017543859649,
"acc_norm_stderr": 0.034462962170884265
},
"harness|truthfulqa:mc|0": {
"mc1": 0.2423500611995104,
"mc1_stderr": 0.015000674373570345,
"mc2": 0.4151892085266857,
"mc2_stderr": 0.01437033424307639
},
"harness|winogrande|5": {
"acc": 0.5367008681925809,
"acc_stderr": 0.014014578458843258
},
"harness|gsm8k|5": {
"acc": 0.001516300227445034,
"acc_stderr": 0.001071779348549268
}
}
```
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
maveriq/medi | ---
dataset_info:
features:
- name: query
sequence: string
- name: pos
sequence: string
- name: neg
sequence: string
- name: task_name
dtype: string
splits:
- name: train
num_bytes: 2572523114
num_examples: 1435000
download_size: 1232020798
dataset_size: 2572523114
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
task_categories:
- feature-extraction
language:
- en
pretty_name: Multitask Embeddings Data with Instructions (MEDI)
size_categories:
- 1M<n<10M
---
# Disclaimer
I am not the author of the dataset or the paper. I have just uploaded it for ease of availability. For all information, please refer to the [website](https://instructor-embedding.github.io/).
# Dataset Card for "medi"
The MEDI data consists of a collection of 330 datasets from Super-NI (Super-NaturalInstructions), sentence-transformer embedding training data, and KILT, spanning a wide range of domains and tasks.
If you use the dataset, please cite the following papers, including Su et al. (2022), Wang et al. (2022), and Petroni et al. (2021), as well as the sentence-transformers embedding training data at https://huggingface.co/datasets/sentence-transformers/embedding-training-data.
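For convenience, the data can be loaded with 🤗 Datasets (a minimal sketch; the field names follow the schema above):
```python
from datasets import load_dataset

# stream one example from the ~1.4M-example train split
medi = load_dataset("maveriq/medi", split="train", streaming=True)
example = next(iter(medi))

print(example["task_name"])  # source task of this training triple
print(example["query"])      # sequence of strings (see the feature schema above)
print(example["pos"])        # positive example(s)
print(example["neg"])        # negative example(s)
```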
# Citation Information
```
@inproceedings{INSTRUCTOR,
title={One Embedder, Any Task: Instruction-Finetuned Text Embeddings},
author={Hongjin Su, Weijia Shi, Jungo Kasai, Yizhong Wang, Yushi Hu, Mari Ostendorf, Wen-tau Yih, Noah A. Smith, Luke Zettlemoyer, Tao Yu},
url={https://arxiv.org/abs/2212.09741},
year={2022},
}
@inproceedings{wang2022super,
title={Super-naturalinstructions: generalization via declarative instructions on 1600+ tasks},
author={Wang, Yizhong and Mishra, Swaroop and Alipoormolabashi, Pegah and Kordi, Yeganeh and Mirzaei, Amirreza and Arunkumar, Anjana and Ashok, Arjun and Dhanasekaran, Arut Selvan and Naik, Atharva and Stap, David and others},
year={2022},
organization={EMNLP}
}
@article{petroni2020kilt,
title={KILT: a benchmark for knowledge intensive language tasks},
author={Petroni, Fabio and Piktus, Aleksandra and Fan, Angela and Lewis, Patrick and Yazdani, Majid and De Cao, Nicola and Thorne, James and Jernite, Yacine and Karpukhin, Vladimir and Maillard, Jean and others},
journal={arXiv preprint arXiv:2009.02252},
year={2020}
}
``` |
bigscience-data/roots_indic-mr_wikipedia | ---
language: mr
license: cc-by-sa-3.0
extra_gated_prompt: 'By accessing this dataset, you agree to abide by the BigScience
Ethical Charter. The charter can be found at:
https://hf.co/spaces/bigscience/ethical-charter'
extra_gated_fields:
I have read and agree to abide by the BigScience Ethical Charter: checkbox
---
ROOTS Subset: roots_indic-mr_wikipedia
# wikipedia
- Dataset uid: `wikipedia`
### Description
### Homepage
### Licensing
### Speaker Locations
### Sizes
- 3.2299 % of total
- 4.2071 % of en
- 5.6773 % of ar
- 3.3416 % of fr
- 5.2815 % of es
- 12.4852 % of ca
- 0.4288 % of zh
- 0.4286 % of zh
- 5.4743 % of indic-bn
- 8.9062 % of indic-ta
- 21.3313 % of indic-te
- 4.4845 % of pt
- 4.0493 % of indic-hi
- 11.3163 % of indic-ml
- 22.5300 % of indic-ur
- 4.4902 % of vi
- 16.9916 % of indic-kn
- 24.7820 % of eu
- 11.6241 % of indic-mr
- 9.8749 % of id
- 9.3489 % of indic-pa
- 9.4767 % of indic-gu
- 24.1132 % of indic-as
- 5.3309 % of indic-or
### BigScience processing steps
#### Filters applied to: en
- dedup_document
- filter_remove_empty_docs
- filter_small_docs_bytes_1024
#### Filters applied to: ar
- filter_wiki_user_titles
- dedup_document
- filter_remove_empty_docs
- filter_small_docs_bytes_300
#### Filters applied to: fr
- dedup_document
- filter_remove_empty_docs
- filter_small_docs_bytes_1024
#### Filters applied to: es
- dedup_document
- filter_remove_empty_docs
- filter_small_docs_bytes_1024
#### Filters applied to: ca
- filter_wiki_user_titles
- dedup_document
- filter_remove_empty_docs
- filter_small_docs_bytes_1024
#### Filters applied to: zh
#### Filters applied to: zh
#### Filters applied to: indic-bn
- filter_wiki_user_titles
- dedup_document
- filter_remove_empty_docs
- filter_small_docs_bytes_300
#### Filters applied to: indic-ta
- filter_wiki_user_titles
- dedup_document
- filter_remove_empty_docs
- filter_small_docs_bytes_300
#### Filters applied to: indic-te
- filter_wiki_user_titles
- dedup_document
- filter_remove_empty_docs
- filter_small_docs_bytes_300
#### Filters applied to: pt
- dedup_document
- filter_remove_empty_docs
- filter_small_docs_bytes_300
#### Filters applied to: indic-hi
- filter_wiki_user_titles
- dedup_document
- filter_remove_empty_docs
- filter_small_docs_bytes_300
#### Filters applied to: indic-ml
- filter_wiki_user_titles
- dedup_document
- filter_remove_empty_docs
- filter_small_docs_bytes_300
#### Filters applied to: indic-ur
- filter_wiki_user_titles
- dedup_document
- filter_remove_empty_docs
- filter_small_docs_bytes_300
#### Filters applied to: vi
- dedup_document
- filter_remove_empty_docs
- filter_small_docs_bytes_300
#### Filters applied to: indic-kn
- filter_wiki_user_titles
- dedup_document
- filter_remove_empty_docs
- filter_small_docs_bytes_300
#### Filters applied to: eu
- filter_wiki_user_titles
- dedup_document
- filter_remove_empty_docs
#### Filters applied to: indic-mr
- filter_wiki_user_titles
- dedup_document
- filter_remove_empty_docs
- filter_small_docs_bytes_300
#### Filters applied to: id
- dedup_document
- filter_remove_empty_docs
- filter_small_docs_bytes_300
#### Filters applied to: indic-pa
- filter_wiki_user_titles
- dedup_document
- filter_remove_empty_docs
- filter_small_docs_bytes_300
#### Filters applied to: indic-gu
- filter_wiki_user_titles
- dedup_document
- filter_remove_empty_docs
- filter_small_docs_bytes_300
#### Filters applied to: indic-as
- filter_wiki_user_titles
- dedup_document
- filter_remove_empty_docs
#### Filters applied to: indic-or
- filter_wiki_user_titles
- dedup_document
- filter_remove_empty_docs
|
benayas/atis_artificial_20pct_v2 | ---
dataset_info:
features:
- name: text
dtype: string
- name: category
dtype: string
- name: __index_level_0__
dtype: int64
splits:
- name: train
num_bytes: 428438
num_examples: 4455
download_size: 139831
dataset_size: 428438
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
LukeSajkowski/products_ecommerce_embeddings | ---
dataset_info:
features:
- name: id
dtype: string
- name: title
dtype: string
- name: short_description
dtype: string
- name: img_high
dtype: string
- name: supplier
dtype: string
- name: text
dtype: string
- name: embeddings
sequence: float32
splits:
- name: train
num_bytes: 65690182
num_examples: 19406
download_size: 75241335
dataset_size: 65690182
---
# Dataset Card for "products_ecommerce_embeddings"
# The dataset is based on https://github.com/querqy/chorus/tree/main/data-encoder
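As a quick illustration of working with the precomputed embeddings (a minimal sketch; the field names follow the schema above, and cosine similarity is our choice of metric, not something prescribed by the dataset):
```python
from datasets import load_dataset
import numpy as np

ds = load_dataset("LukeSajkowski/products_ecommerce_embeddings", split="train")

# compare two products by cosine similarity of their precomputed embeddings
a = np.asarray(ds[0]["embeddings"], dtype=np.float32)
b = np.asarray(ds[1]["embeddings"], dtype=np.float32)
cosine = float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))
print(ds[0]["title"], "<->", ds[1]["title"], cosine)
```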
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
distil-whisper/librispeech_asr-timestamped | ---
license: cc-by-4.0
task_categories:
- automatic-speech-recognition
language:
- en
pretty_name: LibriSpeech ASR
---
# Distil Whisper: LibriSpeech ASR With Timestamps
This is a variant of the [LibriSpeech ASR](https://huggingface.co/datasets/librispeech_asr) dataset, augmented to return the pseudo-labelled Whisper
transcriptions alongside the original dataset elements. The pseudo-labelled transcriptions were generated by
labelling the input audio data with the Whisper [large-v2](https://huggingface.co/openai/whisper-large-v2)
model with *greedy* sampling and timestamp prediction. For information on how the original dataset was curated, refer to the original
[dataset card](https://huggingface.co/datasets/librispeech_asr).
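For illustration, a pseudo-labelling pass of this kind can be sketched with the 🤗 Transformers pipeline (a minimal sketch only, not the exact script used to build this dataset; the audio file name is a placeholder):
```python
from transformers import pipeline

# greedy decoding (do_sample=False) with timestamp prediction, as described above
asr = pipeline("automatic-speech-recognition", model="openai/whisper-large-v2")
out = asr("audio_sample.flac", return_timestamps=True, generate_kwargs={"do_sample": False})
print(out["text"])    # transcription
print(out["chunks"])  # list of {"timestamp": (start, end), "text": ...}
```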
## Standalone Usage
First, install the latest version of the 🤗 Datasets package:
```bash
pip install --upgrade pip
pip install --upgrade datasets[audio]
```
The dataset can be downloaded and pre-processed on disk using the [`load_dataset`](https://huggingface.co/docs/datasets/v2.14.5/en/package_reference/loading_methods#datasets.load_dataset)
function:
```python
from datasets import load_dataset
dataset = load_dataset("distil-whisper/librispeech_asr", "all")
# take the first sample of the validation set
sample = dataset["validation.clean"][0]
```
It can also be streamed directly from the Hub using Datasets' [streaming mode](https://huggingface.co/blog/audio-datasets#streaming-mode-the-silver-bullet).
Loading a dataset in streaming mode loads individual samples of the dataset at a time, rather than downloading the entire
dataset to disk:
```python
from datasets import load_dataset
dataset = load_dataset("distil-whisper/librispeech_asr", "all", streaming=True)
# take the first sample of the validation set
sample = next(iter(dataset["validation.clean"]))
```
## Distil Whisper Usage
To use this dataset to reproduce a Distil Whisper training run, refer to the instructions on the
[Distil Whisper repository](https://github.com/huggingface/distil-whisper#training).
## License
This dataset is licensed under cc-by-4.0.
|
nguyenphuthien/ViOpenHermes-2.5 | ---
license: mit
task_categories:
- conversational
- text-generation
language:
- vi
size_categories:
- 1M<n<10M
--- |
distilled-one-sec-cv12-each-chunk-uniq/chunk_42 | ---
dataset_info:
features:
- name: logits
sequence: float32
- name: mfcc
sequence:
sequence: float64
splits:
- name: train
num_bytes: 1256996156.0
num_examples: 244933
download_size: 1282330759
dataset_size: 1256996156.0
---
# Dataset Card for "chunk_42"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
irds/wikiclir_it | ---
pretty_name: '`wikiclir/it`'
viewer: false
source_datasets: []
task_categories:
- text-retrieval
---
# Dataset Card for `wikiclir/it`
The `wikiclir/it` dataset, provided by the [ir-datasets](https://ir-datasets.com/) package.
For more information about the dataset, see the [documentation](https://ir-datasets.com/wikiclir#wikiclir/it).
# Data
This dataset provides:
- `docs` (documents, i.e., the corpus); count=1,347,011
- `queries` (i.e., topics); count=808,605
- `qrels`: (relevance assessments); count=3,443,633
## Usage
```python
from datasets import load_dataset
docs = load_dataset('irds/wikiclir_it', 'docs')
for record in docs:
record # {'doc_id': ..., 'title': ..., 'text': ...}
queries = load_dataset('irds/wikiclir_it', 'queries')
for record in queries:
record # {'query_id': ..., 'text': ...}
qrels = load_dataset('irds/wikiclir_it', 'qrels')
for record in qrels:
record # {'query_id': ..., 'doc_id': ..., 'relevance': ..., 'iteration': ...}
```
Note that calling `load_dataset` will download the dataset (or provide access instructions when it's not public) and make a copy of the
data in 🤗 Dataset format.
## Citation Information
```
@inproceedings{sasaki-etal-2018-cross,
title = "Cross-Lingual Learning-to-Rank with Shared Representations",
author = "Sasaki, Shota and
Sun, Shuo and
Schamoni, Shigehiko and
Duh, Kevin and
Inui, Kentaro",
booktitle = "Proceedings of the 2018 Conference of the North {A}merican Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 2 (Short Papers)",
month = jun,
year = "2018",
address = "New Orleans, Louisiana",
publisher = "Association for Computational Linguistics",
url = "https://aclanthology.org/N18-2073",
doi = "10.18653/v1/N18-2073",
pages = "458--463"
}
```
|
Ediudo/alemaodacaravan | ---
license: openrail
---
|
CyberHarem/chiyoda_azurlane | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of chiyoda/千代田/千代田 (Azur Lane)
This is the dataset of chiyoda/千代田/千代田 (Azur Lane), containing 31 images and their tags.
The core tags of this character are `breasts, red_hair, animal_ears, large_breasts, long_hair, purple_eyes, bangs, fox_ears, animal_ear_fluff, hair_ornament, hair_flower`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 31 | 62.54 MiB | [Download](https://huggingface.co/datasets/CyberHarem/chiyoda_azurlane/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 31 | 35.10 MiB | [Download](https://huggingface.co/datasets/CyberHarem/chiyoda_azurlane/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 81 | 73.65 MiB | [Download](https://huggingface.co/datasets/CyberHarem/chiyoda_azurlane/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 31 | 55.06 MiB | [Download](https://huggingface.co/datasets/CyberHarem/chiyoda_azurlane/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 81 | 111.27 MiB | [Download](https://huggingface.co/datasets/CyberHarem/chiyoda_azurlane/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need it, just run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/chiyoda_azurlane',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering results; maybe some outfits can be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 14 |  |  |  |  |  | looking_at_viewer, 1girl, red_bikini, flower, solo, smile, blush, cleavage, collar, navel, red_eyes, side-tie_bikini_bottom, string_bikini, choker, day, bare_shoulders, hair_between_eyes, open_mouth, outdoors, sky |
| 1 | 9 |  |  |  |  |  | 1girl, fox_mask, looking_at_viewer, solo, wide_sleeves, cleavage, mask_on_head, white_thighhighs, detached_sleeves, armpits, red_skirt, tongue_out, full_body, kimono, sash |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | looking_at_viewer | 1girl | red_bikini | flower | solo | smile | blush | cleavage | collar | navel | red_eyes | side-tie_bikini_bottom | string_bikini | choker | day | bare_shoulders | hair_between_eyes | open_mouth | outdoors | sky | fox_mask | wide_sleeves | mask_on_head | white_thighhighs | detached_sleeves | armpits | red_skirt | tongue_out | full_body | kimono | sash |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------|:--------|:-------------|:---------|:-------|:--------|:--------|:-----------|:---------|:--------|:-----------|:-------------------------|:----------------|:---------|:------|:-----------------|:--------------------|:-------------|:-----------|:------|:-----------|:---------------|:---------------|:-------------------|:-------------------|:----------|:------------|:-------------|:------------|:---------|:-------|
| 0 | 14 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | |
| 1 | 9 |  |  |  |  |  | X | X | | | X | | | X | | | | | | | | | | | | | X | X | X | X | X | X | X | X | X | X | X |
|
colbertv2/lotte_passages | ---
viewer: false
annotations_creators:
- no-annotation
language:
- en
language_creators:
- found
license:
- apache-2.0
multilinguality:
- monolingual
pretty_name: 'Lotte passages from ColBERTv2: Effective and Efficient Retrieval via
Lightweight Late Interaction'
size_categories:
- 1M<n<10M
source_datasets:
- original
tags: []
task_categories:
- question-answering
task_ids:
- extractive-qa
dataset_info:
features:
- name: doc_id
dtype: int32
- name: author
dtype: string
- name: text
dtype: string
splits:
- name: dev_collection
num_bytes: 263355925
num_examples: 268880
- name: test_collection
num_bytes: 105718627
num_examples: 119458
download_size: 225568795
dataset_size: 369074552
---
Passages for the LoTTE dataset used for [ColBERTv2: Effective and Efficient Retrieval via Lightweight Late Interaction](https://arxiv.org/abs/2112.01488) |
liuyanchen1015/MULTI_VALUE_sst2_reduced_relative | ---
dataset_info:
features:
- name: sentence
dtype: string
- name: label
dtype: int64
- name: idx
dtype: int64
- name: score
dtype: int64
splits:
- name: dev
num_bytes: 7663
num_examples: 48
- name: test
num_bytes: 11859
num_examples: 69
- name: train
num_bytes: 187208
num_examples: 1351
download_size: 112012
dataset_size: 206730
---
# Dataset Card for "MULTI_VALUE_sst2_reduced_relative"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
strombergnlp/dkstance | ---
annotations_creators:
- expert-generated
language_creators:
- found
language:
- da
license:
- cc-by-4.0
multilinguality:
- monolingual
size_categories:
- 1K<n<10K
source_datasets:
- original
task_categories:
- text-classification
task_ids:
- fact-checking
paperswithcode_id: dast
pretty_name: DAST
extra_gated_prompt: 'Warning: the data in this repository contains harmful content
(misinformative claims).'
tags:
- stance-detection
---
# Dataset Card for "dkstance / DAST"
## Table of Contents
- [Dataset Description](#dataset-description)
- [Dataset Summary](#dataset-summary)
- [Supported Tasks and Leaderboards](#supported-tasks-and-leaderboards)
- [Languages](#languages)
- [Dataset Structure](#dataset-structure)
- [Data Instances](#data-instances)
- [Data Fields](#data-fields)
- [Data Splits](#data-splits)
- [Dataset Creation](#dataset-creation)
- [Curation Rationale](#curation-rationale)
- [Source Data](#source-data)
- [Annotations](#annotations)
- [Personal and Sensitive Information](#personal-and-sensitive-information)
- [Considerations for Using the Data](#considerations-for-using-the-data)
- [Social Impact of Dataset](#social-impact-of-dataset)
- [Discussion of Biases](#discussion-of-biases)
- [Other Known Limitations](#other-known-limitations)
- [Additional Information](#additional-information)
- [Dataset Curators](#dataset-curators)
- [Licensing Information](#licensing-information)
- [Citation Information](#citation-information)
- [Contributions](#contributions)
## Dataset Description
- **Homepage:** [https://stromberg.ai/publication/jointrumourstanceandveracity/](https://stromberg.ai/publication/jointrumourstanceandveracity/)
- **Repository:** [https://figshare.com/articles/dataset/Danish_stance-annotated_Reddit_dataset/8217137](https://figshare.com/articles/dataset/Danish_stance-annotated_Reddit_dataset/8217137)
- **Paper:** [https://aclanthology.org/W19-6122/](https://aclanthology.org/W19-6122/)
- **Point of Contact:** [Leon Derczynski](https://github.com/leondz)
- **Size of downloaded dataset files:**
- **Size of the generated dataset:**
- **Total amount of disk used:**
### Dataset Summary
This is an SDQC stance-annotated Reddit dataset for the Danish language generated within a thesis project. The dataset consists of over 5000 comments structured as comment trees and linked to 33 source posts.
The dataset is applicable for supervised stance classification and rumour veracity prediction.
### Supported Tasks and Leaderboards
* Stance prediction
### Languages
## Dataset Structure
### Data Instances
#### DAST / dkstance
- **Size of downloaded dataset files:** 4.72 MiB
- **Size of the generated dataset:** 3.69 MiB
- **Total amount of disk used:** 8.41 MiB
An example of 'train' looks as follows.
```
{
'id': '1',
'native_id': 'ebwjq5z',
'text': 'Med de udfordringer som daginstitutionerne har med normeringer, og økonomi i det hele taget, synes jeg det er en vanvittig beslutning at prioritere skattebetalt vegansk kost i daginstitutionerne. Brug dog pengene på noget mere personale, og lad folk selv betale for deres individuelle kostønsker.',
'parent_id': 'a6o3us',
'parent_text': 'Mai Mercado om mad i daginstitutioner: Sund kost rimer ikke på veganer-mad',
'parent_stance': 0,
'source_id': 'a6o3us',
'source_text': 'Mai Mercado om mad i daginstitutioner: Sund kost rimer ikke på veganer-mad',
'source_stance': 0
}
```
### Data Fields
- `id`: a `string` feature.
- `native_id`: a `string` feature representing the native ID of the entry.
- `text`: a `string` of the comment text in which stance is annotated.
- `parent_id`: the `native_id` of this comment's parent.
- `parent_text`: a `string` of the parent comment's text.
- `parent_stance`: the label of the stance in the comment towards its parent comment.
```
0: "Supporting",
1: "Denying",
2: "Querying",
3: "Commenting",
```
- `source_id`: the `native_id` of this comment's source / post.
- `source_text`: a `string` of the source / post text.
- `source_stance`: the label of the stance in the comment towards the original source post.
```
0: "Supporting",
1: "Denying",
2: "Querying",
3: "Commenting",
```
### Data Splits
| name |size|
|---------|----:|
|train|3122|
|validation|1066|
|test|1060|
These splits were specified after the original research was reported. The splits add an extra level of rigour, in that no source post's comment tree is spread over more than one partition.
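For instance, the splits can be loaded and the stance labels inspected as follows (a minimal sketch, assuming the dataset's default configuration):
```python
from datasets import load_dataset

dast = load_dataset("strombergnlp/dkstance")
sample = dast["train"][0]

# map the integer stance labels back to their names (see Data Fields above)
labels = ["Supporting", "Denying", "Querying", "Commenting"]
print(sample["text"])
print("stance towards parent:", labels[sample["parent_stance"]])
print("stance towards source:", labels[sample["source_stance"]])
```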
## Dataset Creation
### Curation Rationale
Comments around rumourous claims to enable rumour and stance analysis in Danish
### Source Data
#### Initial Data Collection and Normalization
The data is from Reddit posts that relate to one of a specific set of news stories; these stories are enumerated in the paper.
#### Who are the source language producers?
Danish-speaking Reddit users.
### Annotations
#### Annotation process
There was a multi-user annotation process, mediated through a purpose-built interface for annotating stance in Reddit threads.
#### Who are the annotators?
* Age: 20-30.
* Gender: male.
* Race/ethnicity: white northern European.
* Native language: Danish.
* Socioeconomic status: higher education student.
### Personal and Sensitive Information
The data was public at the time of collection. User names are not preserved.
## Considerations for Using the Data
### Social Impact of Dataset
There's a risk of user-deleted content being in this data. The data has NOT been vetted for any content, so there's a risk of harmful text.
### Discussion of Biases
The source of the text has a strong demographic bias, being mostly young white men who are vocal about their opinions. This constrains both the styles of language and discussion contained in the data, as well as the topics discussed and viewpoints held.
### Other Known Limitations
The above limitations apply.
## Additional Information
### Dataset Curators
The dataset is curated by the paper's authors.
### Licensing Information
The authors distribute this data under Creative Commons attribution license, CC-BY 4.0.
An NLP data statement is included in the paper describing the work, [https://aclanthology.org/W19-6122.pdf](https://aclanthology.org/W19-6122.pdf)
### Citation Information
```
@inproceedings{lillie-etal-2019-joint,
title = "Joint Rumour Stance and Veracity Prediction",
author = "Lillie, Anders Edelbo and
Middelboe, Emil Refsgaard and
Derczynski, Leon",
booktitle = "Proceedings of the 22nd Nordic Conference on Computational Linguistics",
month = sep # "{--}" # oct,
year = "2019",
address = "Turku, Finland",
publisher = {Link{\"o}ping University Electronic Press},
url = "https://aclanthology.org/W19-6122",
pages = "208--221",
}
```
### Contributions
Author-added dataset [@leondz](https://github.com/leondz)
|
MatsuoDochiai/kauan4.0 | ---
license: openrail
---
|
result-kand2-sdxl-wuerst-karlo/e0cc5f8f | ---
dataset_info:
features:
- name: result
dtype: string
- name: id
dtype: int64
splits:
- name: train
num_bytes: 154
num_examples: 10
download_size: 1307
dataset_size: 154
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "e0cc5f8f"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Marcis/Cleiton | ---
license: apache-2.0
---
|
pacovaldez/predicted-stackoverflow | ---
dataset_info:
features:
- name: question_id
dtype: int64
- name: question_title
dtype: string
- name: question_body
dtype: string
- name: accepted_answer_id
dtype: int64
- name: question_creation_date
dtype: timestamp[us]
- name: question_answer_count
dtype: int64
- name: question_favorite_count
dtype: float64
- name: question_score
dtype: int64
- name: question_view_count
dtype: int64
- name: tags
dtype: string
- name: answer_body
dtype: string
- name: answer_creation_date
dtype: timestamp[us]
- name: answer_score
dtype: int64
- name: link
dtype: string
- name: context
dtype: string
- name: answer_start
dtype: int64
- name: answer_end
dtype: int64
- name: question
dtype: string
- name: predicted_answer
dtype: string
- name: parsed_answer
dtype: string
splits:
- name: train
num_bytes: 4777686
num_examples: 100
download_size: 2244820
dataset_size: 4777686
---
# Dataset Card for "predicted-stackoverflow"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
miluELK/pokemon-512-valid | ---
dataset_info:
features:
- name: image
dtype: image
---
# Dataset Card for "pokemon-512-valid"
A cleaned + upsampled-to-512px-square version of https://www.kaggle.com/datasets/djilax/pkmn-image-dataset, suitable for training high-resolution unconditional image generators.
Sourced from [madebyollin/pokemon-512](https://huggingface.co/datasets/madebyollin/pokemon-512).
80% train_dataset + 10% test_dataset + 10% valid_dataset
I used the following code to split it:
```python
from datasets import load_dataset, DatasetDict
images_dataset = load_dataset('madebyollin/pokemon-512', split="train")
# 80% train_dataset + 20% train_testvalid
train_testvalid = images_dataset.train_test_split(test_size=0.2,shuffle=True,seed=2000)
# 10% test_dataset + 10% valid_dataset
test_valid = train_testvalid['test'].train_test_split(test_size=0.5,shuffle=True,seed=2000)
train_dev_test_dataset = DatasetDict({
'train': train_testvalid['train'],
'test': test_valid['train'],
'validation': test_valid['test']})
print(train_dev_test_dataset)
train_dataset = train_dev_test_dataset["train"]
test_dataset = train_dev_test_dataset["test"]
valid_dataset = train_dev_test_dataset["validation"]
train_dataset.to_parquet("./data/train_dataset.parquet")
test_dataset.to_parquet("./data/test_dataset.parquet")
valid_dataset.to_parquet("./data/valid_dataset.parquet")
```
I customized the `train_unconditional.py` script from diffusers to log `validation_loss` during training,
and added a module that computes the FID score using test_dataset.
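For reference, a minimal sketch of what such an FID module could look like, assuming `torchmetrics` with its image extras (`pip install torchmetrics[image]`); the tensors here are placeholders, not the actual training script:
```python
import torch
from torchmetrics.image.fid import FrechetInceptionDistance

# normalize=True -> inputs are float tensors in [0, 1], shape (N, 3, H, W)
fid = FrechetInceptionDistance(feature=2048, normalize=True)

# Placeholders: replace with batches from test_dataset and with samples
# drawn from the trained unconditional pipeline, respectively.
real_images = torch.rand(16, 3, 512, 512)
fake_images = torch.rand(16, 3, 512, 512)

fid.update(real_images, real=True)
fid.update(fake_images, real=False)
print(f"FID: {fid.compute().item():.2f}")
```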
|
johannes-garstenauer/balanced_structs_reduced_labelled | ---
dataset_info:
features:
- name: struct
dtype: string
- name: label
dtype: int64
splits:
- name: train
num_bytes: 37902248.0
num_examples: 115648
download_size: 9513025
dataset_size: 37902248.0
---
# Dataset Card for "balanced_structs_reduced_labelled"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Rewcifer/validation_2000_cutoff_llama_2k_batch_train_2_results_test | ---
dataset_info:
features:
- name: labels_and_findings
dtype: string
- name: prompts
dtype: string
- name: true_findings
dtype: string
- name: generated_texts
dtype: string
splits:
- name: train
num_bytes: 17471501
num_examples: 2000
download_size: 4257850
dataset_size: 17471501
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "validation_2000_cutoff_llama_2k_batch_train_2_results_test"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Falcon2006VN/pascal-code-generation-2mb | ---
license: mit
---
|
joey234/mmlu-high_school_computer_science-rule-neg | ---
dataset_info:
features:
- name: choices
sequence: string
- name: answer
dtype:
class_label:
names:
'0': A
'1': B
'2': C
'3': D
- name: question
dtype: string
splits:
- name: test
num_bytes: 44363
num_examples: 100
download_size: 26576
dataset_size: 44363
---
# Dataset Card for "mmlu-high_school_computer_science-rule-neg"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Monkaro/Man-Regularisation | ---
license: unknown
---
|
Sachinkelenjaguri/autotrain-data-llm-finetune | ---
dataset_info:
features:
- name: instruction
dtype: string
- name: input
dtype: string
- name: output
dtype: string
- name: autotrain_text
dtype: string
splits:
- name: train
num_bytes: 36857797
num_examples: 41601
- name: validation
num_bytes: 9232314
num_examples: 10401
download_size: 24348656
dataset_size: 46090111
---
# Dataset Card for "autotrain-data-llm-finetune"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
AnoopChandra/orca-cleaned-simple-data | ---
dataset_info:
features:
- name: id
dtype: string
- name: system_prompt
dtype: string
- name: question
dtype: string
- name: response
dtype: string
- name: __index_level_0__
dtype: int64
splits:
- name: train
num_bytes: 20165050
num_examples: 9120
download_size: 11238813
dataset_size: 20165050
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
blanchon/EuroSAT_MSI | ---
language: en
license: unknown
size_categories:
- 10K<n<100K
task_categories:
- image-classification
paperswithcode_id: eurosat
pretty_name: EuroSAT MSI
tags:
- remote-sensing
- earth-observation
- geospatial
- satellite-imagery
- land-cover-classification
- multispectral
- sentinel-2
dataset_info:
features:
- name: image
dtype:
array3_d:
dtype: uint16
shape:
- 64
- 64
- 13
- name: label
dtype:
class_label:
names:
'0': Annual Crop
'1': Forest
'2': Herbaceous Vegetation
'3': Highway
'4': Industrial Buildings
'5': Pasture
'6': Permanent Crop
'7': Residential Buildings
'8': River
'9': SeaLake
- name: filename
dtype: string
splits:
- name: train
num_bytes: 1995359806
num_examples: 16200
- name: test
num_bytes: 665119564
num_examples: 5400
- name: validation
num_bytes: 665120060
num_examples: 5400
download_size: 2379014584
dataset_size: 3325599430
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
- split: validation
path: data/validation-*
---
# EuroSAT MSI
<!-- Dataset thumbnail -->

<!-- Provide a quick summary of the dataset. -->
EuroSAT is a classification dataset based on Sentinel-2 satellite images covering 13 spectral bands, consisting of 10 classes with 27,000 labeled and geo-referenced samples.
- **Paper:** https://arxiv.org/abs/1709.00029
- **Homepage:** https://github.com/phelber/EuroSAT
## Description
<!-- Provide a longer summary of what this dataset is. -->
The EuroSAT dataset is a comprehensive land cover classification dataset that focuses on images taken by the [ESA Sentinel-2 satellite](https://sentinel.esa.int/web/sentinel/missions/sentinel-2). It contains a total of 27,000 images, each with a resolution of 64x64 pixels. These images cover 10 distinct land cover classes and are collected from 34 European countries.
The dataset is available in two versions: RGB only, and all 13 [Multispectral (MS) Sentinel-2 bands](https://sentinels.copernicus.eu/web/sentinel/user-guides/sentinel-2-msi/resolutions/spatial) (this repo). EuroSAT is considered a relatively easy dataset, with approximately 98.6% accuracy achievable using a ResNet-50 architecture.
- **Total Number of Images**: 27000
- **Bands**: 13 (MSI)
- **Image Size**: 64x64 pixels
- **Land Cover Classes**: 10
- Classes: Annual Crop, Forest, Herbaceous Vegetation, Highway, Industrial Buildings, Pasture, Permanent Crop, Residential Buildings, River, SeaLake
## Usage
To use this dataset, simply use `datasets.load_dataset("blanchon/EuroSAT_MSI")`.
<!-- Provide any additional information on how to use this dataset. -->
```python
from datasets import load_dataset
EuroSAT_MSI = load_dataset("blanchon/EuroSAT_MSI")
```
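As a quick sanity check, a sample can be inspected as follows (a minimal sketch assuming `numpy`; the `image` feature decodes to a 64x64x13 `uint16` array):
```python
import numpy as np

sample = EuroSAT_MSI["train"][0]
img = np.asarray(sample["image"], dtype=np.uint16)  # (64, 64, 13) multispectral bands
label_name = EuroSAT_MSI["train"].features["label"].int2str(sample["label"])
print(img.shape, label_name)
```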
## Citation
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
If you use the EuroSAT dataset in your research, please consider citing the following publication:
```bibtex
@article{helber2017eurosat,
title={EuroSAT: A Novel Dataset and Deep Learning Benchmark for Land Use and Land Cover Classification},
author={Helber, Patrick and Bischke, Benjamin and Dengel, Andreas and Borth, Damian},
journal={arXiv preprint arXiv:1709.00029},
year={2017}
}
```
|
pequeno3d/zeus | ---
license: openrail
---
|
liuyanchen1015/MULTI_VALUE_stsb_drop_aux_wh | ---
dataset_info:
features:
- name: sentence1
dtype: string
- name: sentence2
dtype: string
- name: score
dtype: float64
- name: idx
dtype: int64
- name: value_score
dtype: int64
splits:
- name: dev
num_bytes: 1283
num_examples: 7
- name: test
num_bytes: 922
num_examples: 10
- name: train
num_bytes: 7122
num_examples: 59
download_size: 14225
dataset_size: 9327
---
# Dataset Card for "MULTI_VALUE_stsb_drop_aux_wh"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
manishiitg/camel-ai-physics | ---
dataset_info:
features:
- name: system
dtype: string
- name: instruction
dtype: string
- name: response
dtype: string
- name: lang
dtype: string
splits:
- name: train
num_bytes: 173711856
num_examples: 40000
download_size: 57766434
dataset_size: 173711856
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
open-llm-leaderboard/details_microsoft__Orca-2-7b | ---
pretty_name: Evaluation run of microsoft/Orca-2-7b
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [microsoft/Orca-2-7b](https://huggingface.co/microsoft/Orca-2-7b) on the [Open\
\ LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 64 configurations, each one corresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_microsoft__Orca-2-7b_public\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-11-23T08:52:22.157398](https://huggingface.co/datasets/open-llm-leaderboard/details_microsoft__Orca-2-7b_public/blob/main/results_2023-11-23T08-52-22.157398.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.5591515182783672,\n\
\ \"acc_stderr\": 0.03362651811696442,\n \"acc_norm\": 0.5666849678033645,\n\
\ \"acc_norm_stderr\": 0.03437864006901342,\n \"mc1\": 0.3684210526315789,\n\
\ \"mc1_stderr\": 0.016886551261046046,\n \"mc2\": 0.5244663206388774,\n\
\ \"mc2_stderr\": 0.016012530609803507,\n \"em\": 0.3205746644295302,\n\
\ \"em_stderr\": 0.004779419137797957,\n \"f1\": 0.43866505872483647,\n\
\ \"f1_stderr\": 0.004557698070527672\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.5119453924914675,\n \"acc_stderr\": 0.014607220340597171,\n\
\ \"acc_norm\": 0.5409556313993175,\n \"acc_norm_stderr\": 0.01456229107360123\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.5828520215096594,\n\
\ \"acc_stderr\": 0.004920800313232742,\n \"acc_norm\": 0.7619000199163514,\n\
\ \"acc_norm_stderr\": 0.004250501643743773\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.29,\n \"acc_stderr\": 0.045604802157206824,\n \
\ \"acc_norm\": 0.29,\n \"acc_norm_stderr\": 0.045604802157206824\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.5925925925925926,\n\
\ \"acc_stderr\": 0.04244633238353228,\n \"acc_norm\": 0.5925925925925926,\n\
\ \"acc_norm_stderr\": 0.04244633238353228\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.6776315789473685,\n \"acc_stderr\": 0.03803510248351585,\n\
\ \"acc_norm\": 0.6776315789473685,\n \"acc_norm_stderr\": 0.03803510248351585\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.59,\n\
\ \"acc_stderr\": 0.04943110704237102,\n \"acc_norm\": 0.59,\n \
\ \"acc_norm_stderr\": 0.04943110704237102\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.6150943396226415,\n \"acc_stderr\": 0.02994649856769995,\n\
\ \"acc_norm\": 0.6150943396226415,\n \"acc_norm_stderr\": 0.02994649856769995\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.5972222222222222,\n\
\ \"acc_stderr\": 0.04101405519842426,\n \"acc_norm\": 0.5972222222222222,\n\
\ \"acc_norm_stderr\": 0.04101405519842426\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.44,\n \"acc_stderr\": 0.049888765156985884,\n \
\ \"acc_norm\": 0.44,\n \"acc_norm_stderr\": 0.049888765156985884\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"\
acc\": 0.45,\n \"acc_stderr\": 0.049999999999999996,\n \"acc_norm\"\
: 0.45,\n \"acc_norm_stderr\": 0.049999999999999996\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \
\ \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.5260115606936416,\n\
\ \"acc_stderr\": 0.03807301726504513,\n \"acc_norm\": 0.5260115606936416,\n\
\ \"acc_norm_stderr\": 0.03807301726504513\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.28431372549019607,\n \"acc_stderr\": 0.04488482852329017,\n\
\ \"acc_norm\": 0.28431372549019607,\n \"acc_norm_stderr\": 0.04488482852329017\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.67,\n \"acc_stderr\": 0.04725815626252609,\n \"acc_norm\": 0.67,\n\
\ \"acc_norm_stderr\": 0.04725815626252609\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.4765957446808511,\n \"acc_stderr\": 0.032650194750335815,\n\
\ \"acc_norm\": 0.4765957446808511,\n \"acc_norm_stderr\": 0.032650194750335815\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.2894736842105263,\n\
\ \"acc_stderr\": 0.04266339443159394,\n \"acc_norm\": 0.2894736842105263,\n\
\ \"acc_norm_stderr\": 0.04266339443159394\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.496551724137931,\n \"acc_stderr\": 0.04166567577101579,\n\
\ \"acc_norm\": 0.496551724137931,\n \"acc_norm_stderr\": 0.04166567577101579\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.35714285714285715,\n \"acc_stderr\": 0.024677862841332783,\n \"\
acc_norm\": 0.35714285714285715,\n \"acc_norm_stderr\": 0.024677862841332783\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.38095238095238093,\n\
\ \"acc_stderr\": 0.04343525428949097,\n \"acc_norm\": 0.38095238095238093,\n\
\ \"acc_norm_stderr\": 0.04343525428949097\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.41,\n \"acc_stderr\": 0.049431107042371025,\n \
\ \"acc_norm\": 0.41,\n \"acc_norm_stderr\": 0.049431107042371025\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\"\
: 0.6161290322580645,\n \"acc_stderr\": 0.02766618207553964,\n \"\
acc_norm\": 0.6161290322580645,\n \"acc_norm_stderr\": 0.02766618207553964\n\
\ },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\"\
: 0.4039408866995074,\n \"acc_stderr\": 0.03452453903822039,\n \"\
acc_norm\": 0.4039408866995074,\n \"acc_norm_stderr\": 0.03452453903822039\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.53,\n \"acc_stderr\": 0.05016135580465919,\n \"acc_norm\"\
: 0.53,\n \"acc_norm_stderr\": 0.05016135580465919\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7393939393939394,\n \"acc_stderr\": 0.034277431758165236,\n\
\ \"acc_norm\": 0.7393939393939394,\n \"acc_norm_stderr\": 0.034277431758165236\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.7272727272727273,\n \"acc_stderr\": 0.03173071239071724,\n \"\
acc_norm\": 0.7272727272727273,\n \"acc_norm_stderr\": 0.03173071239071724\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.8290155440414507,\n \"acc_stderr\": 0.02717121368316453,\n\
\ \"acc_norm\": 0.8290155440414507,\n \"acc_norm_stderr\": 0.02717121368316453\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.5282051282051282,\n \"acc_stderr\": 0.025310639254933882,\n\
\ \"acc_norm\": 0.5282051282051282,\n \"acc_norm_stderr\": 0.025310639254933882\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.32222222222222224,\n \"acc_stderr\": 0.028493465091028597,\n \
\ \"acc_norm\": 0.32222222222222224,\n \"acc_norm_stderr\": 0.028493465091028597\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.5210084033613446,\n \"acc_stderr\": 0.03244980849990029,\n \
\ \"acc_norm\": 0.5210084033613446,\n \"acc_norm_stderr\": 0.03244980849990029\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.3576158940397351,\n \"acc_stderr\": 0.03913453431177258,\n \"\
acc_norm\": 0.3576158940397351,\n \"acc_norm_stderr\": 0.03913453431177258\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.7504587155963303,\n \"acc_stderr\": 0.018553897629501628,\n \"\
acc_norm\": 0.7504587155963303,\n \"acc_norm_stderr\": 0.018553897629501628\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.375,\n \"acc_stderr\": 0.033016908987210894,\n \"acc_norm\": 0.375,\n\
\ \"acc_norm_stderr\": 0.033016908987210894\n },\n \"harness|hendrycksTest-high_school_us_history|5\"\
: {\n \"acc\": 0.7450980392156863,\n \"acc_stderr\": 0.030587591351604246,\n\
\ \"acc_norm\": 0.7450980392156863,\n \"acc_norm_stderr\": 0.030587591351604246\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.7679324894514767,\n \"acc_stderr\": 0.02747974455080851,\n \
\ \"acc_norm\": 0.7679324894514767,\n \"acc_norm_stderr\": 0.02747974455080851\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6457399103139013,\n\
\ \"acc_stderr\": 0.032100621541349864,\n \"acc_norm\": 0.6457399103139013,\n\
\ \"acc_norm_stderr\": 0.032100621541349864\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.732824427480916,\n \"acc_stderr\": 0.03880848301082396,\n\
\ \"acc_norm\": 0.732824427480916,\n \"acc_norm_stderr\": 0.03880848301082396\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.6859504132231405,\n \"acc_stderr\": 0.042369647530410184,\n \"\
acc_norm\": 0.6859504132231405,\n \"acc_norm_stderr\": 0.042369647530410184\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.6666666666666666,\n\
\ \"acc_stderr\": 0.04557239513497751,\n \"acc_norm\": 0.6666666666666666,\n\
\ \"acc_norm_stderr\": 0.04557239513497751\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.6441717791411042,\n \"acc_stderr\": 0.03761521380046734,\n\
\ \"acc_norm\": 0.6441717791411042,\n \"acc_norm_stderr\": 0.03761521380046734\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.375,\n\
\ \"acc_stderr\": 0.04595091388086298,\n \"acc_norm\": 0.375,\n \
\ \"acc_norm_stderr\": 0.04595091388086298\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7572815533980582,\n \"acc_stderr\": 0.04245022486384495,\n\
\ \"acc_norm\": 0.7572815533980582,\n \"acc_norm_stderr\": 0.04245022486384495\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8376068376068376,\n\
\ \"acc_stderr\": 0.02416161812798774,\n \"acc_norm\": 0.8376068376068376,\n\
\ \"acc_norm_stderr\": 0.02416161812798774\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.65,\n \"acc_stderr\": 0.047937248544110196,\n \
\ \"acc_norm\": 0.65,\n \"acc_norm_stderr\": 0.047937248544110196\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.756066411238825,\n\
\ \"acc_stderr\": 0.015357212665829468,\n \"acc_norm\": 0.756066411238825,\n\
\ \"acc_norm_stderr\": 0.015357212665829468\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.6416184971098265,\n \"acc_stderr\": 0.025816756791584183,\n\
\ \"acc_norm\": 0.6416184971098265,\n \"acc_norm_stderr\": 0.025816756791584183\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.34413407821229053,\n\
\ \"acc_stderr\": 0.015889221313307094,\n \"acc_norm\": 0.34413407821229053,\n\
\ \"acc_norm_stderr\": 0.015889221313307094\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.6241830065359477,\n \"acc_stderr\": 0.02773283435336394,\n\
\ \"acc_norm\": 0.6241830065359477,\n \"acc_norm_stderr\": 0.02773283435336394\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.617363344051447,\n\
\ \"acc_stderr\": 0.027604689028581986,\n \"acc_norm\": 0.617363344051447,\n\
\ \"acc_norm_stderr\": 0.027604689028581986\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.654320987654321,\n \"acc_stderr\": 0.026462487777001872,\n\
\ \"acc_norm\": 0.654320987654321,\n \"acc_norm_stderr\": 0.026462487777001872\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.39361702127659576,\n \"acc_stderr\": 0.029144544781596154,\n \
\ \"acc_norm\": 0.39361702127659576,\n \"acc_norm_stderr\": 0.029144544781596154\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.408735332464146,\n\
\ \"acc_stderr\": 0.012555701346703385,\n \"acc_norm\": 0.408735332464146,\n\
\ \"acc_norm_stderr\": 0.012555701346703385\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.5183823529411765,\n \"acc_stderr\": 0.030352303395351964,\n\
\ \"acc_norm\": 0.5183823529411765,\n \"acc_norm_stderr\": 0.030352303395351964\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.5441176470588235,\n \"acc_stderr\": 0.020148939420415745,\n \
\ \"acc_norm\": 0.5441176470588235,\n \"acc_norm_stderr\": 0.020148939420415745\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6363636363636364,\n\
\ \"acc_stderr\": 0.04607582090719976,\n \"acc_norm\": 0.6363636363636364,\n\
\ \"acc_norm_stderr\": 0.04607582090719976\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.636734693877551,\n \"acc_stderr\": 0.030789051139030806,\n\
\ \"acc_norm\": 0.636734693877551,\n \"acc_norm_stderr\": 0.030789051139030806\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.6716417910447762,\n\
\ \"acc_stderr\": 0.033206858897443244,\n \"acc_norm\": 0.6716417910447762,\n\
\ \"acc_norm_stderr\": 0.033206858897443244\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.78,\n \"acc_stderr\": 0.041633319989322626,\n \
\ \"acc_norm\": 0.78,\n \"acc_norm_stderr\": 0.041633319989322626\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.4819277108433735,\n\
\ \"acc_stderr\": 0.038899512528272166,\n \"acc_norm\": 0.4819277108433735,\n\
\ \"acc_norm_stderr\": 0.038899512528272166\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.7602339181286549,\n \"acc_stderr\": 0.03274485211946956,\n\
\ \"acc_norm\": 0.7602339181286549,\n \"acc_norm_stderr\": 0.03274485211946956\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.3684210526315789,\n\
\ \"mc1_stderr\": 0.016886551261046046,\n \"mc2\": 0.5244663206388774,\n\
\ \"mc2_stderr\": 0.016012530609803507\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7348066298342542,\n \"acc_stderr\": 0.01240654946619286\n\
\ },\n \"harness|drop|3\": {\n \"em\": 0.3205746644295302,\n \
\ \"em_stderr\": 0.004779419137797957,\n \"f1\": 0.43866505872483647,\n\
\ \"f1_stderr\": 0.004557698070527672\n },\n \"harness|gsm8k|5\": {\n\
\ \"acc\": 0.1470811220621683,\n \"acc_stderr\": 0.009756063660359875\n\
\ }\n}\n```"
repo_url: https://huggingface.co/microsoft/Orca-2-7b
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_11_23T08_25_14.186190
path:
- '**/details_harness|arc:challenge|25_2023-11-23T08-25-14.186190.parquet'
- split: 2023_11_23T08_52_22.157398
path:
- '**/details_harness|arc:challenge|25_2023-11-23T08-52-22.157398.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-11-23T08-52-22.157398.parquet'
- config_name: harness_drop_3
data_files:
- split: 2023_11_23T08_25_14.186190
path:
- '**/details_harness|drop|3_2023-11-23T08-25-14.186190.parquet'
- split: 2023_11_23T08_52_22.157398
path:
- '**/details_harness|drop|3_2023-11-23T08-52-22.157398.parquet'
- split: latest
path:
- '**/details_harness|drop|3_2023-11-23T08-52-22.157398.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_11_23T08_25_14.186190
path:
- '**/details_harness|gsm8k|5_2023-11-23T08-25-14.186190.parquet'
- split: 2023_11_23T08_52_22.157398
path:
- '**/details_harness|gsm8k|5_2023-11-23T08-52-22.157398.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-11-23T08-52-22.157398.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_11_23T08_25_14.186190
path:
- '**/details_harness|hellaswag|10_2023-11-23T08-25-14.186190.parquet'
- split: 2023_11_23T08_52_22.157398
path:
- '**/details_harness|hellaswag|10_2023-11-23T08-52-22.157398.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-11-23T08-52-22.157398.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_11_23T08_25_14.186190
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-11-23T08-25-14.186190.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-11-23T08-25-14.186190.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-11-23T08-25-14.186190.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-11-23T08-25-14.186190.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-11-23T08-25-14.186190.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-11-23T08-25-14.186190.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-11-23T08-25-14.186190.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-11-23T08-25-14.186190.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-11-23T08-25-14.186190.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-11-23T08-25-14.186190.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-11-23T08-25-14.186190.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-11-23T08-25-14.186190.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-11-23T08-25-14.186190.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-11-23T08-25-14.186190.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-11-23T08-25-14.186190.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-11-23T08-25-14.186190.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-11-23T08-25-14.186190.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-11-23T08-25-14.186190.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-11-23T08-25-14.186190.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-11-23T08-25-14.186190.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-11-23T08-25-14.186190.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-11-23T08-25-14.186190.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-11-23T08-25-14.186190.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-11-23T08-25-14.186190.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-11-23T08-25-14.186190.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-11-23T08-25-14.186190.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-11-23T08-25-14.186190.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-11-23T08-25-14.186190.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-11-23T08-25-14.186190.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-11-23T08-25-14.186190.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-11-23T08-25-14.186190.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-11-23T08-25-14.186190.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-11-23T08-25-14.186190.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-11-23T08-25-14.186190.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-11-23T08-25-14.186190.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-11-23T08-25-14.186190.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-11-23T08-25-14.186190.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-11-23T08-25-14.186190.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-11-23T08-25-14.186190.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-11-23T08-25-14.186190.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-11-23T08-25-14.186190.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-11-23T08-25-14.186190.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-11-23T08-25-14.186190.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-11-23T08-25-14.186190.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-11-23T08-25-14.186190.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-11-23T08-25-14.186190.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-11-23T08-25-14.186190.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-11-23T08-25-14.186190.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-11-23T08-25-14.186190.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-11-23T08-25-14.186190.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-11-23T08-25-14.186190.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-11-23T08-25-14.186190.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-11-23T08-25-14.186190.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-11-23T08-25-14.186190.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-11-23T08-25-14.186190.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-11-23T08-25-14.186190.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-11-23T08-25-14.186190.parquet'
- split: 2023_11_23T08_52_22.157398
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-11-23T08-52-22.157398.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-11-23T08-52-22.157398.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-11-23T08-52-22.157398.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-11-23T08-52-22.157398.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-11-23T08-52-22.157398.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-11-23T08-52-22.157398.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-11-23T08-52-22.157398.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-11-23T08-52-22.157398.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-11-23T08-52-22.157398.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-11-23T08-52-22.157398.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-11-23T08-52-22.157398.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-11-23T08-52-22.157398.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-11-23T08-52-22.157398.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-11-23T08-52-22.157398.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-11-23T08-52-22.157398.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-11-23T08-52-22.157398.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-11-23T08-52-22.157398.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-11-23T08-52-22.157398.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-11-23T08-52-22.157398.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-11-23T08-52-22.157398.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-11-23T08-52-22.157398.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-11-23T08-52-22.157398.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-11-23T08-52-22.157398.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-11-23T08-52-22.157398.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-11-23T08-52-22.157398.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-11-23T08-52-22.157398.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-11-23T08-52-22.157398.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-11-23T08-52-22.157398.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-11-23T08-52-22.157398.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-11-23T08-52-22.157398.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-11-23T08-52-22.157398.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-11-23T08-52-22.157398.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-11-23T08-52-22.157398.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-11-23T08-52-22.157398.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-11-23T08-52-22.157398.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-11-23T08-52-22.157398.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-11-23T08-52-22.157398.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-11-23T08-52-22.157398.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-11-23T08-52-22.157398.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-11-23T08-52-22.157398.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-11-23T08-52-22.157398.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-11-23T08-52-22.157398.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-11-23T08-52-22.157398.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-11-23T08-52-22.157398.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-11-23T08-52-22.157398.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-11-23T08-52-22.157398.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-11-23T08-52-22.157398.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-11-23T08-52-22.157398.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-11-23T08-52-22.157398.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-11-23T08-52-22.157398.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-11-23T08-52-22.157398.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-11-23T08-52-22.157398.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-11-23T08-52-22.157398.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-11-23T08-52-22.157398.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-11-23T08-52-22.157398.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-11-23T08-52-22.157398.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-11-23T08-52-22.157398.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-11-23T08-52-22.157398.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-11-23T08-52-22.157398.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-11-23T08-52-22.157398.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-11-23T08-52-22.157398.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-11-23T08-52-22.157398.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-11-23T08-52-22.157398.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-11-23T08-52-22.157398.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-11-23T08-52-22.157398.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-11-23T08-52-22.157398.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-11-23T08-52-22.157398.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-11-23T08-52-22.157398.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-11-23T08-52-22.157398.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-11-23T08-52-22.157398.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-11-23T08-52-22.157398.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-11-23T08-52-22.157398.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-11-23T08-52-22.157398.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-11-23T08-52-22.157398.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-11-23T08-52-22.157398.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-11-23T08-52-22.157398.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-11-23T08-52-22.157398.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-11-23T08-52-22.157398.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-11-23T08-52-22.157398.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-11-23T08-52-22.157398.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-11-23T08-52-22.157398.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-11-23T08-52-22.157398.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-11-23T08-52-22.157398.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-11-23T08-52-22.157398.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-11-23T08-52-22.157398.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-11-23T08-52-22.157398.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-11-23T08-52-22.157398.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-11-23T08-52-22.157398.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-11-23T08-52-22.157398.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-11-23T08-52-22.157398.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-11-23T08-52-22.157398.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-11-23T08-52-22.157398.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-11-23T08-52-22.157398.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-11-23T08-52-22.157398.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-11-23T08-52-22.157398.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-11-23T08-52-22.157398.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-11-23T08-52-22.157398.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-11-23T08-52-22.157398.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-11-23T08-52-22.157398.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-11-23T08-52-22.157398.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-11-23T08-52-22.157398.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-11-23T08-52-22.157398.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-11-23T08-52-22.157398.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-11-23T08-52-22.157398.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-11-23T08-52-22.157398.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-11-23T08-52-22.157398.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-11-23T08-52-22.157398.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-11-23T08-52-22.157398.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-11-23T08-52-22.157398.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-11-23T08-52-22.157398.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-11-23T08-52-22.157398.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-11-23T08-52-22.157398.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-11-23T08-52-22.157398.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-11-23T08-52-22.157398.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_11_23T08_25_14.186190
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-11-23T08-25-14.186190.parquet'
- split: 2023_11_23T08_52_22.157398
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-11-23T08-52-22.157398.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-11-23T08-52-22.157398.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_11_23T08_25_14.186190
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-11-23T08-25-14.186190.parquet'
- split: 2023_11_23T08_52_22.157398
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-11-23T08-52-22.157398.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-11-23T08-52-22.157398.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_11_23T08_25_14.186190
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-11-23T08-25-14.186190.parquet'
- split: 2023_11_23T08_52_22.157398
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-11-23T08-52-22.157398.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-11-23T08-52-22.157398.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_11_23T08_25_14.186190
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-11-23T08-25-14.186190.parquet'
- split: 2023_11_23T08_52_22.157398
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-11-23T08-52-22.157398.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-11-23T08-52-22.157398.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_11_23T08_25_14.186190
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-11-23T08-25-14.186190.parquet'
- split: 2023_11_23T08_52_22.157398
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-11-23T08-52-22.157398.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-11-23T08-52-22.157398.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_11_23T08_25_14.186190
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-11-23T08-25-14.186190.parquet'
- split: 2023_11_23T08_52_22.157398
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-11-23T08-52-22.157398.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-11-23T08-52-22.157398.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_11_23T08_25_14.186190
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-11-23T08-25-14.186190.parquet'
- split: 2023_11_23T08_52_22.157398
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-11-23T08-52-22.157398.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-11-23T08-52-22.157398.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_11_23T08_25_14.186190
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-11-23T08-25-14.186190.parquet'
- split: 2023_11_23T08_52_22.157398
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-11-23T08-52-22.157398.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-11-23T08-52-22.157398.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_11_23T08_25_14.186190
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-11-23T08-25-14.186190.parquet'
- split: 2023_11_23T08_52_22.157398
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-11-23T08-52-22.157398.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-11-23T08-52-22.157398.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_11_23T08_25_14.186190
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-11-23T08-25-14.186190.parquet'
- split: 2023_11_23T08_52_22.157398
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-11-23T08-52-22.157398.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-11-23T08-52-22.157398.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_11_23T08_25_14.186190
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-11-23T08-25-14.186190.parquet'
- split: 2023_11_23T08_52_22.157398
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-11-23T08-52-22.157398.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-11-23T08-52-22.157398.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_11_23T08_25_14.186190
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-11-23T08-25-14.186190.parquet'
- split: 2023_11_23T08_52_22.157398
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-11-23T08-52-22.157398.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-11-23T08-52-22.157398.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_11_23T08_25_14.186190
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-11-23T08-25-14.186190.parquet'
- split: 2023_11_23T08_52_22.157398
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-11-23T08-52-22.157398.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-11-23T08-52-22.157398.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_11_23T08_25_14.186190
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-11-23T08-25-14.186190.parquet'
- split: 2023_11_23T08_52_22.157398
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-11-23T08-52-22.157398.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-11-23T08-52-22.157398.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_11_23T08_25_14.186190
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-11-23T08-25-14.186190.parquet'
- split: 2023_11_23T08_52_22.157398
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-11-23T08-52-22.157398.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-11-23T08-52-22.157398.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_11_23T08_25_14.186190
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-11-23T08-25-14.186190.parquet'
- split: 2023_11_23T08_52_22.157398
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-11-23T08-52-22.157398.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-11-23T08-52-22.157398.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_11_23T08_25_14.186190
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-11-23T08-25-14.186190.parquet'
- split: 2023_11_23T08_52_22.157398
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-11-23T08-52-22.157398.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-11-23T08-52-22.157398.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_11_23T08_25_14.186190
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-11-23T08-25-14.186190.parquet'
- split: 2023_11_23T08_52_22.157398
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-11-23T08-52-22.157398.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-11-23T08-52-22.157398.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_11_23T08_25_14.186190
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-11-23T08-25-14.186190.parquet'
- split: 2023_11_23T08_52_22.157398
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-11-23T08-52-22.157398.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-11-23T08-52-22.157398.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_11_23T08_25_14.186190
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-11-23T08-25-14.186190.parquet'
- split: 2023_11_23T08_52_22.157398
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-11-23T08-52-22.157398.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-11-23T08-52-22.157398.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_11_23T08_25_14.186190
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-11-23T08-25-14.186190.parquet'
- split: 2023_11_23T08_52_22.157398
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-11-23T08-52-22.157398.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-11-23T08-52-22.157398.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_11_23T08_25_14.186190
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-11-23T08-25-14.186190.parquet'
- split: 2023_11_23T08_52_22.157398
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-11-23T08-52-22.157398.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-11-23T08-52-22.157398.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_11_23T08_25_14.186190
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-11-23T08-25-14.186190.parquet'
- split: 2023_11_23T08_52_22.157398
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-11-23T08-52-22.157398.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-11-23T08-52-22.157398.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_11_23T08_25_14.186190
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-11-23T08-25-14.186190.parquet'
- split: 2023_11_23T08_52_22.157398
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-11-23T08-52-22.157398.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-11-23T08-52-22.157398.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_11_23T08_25_14.186190
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-11-23T08-25-14.186190.parquet'
- split: 2023_11_23T08_52_22.157398
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-11-23T08-52-22.157398.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-11-23T08-52-22.157398.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_11_23T08_25_14.186190
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-11-23T08-25-14.186190.parquet'
- split: 2023_11_23T08_52_22.157398
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-11-23T08-52-22.157398.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-11-23T08-52-22.157398.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_11_23T08_25_14.186190
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-11-23T08-25-14.186190.parquet'
- split: 2023_11_23T08_52_22.157398
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-11-23T08-52-22.157398.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-11-23T08-52-22.157398.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_11_23T08_25_14.186190
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-11-23T08-25-14.186190.parquet'
- split: 2023_11_23T08_52_22.157398
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-11-23T08-52-22.157398.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-11-23T08-52-22.157398.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_11_23T08_25_14.186190
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-11-23T08-25-14.186190.parquet'
- split: 2023_11_23T08_52_22.157398
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-11-23T08-52-22.157398.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-11-23T08-52-22.157398.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_11_23T08_25_14.186190
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-11-23T08-25-14.186190.parquet'
- split: 2023_11_23T08_52_22.157398
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-11-23T08-52-22.157398.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-11-23T08-52-22.157398.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_11_23T08_25_14.186190
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-11-23T08-25-14.186190.parquet'
- split: 2023_11_23T08_52_22.157398
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-11-23T08-52-22.157398.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-11-23T08-52-22.157398.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_11_23T08_25_14.186190
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-11-23T08-25-14.186190.parquet'
- split: 2023_11_23T08_52_22.157398
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-11-23T08-52-22.157398.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-11-23T08-52-22.157398.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_11_23T08_25_14.186190
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-11-23T08-25-14.186190.parquet'
- split: 2023_11_23T08_52_22.157398
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-11-23T08-52-22.157398.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-11-23T08-52-22.157398.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_11_23T08_25_14.186190
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-11-23T08-25-14.186190.parquet'
- split: 2023_11_23T08_52_22.157398
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-11-23T08-52-22.157398.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-11-23T08-52-22.157398.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_11_23T08_25_14.186190
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-11-23T08-25-14.186190.parquet'
- split: 2023_11_23T08_52_22.157398
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-11-23T08-52-22.157398.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-11-23T08-52-22.157398.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_11_23T08_25_14.186190
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-11-23T08-25-14.186190.parquet'
- split: 2023_11_23T08_52_22.157398
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-11-23T08-52-22.157398.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-11-23T08-52-22.157398.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_11_23T08_25_14.186190
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-11-23T08-25-14.186190.parquet'
- split: 2023_11_23T08_52_22.157398
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-11-23T08-52-22.157398.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-11-23T08-52-22.157398.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_11_23T08_25_14.186190
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-11-23T08-25-14.186190.parquet'
- split: 2023_11_23T08_52_22.157398
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-11-23T08-52-22.157398.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-11-23T08-52-22.157398.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_11_23T08_25_14.186190
path:
- '**/details_harness|hendrycksTest-management|5_2023-11-23T08-25-14.186190.parquet'
- split: 2023_11_23T08_52_22.157398
path:
- '**/details_harness|hendrycksTest-management|5_2023-11-23T08-52-22.157398.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-11-23T08-52-22.157398.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_11_23T08_25_14.186190
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-11-23T08-25-14.186190.parquet'
- split: 2023_11_23T08_52_22.157398
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-11-23T08-52-22.157398.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-11-23T08-52-22.157398.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_11_23T08_25_14.186190
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-11-23T08-25-14.186190.parquet'
- split: 2023_11_23T08_52_22.157398
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-11-23T08-52-22.157398.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-11-23T08-52-22.157398.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_11_23T08_25_14.186190
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-11-23T08-25-14.186190.parquet'
- split: 2023_11_23T08_52_22.157398
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-11-23T08-52-22.157398.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-11-23T08-52-22.157398.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_11_23T08_25_14.186190
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-11-23T08-25-14.186190.parquet'
- split: 2023_11_23T08_52_22.157398
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-11-23T08-52-22.157398.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-11-23T08-52-22.157398.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_11_23T08_25_14.186190
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-11-23T08-25-14.186190.parquet'
- split: 2023_11_23T08_52_22.157398
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-11-23T08-52-22.157398.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-11-23T08-52-22.157398.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_11_23T08_25_14.186190
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-11-23T08-25-14.186190.parquet'
- split: 2023_11_23T08_52_22.157398
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-11-23T08-52-22.157398.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-11-23T08-52-22.157398.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_11_23T08_25_14.186190
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-11-23T08-25-14.186190.parquet'
- split: 2023_11_23T08_52_22.157398
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-11-23T08-52-22.157398.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-11-23T08-52-22.157398.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_11_23T08_25_14.186190
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-11-23T08-25-14.186190.parquet'
- split: 2023_11_23T08_52_22.157398
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-11-23T08-52-22.157398.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-11-23T08-52-22.157398.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_11_23T08_25_14.186190
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-11-23T08-25-14.186190.parquet'
- split: 2023_11_23T08_52_22.157398
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-11-23T08-52-22.157398.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-11-23T08-52-22.157398.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_11_23T08_25_14.186190
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-11-23T08-25-14.186190.parquet'
- split: 2023_11_23T08_52_22.157398
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-11-23T08-52-22.157398.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-11-23T08-52-22.157398.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_11_23T08_25_14.186190
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-11-23T08-25-14.186190.parquet'
- split: 2023_11_23T08_52_22.157398
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-11-23T08-52-22.157398.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-11-23T08-52-22.157398.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_11_23T08_25_14.186190
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-11-23T08-25-14.186190.parquet'
- split: 2023_11_23T08_52_22.157398
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-11-23T08-52-22.157398.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-11-23T08-52-22.157398.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_11_23T08_25_14.186190
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-11-23T08-25-14.186190.parquet'
- split: 2023_11_23T08_52_22.157398
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-11-23T08-52-22.157398.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-11-23T08-52-22.157398.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_11_23T08_25_14.186190
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-11-23T08-25-14.186190.parquet'
- split: 2023_11_23T08_52_22.157398
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-11-23T08-52-22.157398.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-11-23T08-52-22.157398.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_11_23T08_25_14.186190
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-11-23T08-25-14.186190.parquet'
- split: 2023_11_23T08_52_22.157398
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-11-23T08-52-22.157398.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-11-23T08-52-22.157398.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_11_23T08_25_14.186190
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-11-23T08-25-14.186190.parquet'
- split: 2023_11_23T08_52_22.157398
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-11-23T08-52-22.157398.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-11-23T08-52-22.157398.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_11_23T08_25_14.186190
path:
- '**/details_harness|hendrycksTest-virology|5_2023-11-23T08-25-14.186190.parquet'
- split: 2023_11_23T08_52_22.157398
path:
- '**/details_harness|hendrycksTest-virology|5_2023-11-23T08-52-22.157398.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-11-23T08-52-22.157398.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_11_23T08_25_14.186190
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-11-23T08-25-14.186190.parquet'
- split: 2023_11_23T08_52_22.157398
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-11-23T08-52-22.157398.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-11-23T08-52-22.157398.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_11_23T08_25_14.186190
path:
- '**/details_harness|truthfulqa:mc|0_2023-11-23T08-25-14.186190.parquet'
- split: 2023_11_23T08_52_22.157398
path:
- '**/details_harness|truthfulqa:mc|0_2023-11-23T08-52-22.157398.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-11-23T08-52-22.157398.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_11_23T08_25_14.186190
path:
- '**/details_harness|winogrande|5_2023-11-23T08-25-14.186190.parquet'
- split: 2023_11_23T08_52_22.157398
path:
- '**/details_harness|winogrande|5_2023-11-23T08-52-22.157398.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-11-23T08-52-22.157398.parquet'
- config_name: results
data_files:
- split: 2023_11_23T08_25_14.186190
path:
- results_2023-11-23T08-25-14.186190.parquet
- split: 2023_11_23T08_52_22.157398
path:
- results_2023-11-23T08-52-22.157398.parquet
- split: latest
path:
- results_2023-11-23T08-52-22.157398.parquet
---
# Dataset Card for Evaluation run of microsoft/Orca-2-7b
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/microsoft/Orca-2-7b
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [microsoft/Orca-2-7b](https://huggingface.co/microsoft/Orca-2-7b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_microsoft__Orca-2-7b_public",
"harness_winogrande_5",
split="train")
```
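Beyond the default example, you can also load the aggregated metrics or a specific timestamped run by reusing the configuration and split names listed in the metadata above; a minimal sketch:
```python
from datasets import load_dataset

# Aggregated metrics across tasks; "latest" mirrors the most recent timestamped split
results = load_dataset("open-llm-leaderboard/details_microsoft__Orca-2-7b_public",
                       "results",
                       split="latest")

# Per-example details for one task from a specific run, addressed by its timestamp
run = load_dataset("open-llm-leaderboard/details_microsoft__Orca-2-7b_public",
                   "harness_winogrande_5",
                   split="2023_11_23T08_52_22.157398")
```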
## Latest results
These are the [latest results from run 2023-11-23T08:52:22.157398](https://huggingface.co/datasets/open-llm-leaderboard/details_microsoft__Orca-2-7b_public/blob/main/results_2023-11-23T08-52-22.157398.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each one in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.5591515182783672,
"acc_stderr": 0.03362651811696442,
"acc_norm": 0.5666849678033645,
"acc_norm_stderr": 0.03437864006901342,
"mc1": 0.3684210526315789,
"mc1_stderr": 0.016886551261046046,
"mc2": 0.5244663206388774,
"mc2_stderr": 0.016012530609803507,
"em": 0.3205746644295302,
"em_stderr": 0.004779419137797957,
"f1": 0.43866505872483647,
"f1_stderr": 0.004557698070527672
},
"harness|arc:challenge|25": {
"acc": 0.5119453924914675,
"acc_stderr": 0.014607220340597171,
"acc_norm": 0.5409556313993175,
"acc_norm_stderr": 0.01456229107360123
},
"harness|hellaswag|10": {
"acc": 0.5828520215096594,
"acc_stderr": 0.004920800313232742,
"acc_norm": 0.7619000199163514,
"acc_norm_stderr": 0.004250501643743773
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.29,
"acc_stderr": 0.045604802157206824,
"acc_norm": 0.29,
"acc_norm_stderr": 0.045604802157206824
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.5925925925925926,
"acc_stderr": 0.04244633238353228,
"acc_norm": 0.5925925925925926,
"acc_norm_stderr": 0.04244633238353228
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6776315789473685,
"acc_stderr": 0.03803510248351585,
"acc_norm": 0.6776315789473685,
"acc_norm_stderr": 0.03803510248351585
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.59,
"acc_stderr": 0.04943110704237102,
"acc_norm": 0.59,
"acc_norm_stderr": 0.04943110704237102
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6150943396226415,
"acc_stderr": 0.02994649856769995,
"acc_norm": 0.6150943396226415,
"acc_norm_stderr": 0.02994649856769995
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.5972222222222222,
"acc_stderr": 0.04101405519842426,
"acc_norm": 0.5972222222222222,
"acc_norm_stderr": 0.04101405519842426
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.44,
"acc_stderr": 0.049888765156985884,
"acc_norm": 0.44,
"acc_norm_stderr": 0.049888765156985884
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.45,
"acc_stderr": 0.049999999999999996,
"acc_norm": 0.45,
"acc_norm_stderr": 0.049999999999999996
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.5260115606936416,
"acc_stderr": 0.03807301726504513,
"acc_norm": 0.5260115606936416,
"acc_norm_stderr": 0.03807301726504513
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.28431372549019607,
"acc_stderr": 0.04488482852329017,
"acc_norm": 0.28431372549019607,
"acc_norm_stderr": 0.04488482852329017
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.67,
"acc_stderr": 0.04725815626252609,
"acc_norm": 0.67,
"acc_norm_stderr": 0.04725815626252609
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.4765957446808511,
"acc_stderr": 0.032650194750335815,
"acc_norm": 0.4765957446808511,
"acc_norm_stderr": 0.032650194750335815
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.2894736842105263,
"acc_stderr": 0.04266339443159394,
"acc_norm": 0.2894736842105263,
"acc_norm_stderr": 0.04266339443159394
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.496551724137931,
"acc_stderr": 0.04166567577101579,
"acc_norm": 0.496551724137931,
"acc_norm_stderr": 0.04166567577101579
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.35714285714285715,
"acc_stderr": 0.024677862841332783,
"acc_norm": 0.35714285714285715,
"acc_norm_stderr": 0.024677862841332783
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.38095238095238093,
"acc_stderr": 0.04343525428949097,
"acc_norm": 0.38095238095238093,
"acc_norm_stderr": 0.04343525428949097
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.41,
"acc_stderr": 0.049431107042371025,
"acc_norm": 0.41,
"acc_norm_stderr": 0.049431107042371025
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.6161290322580645,
"acc_stderr": 0.02766618207553964,
"acc_norm": 0.6161290322580645,
"acc_norm_stderr": 0.02766618207553964
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.4039408866995074,
"acc_stderr": 0.03452453903822039,
"acc_norm": 0.4039408866995074,
"acc_norm_stderr": 0.03452453903822039
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.53,
"acc_stderr": 0.05016135580465919,
"acc_norm": 0.53,
"acc_norm_stderr": 0.05016135580465919
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7393939393939394,
"acc_stderr": 0.034277431758165236,
"acc_norm": 0.7393939393939394,
"acc_norm_stderr": 0.034277431758165236
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7272727272727273,
"acc_stderr": 0.03173071239071724,
"acc_norm": 0.7272727272727273,
"acc_norm_stderr": 0.03173071239071724
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8290155440414507,
"acc_stderr": 0.02717121368316453,
"acc_norm": 0.8290155440414507,
"acc_norm_stderr": 0.02717121368316453
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.5282051282051282,
"acc_stderr": 0.025310639254933882,
"acc_norm": 0.5282051282051282,
"acc_norm_stderr": 0.025310639254933882
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.32222222222222224,
"acc_stderr": 0.028493465091028597,
"acc_norm": 0.32222222222222224,
"acc_norm_stderr": 0.028493465091028597
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.5210084033613446,
"acc_stderr": 0.03244980849990029,
"acc_norm": 0.5210084033613446,
"acc_norm_stderr": 0.03244980849990029
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.3576158940397351,
"acc_stderr": 0.03913453431177258,
"acc_norm": 0.3576158940397351,
"acc_norm_stderr": 0.03913453431177258
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.7504587155963303,
"acc_stderr": 0.018553897629501628,
"acc_norm": 0.7504587155963303,
"acc_norm_stderr": 0.018553897629501628
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.375,
"acc_stderr": 0.033016908987210894,
"acc_norm": 0.375,
"acc_norm_stderr": 0.033016908987210894
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7450980392156863,
"acc_stderr": 0.030587591351604246,
"acc_norm": 0.7450980392156863,
"acc_norm_stderr": 0.030587591351604246
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7679324894514767,
"acc_stderr": 0.02747974455080851,
"acc_norm": 0.7679324894514767,
"acc_norm_stderr": 0.02747974455080851
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6457399103139013,
"acc_stderr": 0.032100621541349864,
"acc_norm": 0.6457399103139013,
"acc_norm_stderr": 0.032100621541349864
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.732824427480916,
"acc_stderr": 0.03880848301082396,
"acc_norm": 0.732824427480916,
"acc_norm_stderr": 0.03880848301082396
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.6859504132231405,
"acc_stderr": 0.042369647530410184,
"acc_norm": 0.6859504132231405,
"acc_norm_stderr": 0.042369647530410184
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.6666666666666666,
"acc_stderr": 0.04557239513497751,
"acc_norm": 0.6666666666666666,
"acc_norm_stderr": 0.04557239513497751
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.6441717791411042,
"acc_stderr": 0.03761521380046734,
"acc_norm": 0.6441717791411042,
"acc_norm_stderr": 0.03761521380046734
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.375,
"acc_stderr": 0.04595091388086298,
"acc_norm": 0.375,
"acc_norm_stderr": 0.04595091388086298
},
"harness|hendrycksTest-management|5": {
"acc": 0.7572815533980582,
"acc_stderr": 0.04245022486384495,
"acc_norm": 0.7572815533980582,
"acc_norm_stderr": 0.04245022486384495
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8376068376068376,
"acc_stderr": 0.02416161812798774,
"acc_norm": 0.8376068376068376,
"acc_norm_stderr": 0.02416161812798774
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.65,
"acc_stderr": 0.047937248544110196,
"acc_norm": 0.65,
"acc_norm_stderr": 0.047937248544110196
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.756066411238825,
"acc_stderr": 0.015357212665829468,
"acc_norm": 0.756066411238825,
"acc_norm_stderr": 0.015357212665829468
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.6416184971098265,
"acc_stderr": 0.025816756791584183,
"acc_norm": 0.6416184971098265,
"acc_norm_stderr": 0.025816756791584183
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.34413407821229053,
"acc_stderr": 0.015889221313307094,
"acc_norm": 0.34413407821229053,
"acc_norm_stderr": 0.015889221313307094
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.6241830065359477,
"acc_stderr": 0.02773283435336394,
"acc_norm": 0.6241830065359477,
"acc_norm_stderr": 0.02773283435336394
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.617363344051447,
"acc_stderr": 0.027604689028581986,
"acc_norm": 0.617363344051447,
"acc_norm_stderr": 0.027604689028581986
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.654320987654321,
"acc_stderr": 0.026462487777001872,
"acc_norm": 0.654320987654321,
"acc_norm_stderr": 0.026462487777001872
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.39361702127659576,
"acc_stderr": 0.029144544781596154,
"acc_norm": 0.39361702127659576,
"acc_norm_stderr": 0.029144544781596154
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.408735332464146,
"acc_stderr": 0.012555701346703385,
"acc_norm": 0.408735332464146,
"acc_norm_stderr": 0.012555701346703385
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.5183823529411765,
"acc_stderr": 0.030352303395351964,
"acc_norm": 0.5183823529411765,
"acc_norm_stderr": 0.030352303395351964
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.5441176470588235,
"acc_stderr": 0.020148939420415745,
"acc_norm": 0.5441176470588235,
"acc_norm_stderr": 0.020148939420415745
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6363636363636364,
"acc_stderr": 0.04607582090719976,
"acc_norm": 0.6363636363636364,
"acc_norm_stderr": 0.04607582090719976
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.636734693877551,
"acc_stderr": 0.030789051139030806,
"acc_norm": 0.636734693877551,
"acc_norm_stderr": 0.030789051139030806
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.6716417910447762,
"acc_stderr": 0.033206858897443244,
"acc_norm": 0.6716417910447762,
"acc_norm_stderr": 0.033206858897443244
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.78,
"acc_stderr": 0.041633319989322626,
"acc_norm": 0.78,
"acc_norm_stderr": 0.041633319989322626
},
"harness|hendrycksTest-virology|5": {
"acc": 0.4819277108433735,
"acc_stderr": 0.038899512528272166,
"acc_norm": 0.4819277108433735,
"acc_norm_stderr": 0.038899512528272166
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.7602339181286549,
"acc_stderr": 0.03274485211946956,
"acc_norm": 0.7602339181286549,
"acc_norm_stderr": 0.03274485211946956
},
"harness|truthfulqa:mc|0": {
"mc1": 0.3684210526315789,
"mc1_stderr": 0.016886551261046046,
"mc2": 0.5244663206388774,
"mc2_stderr": 0.016012530609803507
},
"harness|winogrande|5": {
"acc": 0.7348066298342542,
"acc_stderr": 0.01240654946619286
},
"harness|drop|3": {
"em": 0.3205746644295302,
"em_stderr": 0.004779419137797957,
"f1": 0.43866505872483647,
"f1_stderr": 0.004557698070527672
},
"harness|gsm8k|5": {
"acc": 0.1470811220621683,
"acc_stderr": 0.009756063660359875
}
}
```
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
thiagolira/LatinYoutube | ---
language:
- la
license: afl-3.0
pretty_name: Latin Youtube
dataset_info:
features:
- name: audio
dtype:
audio:
sampling_rate: 16000
- name: text
dtype: string
- name: start_time
dtype: int32
- name: end_time
dtype: int32
- name: channel
dtype: string
- name: is_corrected
dtype: bool
splits:
- name: train
num_bytes: 825247203.4571464
num_examples: 2839
download_size: 928950849
dataset_size: 825247203.4571464
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
This is a dataset with text/audio pairs of Classical Latin extracted from YouTube videos from the channels [Scorpio Martianus](https://www.youtube.com/@ScorpioMartianus), [LATINITIUS](https://www.youtube.com/@Latinitium) and [Musa Pedestris](https://www.youtube.com/@MusaPedestris).
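A minimal loading sketch based on the schema above (the comments on field meanings are assumptions inferred from the feature names):
```python
from datasets import load_dataset

ds = load_dataset("thiagolira/LatinYoutube", split="train")

sample = ds[0]
waveform = sample["audio"]["array"]      # decoded audio at the declared 16 kHz sampling rate
transcript = sample["text"]              # Latin transcript for the clip
window = (sample["start_time"], sample["end_time"])  # segment offsets in the source video (assumed)
print(sample["channel"], window, transcript[:80])
```
The `is_corrected` flag presumably marks segments whose transcripts were manually reviewed. |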
anti-ai/vi_news_wseg | ---
dataset_info:
features:
- name: content
dtype: string
splits:
- name: train
num_bytes: 7475175746
num_examples: 1538904
download_size: 3805413579
dataset_size: 7475175746
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
license: apache-2.0
language:
- vi
pretty_name: vinews
size_categories:
- 1M<n<10M
---
# Dataset Card for "vi_news_wseg"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
sjdata/single_speaker_en_test_librivox | ---
dataset_info:
features:
- name: audio
dtype:
audio:
sampling_rate: 16000
- name: text
dtype: string
- name: normalized_text
dtype: string
splits:
- name: train
num_bytes: 20226057306.427
num_examples: 139411
download_size: 1857190033
dataset_size: 20226057306.427
---
# Dataset Card for "single_speaker_en_test_librivox"
# Created for testing, not suggested for production
#### Dataset Summary
The corpus consists of a single speaker extracted from a LibriVox audiobook.
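A minimal loading sketch, assuming the features declared in the metadata above:
```python
from datasets import load_dataset

ds = load_dataset("sjdata/single_speaker_en_test_librivox", split="train")

sample = ds[0]
print(sample["audio"]["sampling_rate"])  # 16000, per the declared feature
print(sample["text"])                    # raw transcript
print(sample["normalized_text"])         # normalized transcript
```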
#### Languages
The audio is in English.
#### Source Data
Initial Data Collection and Normalization
The voices used in my datasets come from volunteers who have donated their time and voices to open-source LibriVox projects. Please respect their privacy.
#### Licensing Information
MIT
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
danielz01/laion-5b | ---
license: cc-by-4.0
task_categories:
- feature-extraction
- zero-shot-classification
language:
- en
size_categories:
- 1B<n<10B
--- |
open-llm-leaderboard/details_InferenceIllusionist__Excalibur-7b-DPO | ---
pretty_name: Evaluation run of InferenceIllusionist/Excalibur-7b-DPO
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [InferenceIllusionist/Excalibur-7b-DPO](https://huggingface.co/InferenceIllusionist/Excalibur-7b-DPO)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_InferenceIllusionist__Excalibur-7b-DPO\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-03-28T06:01:53.992926](https://huggingface.co/datasets/open-llm-leaderboard/details_InferenceIllusionist__Excalibur-7b-DPO/blob/main/results_2024-03-28T06-01-53.992926.json) (note
\ that there might be results for other tasks in the repo if successive evals didn't\
\ cover the same tasks; you can find each one in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6585551254608566,\n\
\ \"acc_stderr\": 0.03185272081159569,\n \"acc_norm\": 0.6593615646245452,\n\
\ \"acc_norm_stderr\": 0.03250028579692,\n \"mc1\": 0.5348837209302325,\n\
\ \"mc1_stderr\": 0.017460849975873972,\n \"mc2\": 0.7081813831814938,\n\
\ \"mc2_stderr\": 0.014609886961389094\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.6757679180887372,\n \"acc_stderr\": 0.013678810399518824,\n\
\ \"acc_norm\": 0.7090443686006825,\n \"acc_norm_stderr\": 0.013273077865907595\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.7030472017526389,\n\
\ \"acc_stderr\": 0.004559817589182069,\n \"acc_norm\": 0.8793069109739096,\n\
\ \"acc_norm_stderr\": 0.0032510448518843103\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.35,\n \"acc_stderr\": 0.047937248544110196,\n \
\ \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.047937248544110196\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6370370370370371,\n\
\ \"acc_stderr\": 0.04153948404742398,\n \"acc_norm\": 0.6370370370370371,\n\
\ \"acc_norm_stderr\": 0.04153948404742398\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.7039473684210527,\n \"acc_stderr\": 0.03715062154998904,\n\
\ \"acc_norm\": 0.7039473684210527,\n \"acc_norm_stderr\": 0.03715062154998904\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.65,\n\
\ \"acc_stderr\": 0.0479372485441102,\n \"acc_norm\": 0.65,\n \
\ \"acc_norm_stderr\": 0.0479372485441102\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.6943396226415094,\n \"acc_stderr\": 0.028353298073322666,\n\
\ \"acc_norm\": 0.6943396226415094,\n \"acc_norm_stderr\": 0.028353298073322666\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7847222222222222,\n\
\ \"acc_stderr\": 0.03437079344106135,\n \"acc_norm\": 0.7847222222222222,\n\
\ \"acc_norm_stderr\": 0.03437079344106135\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.45,\n \"acc_stderr\": 0.05,\n \"acc_norm\"\
: 0.45,\n \"acc_norm_stderr\": 0.05\n },\n \"harness|hendrycksTest-college_computer_science|5\"\
: {\n \"acc\": 0.56,\n \"acc_stderr\": 0.049888765156985884,\n \
\ \"acc_norm\": 0.56,\n \"acc_norm_stderr\": 0.049888765156985884\n \
\ },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\"\
: 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.3,\n\
\ \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-college_medicine|5\"\
: {\n \"acc\": 0.6820809248554913,\n \"acc_stderr\": 0.0355068398916558,\n\
\ \"acc_norm\": 0.6820809248554913,\n \"acc_norm_stderr\": 0.0355068398916558\n\
\ },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.43137254901960786,\n\
\ \"acc_stderr\": 0.04928099597287534,\n \"acc_norm\": 0.43137254901960786,\n\
\ \"acc_norm_stderr\": 0.04928099597287534\n },\n \"harness|hendrycksTest-computer_security|5\"\
: {\n \"acc\": 0.76,\n \"acc_stderr\": 0.04292346959909284,\n \
\ \"acc_norm\": 0.76,\n \"acc_norm_stderr\": 0.04292346959909284\n \
\ },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.5914893617021276,\n\
\ \"acc_stderr\": 0.032134180267015755,\n \"acc_norm\": 0.5914893617021276,\n\
\ \"acc_norm_stderr\": 0.032134180267015755\n },\n \"harness|hendrycksTest-econometrics|5\"\
: {\n \"acc\": 0.5175438596491229,\n \"acc_stderr\": 0.04700708033551038,\n\
\ \"acc_norm\": 0.5175438596491229,\n \"acc_norm_stderr\": 0.04700708033551038\n\
\ },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\"\
: 0.5655172413793104,\n \"acc_stderr\": 0.04130740879555497,\n \"\
acc_norm\": 0.5655172413793104,\n \"acc_norm_stderr\": 0.04130740879555497\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.41534391534391535,\n \"acc_stderr\": 0.025379524910778394,\n \"\
acc_norm\": 0.41534391534391535,\n \"acc_norm_stderr\": 0.025379524910778394\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.5,\n\
\ \"acc_stderr\": 0.04472135954999579,\n \"acc_norm\": 0.5,\n \
\ \"acc_norm_stderr\": 0.04472135954999579\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.33,\n \"acc_stderr\": 0.047258156262526045,\n \
\ \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.047258156262526045\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\"\
: 0.7774193548387097,\n \"acc_stderr\": 0.023664216671642518,\n \"\
acc_norm\": 0.7774193548387097,\n \"acc_norm_stderr\": 0.023664216671642518\n\
\ },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\"\
: 0.5221674876847291,\n \"acc_stderr\": 0.03514528562175007,\n \"\
acc_norm\": 0.5221674876847291,\n \"acc_norm_stderr\": 0.03514528562175007\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.71,\n \"acc_stderr\": 0.045604802157206845,\n \"acc_norm\"\
: 0.71,\n \"acc_norm_stderr\": 0.045604802157206845\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7878787878787878,\n \"acc_stderr\": 0.031922715695483,\n\
\ \"acc_norm\": 0.7878787878787878,\n \"acc_norm_stderr\": 0.031922715695483\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.7929292929292929,\n \"acc_stderr\": 0.028869778460267042,\n \"\
acc_norm\": 0.7929292929292929,\n \"acc_norm_stderr\": 0.028869778460267042\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.9067357512953368,\n \"acc_stderr\": 0.02098685459328972,\n\
\ \"acc_norm\": 0.9067357512953368,\n \"acc_norm_stderr\": 0.02098685459328972\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.6846153846153846,\n \"acc_stderr\": 0.023559646983189946,\n\
\ \"acc_norm\": 0.6846153846153846,\n \"acc_norm_stderr\": 0.023559646983189946\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.3592592592592593,\n \"acc_stderr\": 0.029252905927251972,\n \
\ \"acc_norm\": 0.3592592592592593,\n \"acc_norm_stderr\": 0.029252905927251972\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.680672268907563,\n \"acc_stderr\": 0.030283995525884396,\n \
\ \"acc_norm\": 0.680672268907563,\n \"acc_norm_stderr\": 0.030283995525884396\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.31125827814569534,\n \"acc_stderr\": 0.03780445850526732,\n \"\
acc_norm\": 0.31125827814569534,\n \"acc_norm_stderr\": 0.03780445850526732\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8330275229357799,\n \"acc_stderr\": 0.01599015488507337,\n \"\
acc_norm\": 0.8330275229357799,\n \"acc_norm_stderr\": 0.01599015488507337\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.5092592592592593,\n \"acc_stderr\": 0.034093869469927006,\n \"\
acc_norm\": 0.5092592592592593,\n \"acc_norm_stderr\": 0.034093869469927006\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.8480392156862745,\n \"acc_stderr\": 0.025195658428931792,\n \"\
acc_norm\": 0.8480392156862745,\n \"acc_norm_stderr\": 0.025195658428931792\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.7974683544303798,\n \"acc_stderr\": 0.026160568246601457,\n \
\ \"acc_norm\": 0.7974683544303798,\n \"acc_norm_stderr\": 0.026160568246601457\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.7040358744394619,\n\
\ \"acc_stderr\": 0.030636591348699803,\n \"acc_norm\": 0.7040358744394619,\n\
\ \"acc_norm_stderr\": 0.030636591348699803\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.7786259541984732,\n \"acc_stderr\": 0.03641297081313728,\n\
\ \"acc_norm\": 0.7786259541984732,\n \"acc_norm_stderr\": 0.03641297081313728\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.7851239669421488,\n \"acc_stderr\": 0.037494924487096966,\n \"\
acc_norm\": 0.7851239669421488,\n \"acc_norm_stderr\": 0.037494924487096966\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.8055555555555556,\n\
\ \"acc_stderr\": 0.038260763248848646,\n \"acc_norm\": 0.8055555555555556,\n\
\ \"acc_norm_stderr\": 0.038260763248848646\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7852760736196319,\n \"acc_stderr\": 0.03226219377286774,\n\
\ \"acc_norm\": 0.7852760736196319,\n \"acc_norm_stderr\": 0.03226219377286774\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.49107142857142855,\n\
\ \"acc_stderr\": 0.04745033255489123,\n \"acc_norm\": 0.49107142857142855,\n\
\ \"acc_norm_stderr\": 0.04745033255489123\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7961165048543689,\n \"acc_stderr\": 0.039891398595317706,\n\
\ \"acc_norm\": 0.7961165048543689,\n \"acc_norm_stderr\": 0.039891398595317706\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8803418803418803,\n\
\ \"acc_stderr\": 0.021262719400406974,\n \"acc_norm\": 0.8803418803418803,\n\
\ \"acc_norm_stderr\": 0.021262719400406974\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.73,\n \"acc_stderr\": 0.044619604333847394,\n \
\ \"acc_norm\": 0.73,\n \"acc_norm_stderr\": 0.044619604333847394\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8301404853128991,\n\
\ \"acc_stderr\": 0.013428186370608311,\n \"acc_norm\": 0.8301404853128991,\n\
\ \"acc_norm_stderr\": 0.013428186370608311\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.7369942196531792,\n \"acc_stderr\": 0.02370309952525817,\n\
\ \"acc_norm\": 0.7369942196531792,\n \"acc_norm_stderr\": 0.02370309952525817\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.423463687150838,\n\
\ \"acc_stderr\": 0.016525425898773503,\n \"acc_norm\": 0.423463687150838,\n\
\ \"acc_norm_stderr\": 0.016525425898773503\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.7287581699346405,\n \"acc_stderr\": 0.02545775669666788,\n\
\ \"acc_norm\": 0.7287581699346405,\n \"acc_norm_stderr\": 0.02545775669666788\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7202572347266881,\n\
\ \"acc_stderr\": 0.025494259350694912,\n \"acc_norm\": 0.7202572347266881,\n\
\ \"acc_norm_stderr\": 0.025494259350694912\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.7623456790123457,\n \"acc_stderr\": 0.02368359183700856,\n\
\ \"acc_norm\": 0.7623456790123457,\n \"acc_norm_stderr\": 0.02368359183700856\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.5035460992907801,\n \"acc_stderr\": 0.02982674915328092,\n \
\ \"acc_norm\": 0.5035460992907801,\n \"acc_norm_stderr\": 0.02982674915328092\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.47392438070404175,\n\
\ \"acc_stderr\": 0.012752858346533127,\n \"acc_norm\": 0.47392438070404175,\n\
\ \"acc_norm_stderr\": 0.012752858346533127\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.6801470588235294,\n \"acc_stderr\": 0.02833295951403121,\n\
\ \"acc_norm\": 0.6801470588235294,\n \"acc_norm_stderr\": 0.02833295951403121\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6862745098039216,\n \"acc_stderr\": 0.018771683893528176,\n \
\ \"acc_norm\": 0.6862745098039216,\n \"acc_norm_stderr\": 0.018771683893528176\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6909090909090909,\n\
\ \"acc_stderr\": 0.044262946482000985,\n \"acc_norm\": 0.6909090909090909,\n\
\ \"acc_norm_stderr\": 0.044262946482000985\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.7428571428571429,\n \"acc_stderr\": 0.02797982353874455,\n\
\ \"acc_norm\": 0.7428571428571429,\n \"acc_norm_stderr\": 0.02797982353874455\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8557213930348259,\n\
\ \"acc_stderr\": 0.024845753212306046,\n \"acc_norm\": 0.8557213930348259,\n\
\ \"acc_norm_stderr\": 0.024845753212306046\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.87,\n \"acc_stderr\": 0.033799766898963086,\n \
\ \"acc_norm\": 0.87,\n \"acc_norm_stderr\": 0.033799766898963086\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5662650602409639,\n\
\ \"acc_stderr\": 0.03858158940685516,\n \"acc_norm\": 0.5662650602409639,\n\
\ \"acc_norm_stderr\": 0.03858158940685516\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8421052631578947,\n \"acc_stderr\": 0.027966785859160896,\n\
\ \"acc_norm\": 0.8421052631578947,\n \"acc_norm_stderr\": 0.027966785859160896\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.5348837209302325,\n\
\ \"mc1_stderr\": 0.017460849975873972,\n \"mc2\": 0.7081813831814938,\n\
\ \"mc2_stderr\": 0.014609886961389094\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.824782951854775,\n \"acc_stderr\": 0.010684179227706168\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.6542835481425322,\n \
\ \"acc_stderr\": 0.013100422990441568\n }\n}\n```"
repo_url: https://huggingface.co/InferenceIllusionist/Excalibur-7b-DPO
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_03_28T06_01_53.992926
path:
- '**/details_harness|arc:challenge|25_2024-03-28T06-01-53.992926.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-03-28T06-01-53.992926.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_03_28T06_01_53.992926
path:
- '**/details_harness|gsm8k|5_2024-03-28T06-01-53.992926.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-03-28T06-01-53.992926.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_03_28T06_01_53.992926
path:
- '**/details_harness|hellaswag|10_2024-03-28T06-01-53.992926.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-03-28T06-01-53.992926.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_03_28T06_01_53.992926
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-28T06-01-53.992926.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-28T06-01-53.992926.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-28T06-01-53.992926.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-28T06-01-53.992926.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-28T06-01-53.992926.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-28T06-01-53.992926.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-28T06-01-53.992926.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-28T06-01-53.992926.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-28T06-01-53.992926.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-28T06-01-53.992926.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-28T06-01-53.992926.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-28T06-01-53.992926.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-28T06-01-53.992926.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-28T06-01-53.992926.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-28T06-01-53.992926.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-28T06-01-53.992926.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-28T06-01-53.992926.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-28T06-01-53.992926.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-28T06-01-53.992926.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-28T06-01-53.992926.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-28T06-01-53.992926.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-28T06-01-53.992926.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-28T06-01-53.992926.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-28T06-01-53.992926.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-28T06-01-53.992926.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-28T06-01-53.992926.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-28T06-01-53.992926.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-28T06-01-53.992926.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-28T06-01-53.992926.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-28T06-01-53.992926.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-28T06-01-53.992926.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-28T06-01-53.992926.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-28T06-01-53.992926.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-28T06-01-53.992926.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-28T06-01-53.992926.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-28T06-01-53.992926.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-28T06-01-53.992926.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-28T06-01-53.992926.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-28T06-01-53.992926.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-28T06-01-53.992926.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-28T06-01-53.992926.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-28T06-01-53.992926.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-28T06-01-53.992926.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-28T06-01-53.992926.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-28T06-01-53.992926.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-28T06-01-53.992926.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-28T06-01-53.992926.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-28T06-01-53.992926.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-28T06-01-53.992926.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-28T06-01-53.992926.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-28T06-01-53.992926.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-28T06-01-53.992926.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-28T06-01-53.992926.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-28T06-01-53.992926.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-28T06-01-53.992926.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-28T06-01-53.992926.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-28T06-01-53.992926.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-28T06-01-53.992926.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-28T06-01-53.992926.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-28T06-01-53.992926.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-28T06-01-53.992926.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-28T06-01-53.992926.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-28T06-01-53.992926.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-28T06-01-53.992926.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-28T06-01-53.992926.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-28T06-01-53.992926.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-28T06-01-53.992926.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-28T06-01-53.992926.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-28T06-01-53.992926.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-28T06-01-53.992926.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-28T06-01-53.992926.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-28T06-01-53.992926.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-28T06-01-53.992926.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-28T06-01-53.992926.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-28T06-01-53.992926.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-28T06-01-53.992926.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-28T06-01-53.992926.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-28T06-01-53.992926.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-28T06-01-53.992926.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-28T06-01-53.992926.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-28T06-01-53.992926.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-28T06-01-53.992926.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-28T06-01-53.992926.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-28T06-01-53.992926.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-28T06-01-53.992926.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-28T06-01-53.992926.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-28T06-01-53.992926.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-28T06-01-53.992926.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-28T06-01-53.992926.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-28T06-01-53.992926.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-28T06-01-53.992926.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-28T06-01-53.992926.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-28T06-01-53.992926.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-28T06-01-53.992926.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-28T06-01-53.992926.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-28T06-01-53.992926.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-28T06-01-53.992926.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-28T06-01-53.992926.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-28T06-01-53.992926.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-28T06-01-53.992926.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-28T06-01-53.992926.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-28T06-01-53.992926.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-28T06-01-53.992926.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-28T06-01-53.992926.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-28T06-01-53.992926.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-28T06-01-53.992926.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-28T06-01-53.992926.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-28T06-01-53.992926.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-28T06-01-53.992926.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-28T06-01-53.992926.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-28T06-01-53.992926.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-28T06-01-53.992926.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-28T06-01-53.992926.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-28T06-01-53.992926.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_03_28T06_01_53.992926
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-28T06-01-53.992926.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-28T06-01-53.992926.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_03_28T06_01_53.992926
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-28T06-01-53.992926.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-28T06-01-53.992926.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_03_28T06_01_53.992926
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-28T06-01-53.992926.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-28T06-01-53.992926.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_03_28T06_01_53.992926
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-28T06-01-53.992926.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-28T06-01-53.992926.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_03_28T06_01_53.992926
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-28T06-01-53.992926.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-28T06-01-53.992926.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_03_28T06_01_53.992926
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-28T06-01-53.992926.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-28T06-01-53.992926.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_03_28T06_01_53.992926
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-28T06-01-53.992926.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-28T06-01-53.992926.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_03_28T06_01_53.992926
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-28T06-01-53.992926.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-28T06-01-53.992926.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_03_28T06_01_53.992926
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-28T06-01-53.992926.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-28T06-01-53.992926.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_03_28T06_01_53.992926
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-28T06-01-53.992926.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-28T06-01-53.992926.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_03_28T06_01_53.992926
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-28T06-01-53.992926.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-28T06-01-53.992926.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_03_28T06_01_53.992926
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-28T06-01-53.992926.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-28T06-01-53.992926.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_03_28T06_01_53.992926
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-28T06-01-53.992926.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-28T06-01-53.992926.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_03_28T06_01_53.992926
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-28T06-01-53.992926.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-28T06-01-53.992926.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_03_28T06_01_53.992926
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-28T06-01-53.992926.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-28T06-01-53.992926.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_03_28T06_01_53.992926
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-28T06-01-53.992926.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-28T06-01-53.992926.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_03_28T06_01_53.992926
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-28T06-01-53.992926.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-28T06-01-53.992926.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_03_28T06_01_53.992926
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-28T06-01-53.992926.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-28T06-01-53.992926.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_03_28T06_01_53.992926
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-28T06-01-53.992926.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-28T06-01-53.992926.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_03_28T06_01_53.992926
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-28T06-01-53.992926.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-28T06-01-53.992926.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_03_28T06_01_53.992926
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-28T06-01-53.992926.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-28T06-01-53.992926.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_03_28T06_01_53.992926
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-28T06-01-53.992926.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-28T06-01-53.992926.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_03_28T06_01_53.992926
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-28T06-01-53.992926.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-28T06-01-53.992926.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_03_28T06_01_53.992926
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-28T06-01-53.992926.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-28T06-01-53.992926.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_03_28T06_01_53.992926
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-28T06-01-53.992926.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-28T06-01-53.992926.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_03_28T06_01_53.992926
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-28T06-01-53.992926.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-28T06-01-53.992926.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_03_28T06_01_53.992926
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-28T06-01-53.992926.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-28T06-01-53.992926.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_03_28T06_01_53.992926
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-28T06-01-53.992926.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-28T06-01-53.992926.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_03_28T06_01_53.992926
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-28T06-01-53.992926.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-28T06-01-53.992926.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_03_28T06_01_53.992926
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-28T06-01-53.992926.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-28T06-01-53.992926.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_03_28T06_01_53.992926
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-28T06-01-53.992926.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-28T06-01-53.992926.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_03_28T06_01_53.992926
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-28T06-01-53.992926.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-28T06-01-53.992926.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_03_28T06_01_53.992926
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-28T06-01-53.992926.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-28T06-01-53.992926.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_03_28T06_01_53.992926
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-28T06-01-53.992926.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-28T06-01-53.992926.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_03_28T06_01_53.992926
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-28T06-01-53.992926.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-28T06-01-53.992926.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_03_28T06_01_53.992926
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-28T06-01-53.992926.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-28T06-01-53.992926.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_03_28T06_01_53.992926
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-28T06-01-53.992926.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-28T06-01-53.992926.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_03_28T06_01_53.992926
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-28T06-01-53.992926.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-28T06-01-53.992926.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_03_28T06_01_53.992926
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-28T06-01-53.992926.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-28T06-01-53.992926.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_03_28T06_01_53.992926
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-28T06-01-53.992926.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-28T06-01-53.992926.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_03_28T06_01_53.992926
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-28T06-01-53.992926.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-28T06-01-53.992926.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_03_28T06_01_53.992926
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-28T06-01-53.992926.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-28T06-01-53.992926.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_03_28T06_01_53.992926
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-28T06-01-53.992926.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-28T06-01-53.992926.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_03_28T06_01_53.992926
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-28T06-01-53.992926.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-28T06-01-53.992926.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_03_28T06_01_53.992926
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-28T06-01-53.992926.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-28T06-01-53.992926.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_03_28T06_01_53.992926
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-28T06-01-53.992926.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-28T06-01-53.992926.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_03_28T06_01_53.992926
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-28T06-01-53.992926.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-28T06-01-53.992926.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_03_28T06_01_53.992926
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-28T06-01-53.992926.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-28T06-01-53.992926.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_03_28T06_01_53.992926
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-28T06-01-53.992926.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-28T06-01-53.992926.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_03_28T06_01_53.992926
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-28T06-01-53.992926.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-28T06-01-53.992926.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_03_28T06_01_53.992926
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-28T06-01-53.992926.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-28T06-01-53.992926.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_03_28T06_01_53.992926
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-28T06-01-53.992926.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-28T06-01-53.992926.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_03_28T06_01_53.992926
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-28T06-01-53.992926.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-28T06-01-53.992926.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_03_28T06_01_53.992926
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-28T06-01-53.992926.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-28T06-01-53.992926.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_03_28T06_01_53.992926
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-28T06-01-53.992926.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-28T06-01-53.992926.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_03_28T06_01_53.992926
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-28T06-01-53.992926.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-28T06-01-53.992926.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_03_28T06_01_53.992926
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-28T06-01-53.992926.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-28T06-01-53.992926.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_03_28T06_01_53.992926
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-28T06-01-53.992926.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-28T06-01-53.992926.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_03_28T06_01_53.992926
path:
- '**/details_harness|winogrande|5_2024-03-28T06-01-53.992926.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-03-28T06-01-53.992926.parquet'
- config_name: results
data_files:
- split: 2024_03_28T06_01_53.992926
path:
- results_2024-03-28T06-01-53.992926.parquet
- split: latest
path:
- results_2024-03-28T06-01-53.992926.parquet
---
# Dataset Card for Evaluation run of InferenceIllusionist/Excalibur-7b-DPO
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [InferenceIllusionist/Excalibur-7b-DPO](https://huggingface.co/InferenceIllusionist/Excalibur-7b-DPO) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_InferenceIllusionist__Excalibur-7b-DPO",
"harness_winogrande_5",
split="train")
```
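Since the aggregated metrics live in the `results` configuration and every configuration exposes a `latest` split, a minimal sketch (same `datasets` API as above) for pulling the most recent aggregated scores would be:
```python
from datasets import load_dataset

# The "results" configuration stores the aggregated metrics of each run;
# its "latest" split always points at the most recent evaluation.
results = load_dataset(
    "open-llm-leaderboard/details_InferenceIllusionist__Excalibur-7b-DPO",
    "results",
    split="latest",
)
print(results[0])  # one row of aggregated metrics for the latest run
```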
## Latest results
These are the [latest results from run 2024-03-28T06:01:53.992926](https://huggingface.co/datasets/open-llm-leaderboard/details_InferenceIllusionist__Excalibur-7b-DPO/blob/main/results_2024-03-28T06-01-53.992926.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6585551254608566,
"acc_stderr": 0.03185272081159569,
"acc_norm": 0.6593615646245452,
"acc_norm_stderr": 0.03250028579692,
"mc1": 0.5348837209302325,
"mc1_stderr": 0.017460849975873972,
"mc2": 0.7081813831814938,
"mc2_stderr": 0.014609886961389094
},
"harness|arc:challenge|25": {
"acc": 0.6757679180887372,
"acc_stderr": 0.013678810399518824,
"acc_norm": 0.7090443686006825,
"acc_norm_stderr": 0.013273077865907595
},
"harness|hellaswag|10": {
"acc": 0.7030472017526389,
"acc_stderr": 0.004559817589182069,
"acc_norm": 0.8793069109739096,
"acc_norm_stderr": 0.0032510448518843103
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.35,
"acc_stderr": 0.047937248544110196,
"acc_norm": 0.35,
"acc_norm_stderr": 0.047937248544110196
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6370370370370371,
"acc_stderr": 0.04153948404742398,
"acc_norm": 0.6370370370370371,
"acc_norm_stderr": 0.04153948404742398
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.7039473684210527,
"acc_stderr": 0.03715062154998904,
"acc_norm": 0.7039473684210527,
"acc_norm_stderr": 0.03715062154998904
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.65,
"acc_stderr": 0.0479372485441102,
"acc_norm": 0.65,
"acc_norm_stderr": 0.0479372485441102
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6943396226415094,
"acc_stderr": 0.028353298073322666,
"acc_norm": 0.6943396226415094,
"acc_norm_stderr": 0.028353298073322666
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7847222222222222,
"acc_stderr": 0.03437079344106135,
"acc_norm": 0.7847222222222222,
"acc_norm_stderr": 0.03437079344106135
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.45,
"acc_stderr": 0.05,
"acc_norm": 0.45,
"acc_norm_stderr": 0.05
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.56,
"acc_stderr": 0.049888765156985884,
"acc_norm": 0.56,
"acc_norm_stderr": 0.049888765156985884
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6820809248554913,
"acc_stderr": 0.0355068398916558,
"acc_norm": 0.6820809248554913,
"acc_norm_stderr": 0.0355068398916558
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.43137254901960786,
"acc_stderr": 0.04928099597287534,
"acc_norm": 0.43137254901960786,
"acc_norm_stderr": 0.04928099597287534
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.76,
"acc_stderr": 0.04292346959909284,
"acc_norm": 0.76,
"acc_norm_stderr": 0.04292346959909284
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5914893617021276,
"acc_stderr": 0.032134180267015755,
"acc_norm": 0.5914893617021276,
"acc_norm_stderr": 0.032134180267015755
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.5175438596491229,
"acc_stderr": 0.04700708033551038,
"acc_norm": 0.5175438596491229,
"acc_norm_stderr": 0.04700708033551038
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5655172413793104,
"acc_stderr": 0.04130740879555497,
"acc_norm": 0.5655172413793104,
"acc_norm_stderr": 0.04130740879555497
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.41534391534391535,
"acc_stderr": 0.025379524910778394,
"acc_norm": 0.41534391534391535,
"acc_norm_stderr": 0.025379524910778394
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.5,
"acc_stderr": 0.04472135954999579,
"acc_norm": 0.5,
"acc_norm_stderr": 0.04472135954999579
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.33,
"acc_stderr": 0.047258156262526045,
"acc_norm": 0.33,
"acc_norm_stderr": 0.047258156262526045
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7774193548387097,
"acc_stderr": 0.023664216671642518,
"acc_norm": 0.7774193548387097,
"acc_norm_stderr": 0.023664216671642518
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5221674876847291,
"acc_stderr": 0.03514528562175007,
"acc_norm": 0.5221674876847291,
"acc_norm_stderr": 0.03514528562175007
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.71,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.71,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7878787878787878,
"acc_stderr": 0.031922715695483,
"acc_norm": 0.7878787878787878,
"acc_norm_stderr": 0.031922715695483
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7929292929292929,
"acc_stderr": 0.028869778460267042,
"acc_norm": 0.7929292929292929,
"acc_norm_stderr": 0.028869778460267042
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.9067357512953368,
"acc_stderr": 0.02098685459328972,
"acc_norm": 0.9067357512953368,
"acc_norm_stderr": 0.02098685459328972
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6846153846153846,
"acc_stderr": 0.023559646983189946,
"acc_norm": 0.6846153846153846,
"acc_norm_stderr": 0.023559646983189946
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.3592592592592593,
"acc_stderr": 0.029252905927251972,
"acc_norm": 0.3592592592592593,
"acc_norm_stderr": 0.029252905927251972
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.680672268907563,
"acc_stderr": 0.030283995525884396,
"acc_norm": 0.680672268907563,
"acc_norm_stderr": 0.030283995525884396
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.31125827814569534,
"acc_stderr": 0.03780445850526732,
"acc_norm": 0.31125827814569534,
"acc_norm_stderr": 0.03780445850526732
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8330275229357799,
"acc_stderr": 0.01599015488507337,
"acc_norm": 0.8330275229357799,
"acc_norm_stderr": 0.01599015488507337
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5092592592592593,
"acc_stderr": 0.034093869469927006,
"acc_norm": 0.5092592592592593,
"acc_norm_stderr": 0.034093869469927006
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8480392156862745,
"acc_stderr": 0.025195658428931792,
"acc_norm": 0.8480392156862745,
"acc_norm_stderr": 0.025195658428931792
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7974683544303798,
"acc_stderr": 0.026160568246601457,
"acc_norm": 0.7974683544303798,
"acc_norm_stderr": 0.026160568246601457
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.7040358744394619,
"acc_stderr": 0.030636591348699803,
"acc_norm": 0.7040358744394619,
"acc_norm_stderr": 0.030636591348699803
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7786259541984732,
"acc_stderr": 0.03641297081313728,
"acc_norm": 0.7786259541984732,
"acc_norm_stderr": 0.03641297081313728
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7851239669421488,
"acc_stderr": 0.037494924487096966,
"acc_norm": 0.7851239669421488,
"acc_norm_stderr": 0.037494924487096966
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.8055555555555556,
"acc_stderr": 0.038260763248848646,
"acc_norm": 0.8055555555555556,
"acc_norm_stderr": 0.038260763248848646
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7852760736196319,
"acc_stderr": 0.03226219377286774,
"acc_norm": 0.7852760736196319,
"acc_norm_stderr": 0.03226219377286774
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.49107142857142855,
"acc_stderr": 0.04745033255489123,
"acc_norm": 0.49107142857142855,
"acc_norm_stderr": 0.04745033255489123
},
"harness|hendrycksTest-management|5": {
"acc": 0.7961165048543689,
"acc_stderr": 0.039891398595317706,
"acc_norm": 0.7961165048543689,
"acc_norm_stderr": 0.039891398595317706
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8803418803418803,
"acc_stderr": 0.021262719400406974,
"acc_norm": 0.8803418803418803,
"acc_norm_stderr": 0.021262719400406974
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.73,
"acc_stderr": 0.044619604333847394,
"acc_norm": 0.73,
"acc_norm_stderr": 0.044619604333847394
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8301404853128991,
"acc_stderr": 0.013428186370608311,
"acc_norm": 0.8301404853128991,
"acc_norm_stderr": 0.013428186370608311
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7369942196531792,
"acc_stderr": 0.02370309952525817,
"acc_norm": 0.7369942196531792,
"acc_norm_stderr": 0.02370309952525817
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.423463687150838,
"acc_stderr": 0.016525425898773503,
"acc_norm": 0.423463687150838,
"acc_norm_stderr": 0.016525425898773503
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7287581699346405,
"acc_stderr": 0.02545775669666788,
"acc_norm": 0.7287581699346405,
"acc_norm_stderr": 0.02545775669666788
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7202572347266881,
"acc_stderr": 0.025494259350694912,
"acc_norm": 0.7202572347266881,
"acc_norm_stderr": 0.025494259350694912
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7623456790123457,
"acc_stderr": 0.02368359183700856,
"acc_norm": 0.7623456790123457,
"acc_norm_stderr": 0.02368359183700856
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.5035460992907801,
"acc_stderr": 0.02982674915328092,
"acc_norm": 0.5035460992907801,
"acc_norm_stderr": 0.02982674915328092
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.47392438070404175,
"acc_stderr": 0.012752858346533127,
"acc_norm": 0.47392438070404175,
"acc_norm_stderr": 0.012752858346533127
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6801470588235294,
"acc_stderr": 0.02833295951403121,
"acc_norm": 0.6801470588235294,
"acc_norm_stderr": 0.02833295951403121
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6862745098039216,
"acc_stderr": 0.018771683893528176,
"acc_norm": 0.6862745098039216,
"acc_norm_stderr": 0.018771683893528176
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6909090909090909,
"acc_stderr": 0.044262946482000985,
"acc_norm": 0.6909090909090909,
"acc_norm_stderr": 0.044262946482000985
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7428571428571429,
"acc_stderr": 0.02797982353874455,
"acc_norm": 0.7428571428571429,
"acc_norm_stderr": 0.02797982353874455
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8557213930348259,
"acc_stderr": 0.024845753212306046,
"acc_norm": 0.8557213930348259,
"acc_norm_stderr": 0.024845753212306046
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.87,
"acc_stderr": 0.033799766898963086,
"acc_norm": 0.87,
"acc_norm_stderr": 0.033799766898963086
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5662650602409639,
"acc_stderr": 0.03858158940685516,
"acc_norm": 0.5662650602409639,
"acc_norm_stderr": 0.03858158940685516
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8421052631578947,
"acc_stderr": 0.027966785859160896,
"acc_norm": 0.8421052631578947,
"acc_norm_stderr": 0.027966785859160896
},
"harness|truthfulqa:mc|0": {
"mc1": 0.5348837209302325,
"mc1_stderr": 0.017460849975873972,
"mc2": 0.7081813831814938,
"mc2_stderr": 0.014609886961389094
},
"harness|winogrande|5": {
"acc": 0.824782951854775,
"acc_stderr": 0.010684179227706168
},
"harness|gsm8k|5": {
"acc": 0.6542835481425322,
"acc_stderr": 0.013100422990441568
}
}
```
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
josiauhlol/fsGPT | ---
language: en
license: openrail
pretty_name: freesmartGPT
task_categories:
- conversational
tags:
- ai
---
# fsGPT |
heliosprime/twitter_dataset_1713226003 | ---
dataset_info:
features:
- name: id
dtype: string
- name: tweet_content
dtype: string
- name: user_name
dtype: string
- name: user_id
dtype: string
- name: created_at
dtype: string
- name: url
dtype: string
- name: favourite_count
dtype: int64
- name: scraped_at
dtype: string
- name: image_urls
dtype: string
splits:
- name: train
num_bytes: 18744
num_examples: 54
download_size: 17388
dataset_size: 18744
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "twitter_dataset_1713226003"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
yukiamenta/Meus_modelos | ---
license: openrail
---
|
tmnam20/VietnameseBookCorpus-raw-parquet | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 4928921669
num_examples: 19287
download_size: 2543402734
dataset_size: 4928921669
---
# Dataset Card for "VietnameseBookCorpus-raw-parquet"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
zicsx/ai4bharat-hi-subset | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 80196074466
num_examples: 106391910
download_size: 6800633717
dataset_size: 80196074466
license: apache-2.0
task_categories:
- text-generation
language:
- hi
size_categories:
- 100M<n<1B
---
# Dataset Card for "ai4bharat-hi-subset"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
AlexRog228/dreambooth-hackathon-images | ---
dataset_info:
features:
- name: image
dtype: image
splits:
- name: train
num_bytes: 4326507.0
num_examples: 20
download_size: 4310996
dataset_size: 4326507.0
---
# Dataset Card for "dreambooth-hackathon-images"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
shidowake/hrms-shuffle-subset_100k | ---
dataset_info:
features:
- name: idx
dtype: string
- name: source
dtype: string
- name: custom_instruction
dtype: bool
- name: category
dtype: string
- name: conversations
list:
- name: from
dtype: string
- name: value
dtype: string
- name: weight
dtype: float64
- name: language
dtype: string
- name: id
dtype: string
- name: topic
dtype: string
- name: model
dtype: string
- name: hash
sequence: int64
- name: avatarUrl
dtype: string
- name: model_name
dtype: string
- name: views
dtype: int64
- name: skip_prompt_formatting
dtype: bool
- name: title
dtype: string
- name: system_prompt
dtype: string
splits:
- name: train
num_bytes: 168120556.31715208
num_examples: 100000
download_size: 85705629
dataset_size: 168120556.31715208
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
vietgpt/phomt | ---
dataset_info:
features:
- name: vi
dtype: string
- name: en
dtype: string
splits:
- name: train
num_bytes: 536891701
num_examples: 2977999
download_size: 314970470
dataset_size: 536891701
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "phomt"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
castorini/msmarco_v1_doc_doc2query-t5_expansions | ---
language:
- en
license: apache-2.0
---
# Dataset Summary
The repo provides queries generated for the MS MARCO V1 document corpus with docTTTTTquery (sometimes written as docT5query or doc2query-T5), the latest version of the doc2query family of document expansion models. The basic idea is to train a model that, when given an input document, generates questions that the document might answer (or, more broadly, queries for which the document might be relevant). These predicted questions (or queries) are then appended to the original documents, which are then indexed as before. The docTTTTTquery model gets its name from the use of T5 as the expansion model.
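In code terms, the expansion step amounts to concatenating the predicted queries onto the document text before indexing. A minimal illustrative sketch (the `expand_document` helper and the sample strings are made up here; this is not the authors' exact pipeline):
```
def expand_document(doc_text: str, predicted_queries: list[str]) -> str:
    # doc2query-style expansion: append the model's predicted queries
    # to the original document, then index the expanded text as usual.
    return doc_text + " " + " ".join(predicted_queries)

expanded = expand_document(
    "The hot glowing surfaces of stars emit energy ...",
    ["how to find radius of star", "what is radius r"],
)
```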
# Dataset Structure
All three folds (train, dev and test) share the same corpus.
An example data entry looks as follows:
```
{
"id": "D1555982",
"predicted_queries": ["when find radius of star r", "what is r radius", "how to find out radius of star", "what is radius r", "what is radius of r", "how do you find radius of star igel", "which law states that radiation is proportional to radiation?", "what is the radius of a spherical star", "what is the radius of the star", "what is radius of star", "which radiation is produced during a solar radiation experiment?", "how to find radius r", "what is radius r of a star", "the hot glowing surfaces of stars emit energy in the form of", "what is the radius of a star", "what is the radius of a star", "how to find radius r on a star", "how to find radius r in a solar cell", "what kind of energy does a hot glowing surface of a star emit?", "what kind of energy does the hot glowing surface of stars emit"]
}
```
# Load Dataset
An example to load the dataset:
```
from datasets import load_dataset

dataset = load_dataset('castorini/msmarco_v1_doc_doc2query-t5_expansions')
```
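Once loaded, each record pairs a document id with its predicted queries, matching the example entry shown above. A short usage sketch (the `train` split name is an assumption; adjust to the fold you need):
```
example = dataset['train'][0]
print(example['id'])
print(example['predicted_queries'][:3])
```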
# Citation Information
```
@article{docTTTTTquery,
title={From doc2query to {docTTTTTquery}},
author={Nogueira, Rodrigo and Lin, Jimmy},
year={2019}
}
@article{emdt5,
author = "Ronak Pradeep and Rodrigo Nogueira and Jimmy Lin",
title = "The Expando-Mono-Duo Design Pattern for Text Ranking with Pretrained Sequence-to-Sequence Models",
journal = "arXiv:2101.05667",
year = 2021,
}
|
c4ba/teste55345 | ---
license: openrail
---
|
jp1924/VisualQuestionAnswering | ---
dataset_info:
features:
- name: question_id
dtype: string
- name: image_id
dtype: string
- name: multiple_choice_answer
dtype: string
- name: answer_confidence
dtype: string
- name: question
dtype: string
- name: category
dtype: string
- name: image
dtype: image
splits:
- name: train
num_bytes: 362397727602.125
num_examples: 4015127
- name: validation
num_bytes: 143893892418.375
num_examples: 1735397
download_size: 96944176039
dataset_size: 506291620020.5
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
task_categories:
- visual-question-answering
tags:
- Caption
- Image
size_categories:
- 10B<n<100B
language:
- ko
---
# Visual Question Answering (시각정보 기반 질의응답)
[AIHub](https://aihub.or.kr/aihubdata/data/view.do?currMenu=115&topMenu=100&aihubDataSe=realm&dataSetSn=104)
[builder_code](https://github.com/jp1924/HF_builders/tree/main) |
juanhebert/sv_corpora_parliament_processed | ---
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 292359009
num_examples: 1892723
download_size: 158940474
dataset_size: 292359009
---
# Dataset Card for "sv_corpora_parliament_processed"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
AdapterOcean/code_instructions_standardized_embedded | ---
dataset_info:
features:
- name: text
dtype: string
- name: conversation_id
dtype: int64
- name: embedding
sequence: float32
splits:
- name: train
num_bytes: 836061468
num_examples: 136146
download_size: 0
dataset_size: 836061468
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "code_instructions_standardized_embedded"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Medradome/Analucie | ---
license: apache-2.0
---
|
vwxyzjn/openhermes-dev-512__NousResearch_Nous-Hermes-2-Yi-34B__1707948601 | ---
dataset_info:
features:
- name: source
dtype: string
- name: category
dtype: string
- name: prompt
dtype: string
- name: candidate0_policy
dtype: string
- name: candidate0
list:
- name: content
dtype: string
- name: role
dtype: string
- name: candidate1
list:
- name: content
dtype: string
- name: role
dtype: string
- name: candidate1_policy
dtype: string
- name: candidate2
list:
- name: content
dtype: string
- name: role
dtype: string
- name: candidate2_policy
dtype: string
splits:
- name: train
num_bytes: 55364922.0
num_examples: 10000
download_size: 29197048
dataset_size: 55364922.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
deepapaikar/Llama_13B_1600QA | ---
license: apache-2.0
---
|
linhqyy/Zalo_Corpus_2 | ---
dataset_info:
features:
- name: audio
dtype: audio
- name: text
dtype: string
splits:
- name: train
num_bytes: 11469054935.056
num_examples: 56427
download_size: 11653394835
dataset_size: 11469054935.056
---
# Dataset Card for "Zalo_Corpus_2"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
arun2023acs/acsrepoind2023 | ---
license: mit
---
|
Falah/celebrity_art_prompts | ---
dataset_info:
features:
- name: prompts
dtype: string
splits:
- name: train
num_bytes: 163783
num_examples: 1000
download_size: 32791
dataset_size: 163783
---
# Dataset Card for "celebrity_art_prompts"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
AdapterOcean/med_alpaca_standardized_cluster_81 | ---
dataset_info:
features:
- name: text
dtype: string
- name: conversation_id
dtype: int64
- name: embedding
sequence: float64
- name: cluster
dtype: int64
splits:
- name: train
num_bytes: 112139860
num_examples: 11198
download_size: 33279620
dataset_size: 112139860
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "med_alpaca_standardized_cluster_81"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
AdapterOcean/med_alpaca_standardized_cluster_8 | ---
dataset_info:
features:
- name: text
dtype: string
- name: conversation_id
dtype: int64
- name: embedding
sequence: float64
- name: cluster
dtype: int64
splits:
- name: train
num_bytes: 145562012
num_examples: 14666
download_size: 42803368
dataset_size: 145562012
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "med_alpaca_standardized_cluster_8"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Doub7e/SDv2-Count-Repeated-2 | ---
dataset_info:
features:
- name: image
dtype: image
- name: prompt
dtype: string
- name: T5_last_hidden_states
sequence:
sequence:
sequence: float32
- name: style
dtype: string
splits:
- name: train
num_bytes: 1338281411.25
num_examples: 1150
download_size: 1146511831
dataset_size: 1338281411.25
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
sezer12138/ADE20k_Segementation | ---
dataset_info:
features:
- name: image
dtype: image
- name: annotated
dtype: image
- name: Scene_category
dtype:
class_label:
names:
'0': abbey
'1': access_road
'2': acropolis
'3': air_base
'4': aircraft_carrier_object
'5': airfield
'6': airlock
'7': airplane
'8': airplane_cabin
'9': airport
'10': airport_terminal
'11': airport_ticket_counter
'12': alcove
'13': alley
'14': amphitheater
'15': amphitheater_indoor
'16': amusement_arcade
'17': amusement_park
'18': anechoic_chamber
'19': apartment_building_outdoor
'20': apse_indoor
'21': apse_outdoor
'22': aquarium
'23': aquatic_theater
'24': aqueduct
'25': arbor
'26': arcade
'27': arch
'28': archaelogical_excavation
'29': archipelago
'30': archive
'31': armory
'32': army_base
'33': arrival_gate_indoor
'34': arrival_gate_outdoor
'35': art_gallery
'36': art_school
'37': art_studio
'38': artificial
'39': artists_loft
'40': assembly_hall
'41': assembly_line
'42': assembly_plant
'43': athletic_field_indoor
'44': athletic_field_outdoor
'45': atrium_home
'46': atrium_public
'47': attic
'48': auditorium
'49': auto_factory
'50': auto_mechanics_indoor
'51': auto_mechanics_outdoor
'52': auto_racing_paddock
'53': auto_showroom
'54': awning_deck
'55': back_porch
'56': backdrop
'57': backroom
'58': backseat
'59': backstage
'60': backstage_outdoor
'61': backstairs
'62': backstairs_indoor
'63': backwoods
'64': badlands
'65': badminton_court_indoor
'66': badminton_court_outdoor
'67': baggage_claim
'68': balcony_interior
'69': ball_pit
'70': ballet
'71': ballroom
'72': balustrade
'73': bamboo_forest
'74': bank_indoor
'75': bank_outdoor
'76': bank_vault
'77': banquet_hall
'78': baptistry_indoor
'79': baptistry_outdoor
'80': bar
'81': barbeque
'82': barbershop
'83': barn
'84': barndoor
'85': barnyard
'86': barrack
'87': barrel_storage
'88': baseball
'89': baseball_field
'90': basement
'91': basilica
'92': basin_outdoor
'93': basketball
'94': basketball_court_indoor
'95': basketball_court_outdoor
'96': bath_indoor
'97': bath_outdoor
'98': bathhouse
'99': bathhouse_outdoor
'100': bathroom
'101': batters_box
'102': batting_cage_indoor
'103': batting_cage_outdoor
'104': battlefield
'105': battlement
'106': bay
'107': bayou
'108': bazaar_indoor
'109': bazaar_outdoor
'110': beach
'111': beach_house
'112': beauty_salon
'113': bedchamber
'114': bedroom
'115': beer_garden
'116': beer_hall
'117': belfry
'118': bell_foundry
'119': berth
'120': berth_deck
'121': betting_shop
'122': bicycle_racks
'123': bindery
'124': biology_laboratory
'125': bistro_indoor
'126': bistro_outdoor
'127': bleachers_indoor
'128': bleachers_outdoor
'129': block
'130': boardwalk
'131': boat
'132': boat_deck
'133': boathouse
'134': bog
'135': bomb_shelter_indoor
'136': bookbindery
'137': bookshelf
'138': bookstore
'139': booth
'140': booth_indoor
'141': booth_outdoor
'142': botanical_garden
'143': bottle_storage
'144': bottomland
'145': bow_window_indoor
'146': bow_window_outdoor
'147': bowling_alley
'148': box_seat
'149': boxing_ring
'150': breakfast_table
'151': breakroom
'152': brewery_indoor
'153': brewery_outdoor
'154': bric-a-brac
'155': brickyard_indoor
'156': brickyard_outdoor
'157': bridge
'158': bridle_path
'159': broadleaf
'160': brooklet
'161': bubble_chamber
'162': buffet
'163': building_complex
'164': building_facade
'165': bulkhead
'166': bullpen
'167': bullring
'168': bunk_bed
'169': burial_chamber
'170': bus_depot_indoor
'171': bus_depot_outdoor
'172': bus_interior
'173': bus_shelter
'174': bus_station_indoor
'175': bus_station_outdoor
'176': butchers_shop
'177': butte
'178': bypass
'179': byroad
'180': cabana
'181': cabin_cruiser
'182': cabin_indoor
'183': cabin_outdoor
'184': cafeteria
'185': call_center
'186': campsite
'187': campus
'188': candy_store
'189': canteen
'190': canyon
'191': car_dealership
'192': caravansary
'193': cardroom
'194': cargo_container_interior
'195': cargo_deck
'196': cargo_helicopter
'197': carport_indoor
'198': carport_outdoor
'199': carrousel
'200': cascade
'201': casino_indoor
'202': casino_outdoor
'203': castle
'204': catacomb
'205': cataract
'206': cathedral_indoor
'207': cathedral_outdoor
'208': catwalk
'209': cavern_indoor
'210': cavern_outdoor
'211': cellar
'212': cemetery
'213': chair_lift
'214': chalet
'215': chaparral
'216': chapel
'217': checkout_counter
'218': cheese_factory
'219': chemical_plant
'220': chemistry_lab
'221': chicken_coop_indoor
'222': chicken_coop_outdoor
'223': chicken_farm_indoor
'224': chicken_farm_outdoor
'225': childs_room
'226': choir_loft_interior
'227': chuck_wagon
'228': church_indoor
'229': church_outdoor
'230': circus_tent_indoor
'231': circus_tent_outdoor
'232': city
'233': classroom
'234': clean_room
'235': cliff
'236': clock_tower_indoor
'237': cloister_indoor
'238': cloister_outdoor
'239': closet
'240': clothing_store
'241': coast
'242': coast_road
'243': cockpit
'244': cocktail_lounge
'245': coffee_shop
'246': computer_room
'247': conference_center
'248': conference_hall
'249': conference_room
'250': confessional
'251': construction_site
'252': control_room
'253': control_tower_indoor
'254': control_tower_outdoor
'255': convenience_store_indoor
'256': convenience_store_outdoor
'257': coral_reef
'258': corn_field
'259': corner
'260': corral
'261': corridor
'262': cottage
'263': cottage_garden
'264': country_house
'265': country_road
'266': courthouse
'267': courtroom
'268': courtyard
'269': covered_bridge_interior
'270': crawl_space
'271': creek
'272': crevasse
'273': crosswalk
'274': cultivated
'275': customhouse
'276': cybercafe
'277': dacha
'278': dairy_indoor
'279': dairy_outdoor
'280': dam
'281': dance_floor
'282': dance_school
'283': darkroom
'284': day_care_center
'285': deck-house_boat_deck_house
'286': deck-house_deck_house
'287': delicatessen
'288': dentists_office
'289': department_store
'290': departure_lounge
'291': desert_road
'292': diner_indoor
'293': diner_outdoor
'294': dinette_home
'295': dining_area
'296': dining_car
'297': dining_hall
'298': dining_room
'299': dirt_track
'300': discotheque
'301': distillery
'302': ditch
'303': diving_board
'304': dock
'305': dolmen
'306': donjon
'307': door
'308': doorway_indoor
'309': doorway_outdoor
'310': dorm_room
'311': downtown
'312': drainage_ditch
'313': dress_shop
'314': dressing_room
'315': drill_rig
'316': driveway
'317': driving_range_indoor
'318': driving_range_outdoor
'319': drugstore
'320': dry
'321': dry_dock
'322': dugout
'323': earth_fissure
'324': east_asia
'325': editing_room
'326': electrical_substation
'327': elevated_catwalk
'328': elevator_interior
'329': elevator_lobby
'330': elevator_shaft
'331': embankment
'332': embassy
'333': embrasure
'334': engine_room
'335': entrance
'336': entrance_hall
'337': entranceway_indoor
'338': entranceway_outdoor
'339': entryway_outdoor
'340': escalator_indoor
'341': escalator_outdoor
'342': escarpment
'343': establishment
'344': estaminet
'345': estuary
'346': excavation
'347': exhibition_hall
'348': exterior
'349': fabric_store
'350': factory_indoor
'351': factory_outdoor
'352': fairway
'353': fan
'354': farm
'355': farm_building
'356': farmhouse
'357': fastfood_restaurant
'358': feed_bunk
'359': fence
'360': ferryboat_indoor
'361': field_house
'362': field_road
'363': field_tent_indoor
'364': field_tent_outdoor
'365': fire_escape
'366': fire_station
'367': fire_trench
'368': fireplace
'369': firing_range_indoor
'370': firing_range_outdoor
'371': fish_farm
'372': fishmarket
'373': fishpond
'374': fitting_room_interior
'375': fjord
'376': flashflood
'377': flatlet
'378': flea_market_indoor
'379': flea_market_outdoor
'380': floating_dock
'381': floating_dry_dock
'382': flood
'383': flood_plain
'384': florist_shop_indoor
'385': florist_shop_outdoor
'386': flowerbed
'387': flume_indoor
'388': fly_bridge
'389': flying_buttress
'390': food_court
'391': football
'392': football_field
'393': foothill
'394': forecourt
'395': foreshore
'396': forest_fire
'397': forest_path
'398': forest_road
'399': forklift
'400': formal_garden
'401': fort
'402': fortress
'403': foundry_indoor
'404': foundry_outdoor
'405': fountain
'406': freestanding
'407': freeway
'408': freight_elevator
'409': front_porch
'410': frontseat
'411': funeral_chapel
'412': funeral_home
'413': furnace_room
'414': galley
'415': game_room
'416': gangplank
'417': garage_indoor
'418': garage_outdoor
'419': garbage_dump
'420': garden
'421': gas_station
'422': gas_well
'423': gasworks
'424': gate
'425': gatehouse
'426': gazebo_interior
'427': general_store_indoor
'428': general_store_outdoor
'429': geodesic_dome_indoor
'430': geodesic_dome_outdoor
'431': ghost_town
'432': gift_shop
'433': glacier
'434': glade
'435': glen
'436': golf_course
'437': gorge
'438': granary
'439': grape_arbor
'440': great_hall
'441': greengrocery
'442': greenhouse_indoor
'443': greenhouse_outdoor
'444': grotto
'445': grove
'446': guardhouse
'447': guardroom
'448': guesthouse
'449': gulch
'450': gun_deck_indoor
'451': gun_deck_outdoor
'452': gun_store
'453': gymnasium_indoor
'454': gymnasium_outdoor
'455': hacienda
'456': hallway
'457': handball_court
'458': hangar_indoor
'459': hangar_outdoor
'460': harbor
'461': hardware_store
'462': hat_shop
'463': hatchery
'464': hayfield
'465': hayloft
'466': head_shop
'467': hearth
'468': heath
'469': hedge_maze
'470': hedgerow
'471': heliport
'472': hen_yard
'473': herb_garden
'474': highway
'475': hill
'476': hillock
'477': hockey
'478': hollow
'479': home_office
'480': home_theater
'481': hoodoo
'482': hospital
'483': hospital_room
'484': hot_spring
'485': hot_tub_indoor
'486': hot_tub_outdoor
'487': hotel_breakfast_area
'488': hotel_outdoor
'489': hotel_room
'490': house
'491': housing_estate
'492': housing_project
'493': howdah
'494': hunting_lodge_indoor
'495': hunting_lodge_outdoor
'496': hut
'497': hutment
'498': ice_cream_parlor
'499': ice_floe
'500': ice_shelf
'501': ice_skating_rink_indoor
'502': ice_skating_rink_outdoor
'503': iceberg
'504': igloo
'505': imaret
'506': incinerator_indoor
'507': incinerator_outdoor
'508': indoor_procenium
'509': indoor_round
'510': indoor_seats
'511': industrial_area
'512': industrial_park
'513': inlet
'514': inn_indoor
'515': inn_outdoor
'516': insane_asylum
'517': irrigation_ditch
'518': islet
'519': jacuzzi_indoor
'520': jacuzzi_outdoor
'521': jail_cell
'522': jail_indoor
'523': jail_outdoor
'524': japanese_garden
'525': jetty
'526': jewelry_shop
'527': joss_house
'528': juke_joint
'529': jungle
'530': junk_pile
'531': junkyard
'532': jury_box
'533': kasbah
'534': kennel_indoor
'535': kennel_outdoor
'536': kindergarden_classroom
'537': kiosk_indoor
'538': kiosk_outdoor
'539': kitchen
'540': kitchenette
'541': kraal
'542': lab_classroom
'543': laboratorywet
'544': labyrinth_indoor
'545': labyrinth_outdoor
'546': lagoon
'547': landfill
'548': landing
'549': landing_deck
'550': landing_strip
'551': laundromat
'552': lava_flow
'553': lavatory
'554': lawn
'555': layby
'556': lean-to
'557': lean-to_tent
'558': lecture_room
'559': legislative_chamber
'560': levee
'561': library
'562': library_indoor
'563': library_outdoor
'564': lido_deck_indoor
'565': lido_deck_outdoor
'566': lift_bridge
'567': lighthouse
'568': limousine_interior
'569': liquor_store_indoor
'570': liquor_store_outdoor
'571': living_room
'572': loading_dock
'573': lobby
'574': lock_chamber
'575': locker_room
'576': loft
'577': loge
'578': loggia_outdoor
'579': lookout_station_indoor
'580': lookout_station_outdoor
'581': lower_deck
'582': luggage_van
'583': lumberyard_indoor
'584': lumberyard_outdoor
'585': lyceum
'586': machine_shop
'587': manhole
'588': mansard
'589': mansion
'590': manufactured_home
'591': market_indoor
'592': market_outdoor
'593': marsh
'594': martial_arts_gym
'595': massage_room
'596': mastaba
'597': maternity_ward
'598': mausoleum
'599': meadow
'600': meat_house
'601': medina
'602': megalith
'603': menhir
'604': mens_store_outdoor
'605': mental_institution_indoor
'606': mental_institution_outdoor
'607': mesa
'608': mesoamerican
'609': mess_hall
'610': mews
'611': mezzanine
'612': military_headquarters
'613': military_hospital
'614': military_hut
'615': military_tent
'616': millpond
'617': millrace
'618': mine
'619': mineral_bath
'620': mineshaft
'621': mini_golf_course_indoor
'622': mini_golf_course_outdoor
'623': misc
'624': mission
'625': mobile_home
'626': monastery_indoor
'627': monastery_outdoor
'628': moon_bounce
'629': moor
'630': morgue
'631': mosque_indoor
'632': mosque_outdoor
'633': motel
'634': mountain
'635': mountain_path
'636': mountain_road
'637': mountain_snowy
'638': movie_theater_indoor
'639': movie_theater_outdoor
'640': mudflat
'641': museum_indoor
'642': museum_outdoor
'643': music_store
'644': music_studio
'645': natural
'646': natural_history_museum
'647': natural_spring
'648': naval_base
'649': needleleaf
'650': newsroom
'651': newsstand_indoor
'652': newsstand_outdoor
'653': nightclub
'654': nook
'655': nuclear_power_plant_indoor
'656': nuclear_power_plant_outdoor
'657': nunnery
'658': nursery
'659': nursing_home
'660': nursing_home_outdoor
'661': oasis
'662': oast_house
'663': observation_station
'664': observatory_indoor
'665': observatory_outdoor
'666': observatory_post
'667': ocean
'668': ocean_deep
'669': ocean_shallow
'670': office
'671': office_building
'672': office_cubicles
'673': oil_refinery_indoor
'674': oil_refinery_outdoor
'675': oilrig
'676': one-way_street
'677': open-hearth_furnace
'678': operating_room
'679': operating_table
'680': optician
'681': orchard
'682': orchestra_pit
'683': organ_loft_interior
'684': orlop_deck
'685': ossuary
'686': outbuilding
'687': outcropping
'688': outhouse_indoor
'689': outhouse_outdoor
'690': outside
'691': overpass
'692': oyster_bar
'693': oyster_farm
'694': packaging_plant
'695': pagoda
'696': palace
'697': palace_hall
'698': palestra
'699': pantry
'700': paper_mill
'701': parade_ground
'702': park
'703': parking_garage_indoor
'704': parking_garage_outdoor
'705': parking_lot
'706': parkway
'707': parlor
'708': particle_accelerator
'709': party_tent_indoor
'710': party_tent_outdoor
'711': passenger_deck
'712': pasture
'713': patio
'714': patio_indoor
'715': pavement
'716': pavilion
'717': pawnshop
'718': pawnshop_outdoor
'719': pedestrian_overpass_indoor
'720': penalty_box
'721': performance
'722': perfume_shop
'723': pet_shop
'724': pharmacy
'725': phone_booth
'726': physics_laboratory
'727': piano_store
'728': picnic_area
'729': pier
'730': pig_farm
'731': pilothouse_indoor
'732': pilothouse_outdoor
'733': pinetum
'734': piste_road
'735': pitchers_mound
'736': pizzeria
'737': pizzeria_outdoor
'738': planetarium_indoor
'739': planetarium_outdoor
'740': plantation_house
'741': platform
'742': playground
'743': playroom
'744': plaza
'745': plunge
'746': podium_indoor
'747': podium_outdoor
'748': police_station
'749': pond
'750': pontoon_bridge
'751': poolroom_home
'752': poop_deck
'753': porch
'754': portico
'755': portrait_studio
'756': postern
'757': powder_room
'758': power_plant_outdoor
'759': preserve
'760': print_shop
'761': priory
'762': promenade
'763': promenade_deck
'764': pub_indoor
'765': pub_outdoor
'766': pueblo
'767': pulpit
'768': pump_room
'769': pumping_station
'770': putting_green
'771': quadrangle
'772': questionable
'773': quicksand
'774': quonset_hut_indoor
'775': quonset_hut_outdoor
'776': racecourse
'777': raceway
'778': raft
'779': rail_indoor
'780': rail_outdoor
'781': railroad_track
'782': railway_yard
'783': rainforest
'784': ramp
'785': ranch
'786': ranch_house
'787': reading_room
'788': reception
'789': reception_room
'790': recreation_room
'791': rectory
'792': recycling_plant_indoor
'793': recycling_plant_outdoor
'794': refectory
'795': repair_shop
'796': residential_neighborhood
'797': resort
'798': rest_area
'799': rest_stop
'800': restaurant
'801': restaurant_kitchen
'802': restaurant_patio
'803': restroom_indoor
'804': restroom_outdoor
'805': retaining_wall
'806': revolving_door
'807': rice_paddy
'808': riding_arena
'809': rift_valley
'810': river
'811': road
'812': road_cut
'813': road_indoor
'814': road_outdoor
'815': rock_arch
'816': rock_garden
'817': rodeo
'818': roller_skating_rink_indoor
'819': roller_skating_rink_outdoor
'820': rolling_mill
'821': roof
'822': roof_garden
'823': room
'824': root_cellar
'825': rope_bridge
'826': rotisserie
'827': roundabout
'828': roundhouse
'829': rubble
'830': ruin
'831': runway
'832': sacristy
'833': safari_park
'834': salon
'835': saloon
'836': salt_plain
'837': sanatorium
'838': sand
'839': sand_trap
'840': sandbar
'841': sandbox
'842': sauna
'843': savanna
'844': sawmill
'845': schoolhouse
'846': schoolyard
'847': science_laboratory
'848': science_museum
'849': scriptorium
'850': scrubland
'851': scullery
'852': sea_cliff
'853': seaside
'854': seawall
'855': security_check_point
'856': semidesert
'857': server_room
'858': sewer
'859': sewing_room
'860': shed
'861': shelter
'862': shelter_deck
'863': shelter_tent
'864': shipping_room
'865': shipyard_outdoor
'866': shoe_shop
'867': shop
'868': shopfront
'869': shopping_mall_indoor
'870': shopping_mall_outdoor
'871': shore
'872': shower
'873': shower_room
'874': shrine
'875': shrubbery
'876': sidewalk
'877': signal_box
'878': sinkhole
'879': ski_jump
'880': ski_lodge
'881': ski_resort
'882': ski_slope
'883': sky
'884': skyscraper
'885': skywalk_indoor
'886': skywalk_outdoor
'887': slum
'888': snack_bar
'889': snowbank
'890': snowfield
'891': soccer
'892': south_asia
'893': spillway
'894': sporting_goods_store
'895': squash_court
'896': stable
'897': stadium_outdoor
'898': stage_indoor
'899': stage_outdoor
'900': stage_set
'901': staircase
'902': stall
'903': starting_gate
'904': stateroom
'905': station
'906': steam_plant_outdoor
'907': steel_mill_indoor
'908': steel_mill_outdoor
'909': stone_circle
'910': storage_room
'911': store
'912': storm_cellar
'913': street
'914': streetcar_track
'915': strip_mall
'916': strip_mine
'917': student_center
'918': student_residence
'919': study_hall
'920': submarine_interior
'921': subway_interior
'922': sugar_refinery
'923': sun_deck
'924': sunroom
'925': supermarket
'926': supply_chamber
'927': sushi_bar
'928': swamp
'929': swimming_hole
'930': swimming_pool_indoor
'931': swimming_pool_outdoor
'932': synagogue_indoor
'933': synagogue_outdoor
'934': t-bar_lift
'935': tannery
'936': taxistand
'937': taxiway
'938': tea_garden
'939': teahouse
'940': tearoom
'941': teashop
'942': television_room
'943': television_studio
'944': tennis_court_indoor
'945': tennis_court_outdoor
'946': tent_outdoor
'947': terrace_farm
'948': theater_outdoor
'949': threshing_floor
'950': thriftshop
'951': throne_room
'952': ticket_booth
'953': ticket_window_indoor
'954': tidal_basin
'955': tidal_river
'956': tiltyard
'957': tobacco_shop_indoor
'958': toll_plaza
'959': tollbooth
'960': tollgate
'961': tomb
'962': topiary_garden
'963': tower
'964': town_house
'965': toyshop
'966': track_outdoor
'967': tract_housing
'968': trading_floor
'969': traffic_island
'970': trailer_park
'971': train_interior
'972': train_railway
'973': train_station_outdoor
'974': tree_farm
'975': tree_house
'976': trellis
'977': trench
'978': trestle_bridge
'979': truck_stop
'980': tundra
'981': turkish_bath
'982': upper_balcony
'983': urban
'984': utility_room
'985': valley
'986': van_interior
'987': vat
'988': vegetable_garden
'989': vegetation
'990': vehicle
'991': velodrome_indoor
'992': velodrome_outdoor
'993': ventilation_shaft
'994': veranda
'995': vestibule
'996': vestry
'997': veterinarians_office
'998': viaduct
'999': videostore
'1000': village
'1001': vinery
'1002': vineyard
'1003': volcano
'1004': volleyball_court_indoor
'1005': volleyball_court_outdoor
'1006': voting_booth
'1007': waiting_room
'1008': walk_in_freezer
'1009': walkway
'1010': war_room
'1011': warehouse_indoor
'1012': warehouse_outdoor
'1013': washhouse_indoor
'1014': washhouse_outdoor
'1015': washroom
'1016': watchtower
'1017': water
'1018': water_fountain
'1019': water_gate
'1020': water_mill
'1021': water_park
'1022': water_tower
'1023': water_treatment_plant_indoor
'1024': water_treatment_plant_outdoor
'1025': watering_hole
'1026': waterscape
'1027': waterway
'1028': wave
'1029': weighbridge
'1030': western
'1031': wet_bar
'1032': wetland
'1033': wharf
'1034': wheat_field
'1035': whispering_gallery
'1036': widows_walk_indoor
'1037': widows_walk_interior
'1038': wild
'1039': wind_farm
'1040': windmill
'1041': window_seat
'1042': windstorm
'1043': winery
'1044': witness_stand
'1045': woodland
'1046': workroom
'1047': workshop
'1048': wrestling_ring_indoor
'1049': wrestling_ring_outdoor
'1050': yard
'1051': youth_hostel
'1052': zen_garden
'1053': ziggurat
'1054': zoo
splits:
- name: train
num_bytes: 1097055005.51
num_examples: 20210
- name: val
num_bytes: 90418264.0
num_examples: 2000
download_size: 966605341
dataset_size: 1187473269.51
---
# Dataset Card for "ADE20k_Segementation"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
hndc/AI4Hazard-small | ---
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 5781295
num_examples: 2929
- name: test
num_bytes: 737297
num_examples: 367
- name: validation
num_bytes: 735040
num_examples: 366
download_size: 4290748
dataset_size: 7253632
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
- split: validation
path: data/validation-*
---
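A minimal loading sketch for the configuration declared above (a single `text` feature with `train`/`test`/`validation` splits):

```python
from datasets import load_dataset

# Load the default config; split names follow the `configs` mapping above.
ds = load_dataset("hndc/AI4Hazard-small", split="train")
print(ds.num_rows)    # 2929, per the split metadata above
print(ds[0]["text"])  # the card declares `text` as the only feature
```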
|
liuyanchen1015/MULTI_VALUE_sst2_non_coordinated_obj_subj | ---
dataset_info:
features:
- name: sentence
dtype: string
- name: label
dtype: int64
- name: idx
dtype: int64
- name: score
dtype: int64
splits:
- name: dev
num_bytes: 9155
num_examples: 57
- name: test
num_bytes: 20040
num_examples: 129
- name: train
num_bytes: 256118
num_examples: 1953
download_size: 148728
dataset_size: 285313
---
# Dataset Card for "MULTI_VALUE_sst2_non_coordinated_obj_subj"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
autoevaluate/autoeval-staging-eval-launch__gov_report-plain_text-cd8e90-16116213 | ---
type: predictions
tags:
- autotrain
- evaluation
datasets:
- launch/gov_report
eval_info:
task: summarization
model: pszemraj/long-t5-tglobal-large-pubmed-3k-booksum-16384-WIP
metrics: ['bertscore']
dataset_name: launch/gov_report
dataset_config: plain_text
dataset_split: validation
col_mapping:
text: document
target: summary
---
# Dataset Card for AutoTrain Evaluator
This repository contains model predictions generated by [AutoTrain](https://huggingface.co/autotrain) for the following task and dataset (a loading sketch follows the list):
* Task: Summarization
* Model: pszemraj/long-t5-tglobal-large-pubmed-3k-booksum-16384-WIP
* Dataset: launch/gov_report
* Config: plain_text
* Split: validation
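As a hedged sketch of inspecting these predictions (this card does not document the repository's file layout, so the default configuration and the `train` split used here are assumptions):

```python
from datasets import load_dataset

# Assumption: the prediction files load under the default config and a
# "train" split; adjust the split name if the repo exposes a different layout.
preds = load_dataset(
    "autoevaluate/autoeval-staging-eval-launch__gov_report-plain_text-cd8e90-16116213",
    split="train",
)
print(preds)
```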
To run new evaluation jobs, visit Hugging Face's [automatic model evaluator](https://huggingface.co/spaces/autoevaluate/model-evaluator).
## Contributions
Thanks to [@nonchalant-nagavalli](https://huggingface.co/nonchalant-nagavalli) for evaluating this model. |
open-llm-leaderboard/details_macadeliccc__piccolo-math-2x7b | ---
pretty_name: Evaluation run of macadeliccc/piccolo-math-2x7b
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [macadeliccc/piccolo-math-2x7b](https://huggingface.co/macadeliccc/piccolo-math-2x7b)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of\
\ the evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can\
\ be found as a specific split in each configuration, the split being named using\
\ the timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_macadeliccc__piccolo-math-2x7b\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-01-23T21:39:07.430696](https://huggingface.co/datasets/open-llm-leaderboard/details_macadeliccc__piccolo-math-2x7b/blob/main/results_2024-01-23T21-39-07.430696.json)\
\ (note that there might be results for other tasks in the repo if successive evals\
\ didn't cover the same tasks; you can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.64218516330683,\n\
\ \"acc_stderr\": 0.03223781750024571,\n \"acc_norm\": 0.6418148031090513,\n\
\ \"acc_norm_stderr\": 0.03290014345969884,\n \"mc1\": 0.4785801713586291,\n\
\ \"mc1_stderr\": 0.01748743214471181,\n \"mc2\": 0.6385532906891974,\n\
\ \"mc2_stderr\": 0.01575881107075601\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.6638225255972696,\n \"acc_stderr\": 0.013804855026205763,\n\
\ \"acc_norm\": 0.6911262798634812,\n \"acc_norm_stderr\": 0.013501770929344\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.7062338179645489,\n\
\ \"acc_stderr\": 0.004545552424153376,\n \"acc_norm\": 0.8727345150368453,\n\
\ \"acc_norm_stderr\": 0.003325890225529858\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.29,\n \"acc_stderr\": 0.04560480215720684,\n \
\ \"acc_norm\": 0.29,\n \"acc_norm_stderr\": 0.04560480215720684\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6148148148148148,\n\
\ \"acc_stderr\": 0.04203921040156279,\n \"acc_norm\": 0.6148148148148148,\n\
\ \"acc_norm_stderr\": 0.04203921040156279\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.6973684210526315,\n \"acc_stderr\": 0.03738520676119669,\n\
\ \"acc_norm\": 0.6973684210526315,\n \"acc_norm_stderr\": 0.03738520676119669\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.62,\n\
\ \"acc_stderr\": 0.048783173121456316,\n \"acc_norm\": 0.62,\n \
\ \"acc_norm_stderr\": 0.048783173121456316\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.7094339622641509,\n \"acc_stderr\": 0.027943219989337142,\n\
\ \"acc_norm\": 0.7094339622641509,\n \"acc_norm_stderr\": 0.027943219989337142\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7430555555555556,\n\
\ \"acc_stderr\": 0.03653946969442099,\n \"acc_norm\": 0.7430555555555556,\n\
\ \"acc_norm_stderr\": 0.03653946969442099\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.5,\n \"acc_stderr\": 0.050251890762960605,\n \
\ \"acc_norm\": 0.5,\n \"acc_norm_stderr\": 0.050251890762960605\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.48,\n \"acc_stderr\": 0.050211673156867795,\n \"acc_norm\": 0.48,\n\
\ \"acc_norm_stderr\": 0.050211673156867795\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \
\ \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6242774566473989,\n\
\ \"acc_stderr\": 0.036928207672648664,\n \"acc_norm\": 0.6242774566473989,\n\
\ \"acc_norm_stderr\": 0.036928207672648664\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.47058823529411764,\n \"acc_stderr\": 0.049665709039785295,\n\
\ \"acc_norm\": 0.47058823529411764,\n \"acc_norm_stderr\": 0.049665709039785295\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.76,\n \"acc_stderr\": 0.042923469599092816,\n \"acc_norm\": 0.76,\n\
\ \"acc_norm_stderr\": 0.042923469599092816\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.574468085106383,\n \"acc_stderr\": 0.03232146916224468,\n\
\ \"acc_norm\": 0.574468085106383,\n \"acc_norm_stderr\": 0.03232146916224468\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.5263157894736842,\n\
\ \"acc_stderr\": 0.046970851366478626,\n \"acc_norm\": 0.5263157894736842,\n\
\ \"acc_norm_stderr\": 0.046970851366478626\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5655172413793104,\n \"acc_stderr\": 0.04130740879555498,\n\
\ \"acc_norm\": 0.5655172413793104,\n \"acc_norm_stderr\": 0.04130740879555498\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.4021164021164021,\n \"acc_stderr\": 0.025253032554997685,\n \"\
acc_norm\": 0.4021164021164021,\n \"acc_norm_stderr\": 0.025253032554997685\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.3968253968253968,\n\
\ \"acc_stderr\": 0.043758884927270605,\n \"acc_norm\": 0.3968253968253968,\n\
\ \"acc_norm_stderr\": 0.043758884927270605\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \
\ \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7645161290322581,\n\
\ \"acc_stderr\": 0.02413763242933771,\n \"acc_norm\": 0.7645161290322581,\n\
\ \"acc_norm_stderr\": 0.02413763242933771\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.4975369458128079,\n \"acc_stderr\": 0.03517945038691063,\n\
\ \"acc_norm\": 0.4975369458128079,\n \"acc_norm_stderr\": 0.03517945038691063\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.68,\n \"acc_stderr\": 0.04688261722621504,\n \"acc_norm\"\
: 0.68,\n \"acc_norm_stderr\": 0.04688261722621504\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7575757575757576,\n \"acc_stderr\": 0.03346409881055953,\n\
\ \"acc_norm\": 0.7575757575757576,\n \"acc_norm_stderr\": 0.03346409881055953\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.7777777777777778,\n \"acc_stderr\": 0.029620227874790482,\n \"\
acc_norm\": 0.7777777777777778,\n \"acc_norm_stderr\": 0.029620227874790482\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.9119170984455959,\n \"acc_stderr\": 0.02045374660160103,\n\
\ \"acc_norm\": 0.9119170984455959,\n \"acc_norm_stderr\": 0.02045374660160103\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.6512820512820513,\n \"acc_stderr\": 0.02416278028401772,\n \
\ \"acc_norm\": 0.6512820512820513,\n \"acc_norm_stderr\": 0.02416278028401772\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.337037037037037,\n \"acc_stderr\": 0.028820884666253255,\n \
\ \"acc_norm\": 0.337037037037037,\n \"acc_norm_stderr\": 0.028820884666253255\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.634453781512605,\n \"acc_stderr\": 0.031282177063684614,\n \
\ \"acc_norm\": 0.634453781512605,\n \"acc_norm_stderr\": 0.031282177063684614\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.33774834437086093,\n \"acc_stderr\": 0.038615575462551684,\n \"\
acc_norm\": 0.33774834437086093,\n \"acc_norm_stderr\": 0.038615575462551684\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.818348623853211,\n \"acc_stderr\": 0.016530617409266868,\n \"\
acc_norm\": 0.818348623853211,\n \"acc_norm_stderr\": 0.016530617409266868\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.4722222222222222,\n \"acc_stderr\": 0.0340470532865388,\n \"acc_norm\"\
: 0.4722222222222222,\n \"acc_norm_stderr\": 0.0340470532865388\n },\n\
\ \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.8480392156862745,\n\
\ \"acc_stderr\": 0.0251956584289318,\n \"acc_norm\": 0.8480392156862745,\n\
\ \"acc_norm_stderr\": 0.0251956584289318\n },\n \"harness|hendrycksTest-high_school_world_history|5\"\
: {\n \"acc\": 0.7932489451476793,\n \"acc_stderr\": 0.026361651668389094,\n\
\ \"acc_norm\": 0.7932489451476793,\n \"acc_norm_stderr\": 0.026361651668389094\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.695067264573991,\n\
\ \"acc_stderr\": 0.030898610882477515,\n \"acc_norm\": 0.695067264573991,\n\
\ \"acc_norm_stderr\": 0.030898610882477515\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.7633587786259542,\n \"acc_stderr\": 0.03727673575596913,\n\
\ \"acc_norm\": 0.7633587786259542,\n \"acc_norm_stderr\": 0.03727673575596913\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.7851239669421488,\n \"acc_stderr\": 0.037494924487096966,\n \"\
acc_norm\": 0.7851239669421488,\n \"acc_norm_stderr\": 0.037494924487096966\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7685185185185185,\n\
\ \"acc_stderr\": 0.04077494709252627,\n \"acc_norm\": 0.7685185185185185,\n\
\ \"acc_norm_stderr\": 0.04077494709252627\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7730061349693251,\n \"acc_stderr\": 0.03291099578615769,\n\
\ \"acc_norm\": 0.7730061349693251,\n \"acc_norm_stderr\": 0.03291099578615769\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.5,\n\
\ \"acc_stderr\": 0.04745789978762494,\n \"acc_norm\": 0.5,\n \
\ \"acc_norm_stderr\": 0.04745789978762494\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7864077669902912,\n \"acc_stderr\": 0.040580420156460344,\n\
\ \"acc_norm\": 0.7864077669902912,\n \"acc_norm_stderr\": 0.040580420156460344\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8547008547008547,\n\
\ \"acc_stderr\": 0.023086635086841407,\n \"acc_norm\": 0.8547008547008547,\n\
\ \"acc_norm_stderr\": 0.023086635086841407\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.7,\n \"acc_stderr\": 0.046056618647183814,\n \
\ \"acc_norm\": 0.7,\n \"acc_norm_stderr\": 0.046056618647183814\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8275862068965517,\n\
\ \"acc_stderr\": 0.013507943909371803,\n \"acc_norm\": 0.8275862068965517,\n\
\ \"acc_norm_stderr\": 0.013507943909371803\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.7398843930635838,\n \"acc_stderr\": 0.023618678310069367,\n\
\ \"acc_norm\": 0.7398843930635838,\n \"acc_norm_stderr\": 0.023618678310069367\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.38212290502793295,\n\
\ \"acc_stderr\": 0.01625113971157077,\n \"acc_norm\": 0.38212290502793295,\n\
\ \"acc_norm_stderr\": 0.01625113971157077\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.7156862745098039,\n \"acc_stderr\": 0.025829163272757482,\n\
\ \"acc_norm\": 0.7156862745098039,\n \"acc_norm_stderr\": 0.025829163272757482\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7138263665594855,\n\
\ \"acc_stderr\": 0.025670259242188936,\n \"acc_norm\": 0.7138263665594855,\n\
\ \"acc_norm_stderr\": 0.025670259242188936\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.7129629629629629,\n \"acc_stderr\": 0.025171041915309684,\n\
\ \"acc_norm\": 0.7129629629629629,\n \"acc_norm_stderr\": 0.025171041915309684\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.475177304964539,\n \"acc_stderr\": 0.02979071924382972,\n \
\ \"acc_norm\": 0.475177304964539,\n \"acc_norm_stderr\": 0.02979071924382972\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4654498044328553,\n\
\ \"acc_stderr\": 0.0127397115540457,\n \"acc_norm\": 0.4654498044328553,\n\
\ \"acc_norm_stderr\": 0.0127397115540457\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.6727941176470589,\n \"acc_stderr\": 0.028501452860396556,\n\
\ \"acc_norm\": 0.6727941176470589,\n \"acc_norm_stderr\": 0.028501452860396556\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6715686274509803,\n \"acc_stderr\": 0.018999707383162673,\n \
\ \"acc_norm\": 0.6715686274509803,\n \"acc_norm_stderr\": 0.018999707383162673\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6636363636363637,\n\
\ \"acc_stderr\": 0.04525393596302506,\n \"acc_norm\": 0.6636363636363637,\n\
\ \"acc_norm_stderr\": 0.04525393596302506\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.7224489795918367,\n \"acc_stderr\": 0.02866685779027465,\n\
\ \"acc_norm\": 0.7224489795918367,\n \"acc_norm_stderr\": 0.02866685779027465\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8308457711442786,\n\
\ \"acc_stderr\": 0.02650859065623327,\n \"acc_norm\": 0.8308457711442786,\n\
\ \"acc_norm_stderr\": 0.02650859065623327\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.86,\n \"acc_stderr\": 0.03487350880197769,\n \
\ \"acc_norm\": 0.86,\n \"acc_norm_stderr\": 0.03487350880197769\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.536144578313253,\n\
\ \"acc_stderr\": 0.03882310850890594,\n \"acc_norm\": 0.536144578313253,\n\
\ \"acc_norm_stderr\": 0.03882310850890594\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8304093567251462,\n \"acc_stderr\": 0.02878210810540171,\n\
\ \"acc_norm\": 0.8304093567251462,\n \"acc_norm_stderr\": 0.02878210810540171\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.4785801713586291,\n\
\ \"mc1_stderr\": 0.01748743214471181,\n \"mc2\": 0.6385532906891974,\n\
\ \"mc2_stderr\": 0.01575881107075601\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7987371744277821,\n \"acc_stderr\": 0.01126851997157768\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.7012888551933283,\n \
\ \"acc_stderr\": 0.012607137125693627\n }\n}\n```"
repo_url: https://huggingface.co/macadeliccc/piccolo-math-2x7b
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_01_23T21_39_07.430696
path:
- '**/details_harness|arc:challenge|25_2024-01-23T21-39-07.430696.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-01-23T21-39-07.430696.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_01_23T21_39_07.430696
path:
- '**/details_harness|gsm8k|5_2024-01-23T21-39-07.430696.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-01-23T21-39-07.430696.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_01_23T21_39_07.430696
path:
- '**/details_harness|hellaswag|10_2024-01-23T21-39-07.430696.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-01-23T21-39-07.430696.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_01_23T21_39_07.430696
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-23T21-39-07.430696.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-23T21-39-07.430696.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-23T21-39-07.430696.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-23T21-39-07.430696.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-23T21-39-07.430696.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-23T21-39-07.430696.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-23T21-39-07.430696.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-23T21-39-07.430696.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-23T21-39-07.430696.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-23T21-39-07.430696.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-23T21-39-07.430696.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-23T21-39-07.430696.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-23T21-39-07.430696.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-23T21-39-07.430696.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-23T21-39-07.430696.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-23T21-39-07.430696.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-23T21-39-07.430696.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-23T21-39-07.430696.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-23T21-39-07.430696.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-23T21-39-07.430696.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-23T21-39-07.430696.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-23T21-39-07.430696.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-23T21-39-07.430696.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-23T21-39-07.430696.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-23T21-39-07.430696.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-23T21-39-07.430696.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-23T21-39-07.430696.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-23T21-39-07.430696.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-23T21-39-07.430696.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-23T21-39-07.430696.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-23T21-39-07.430696.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-23T21-39-07.430696.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-23T21-39-07.430696.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-23T21-39-07.430696.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-01-23T21-39-07.430696.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-23T21-39-07.430696.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-23T21-39-07.430696.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-23T21-39-07.430696.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-01-23T21-39-07.430696.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-01-23T21-39-07.430696.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-23T21-39-07.430696.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-23T21-39-07.430696.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-23T21-39-07.430696.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-23T21-39-07.430696.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-23T21-39-07.430696.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-23T21-39-07.430696.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-23T21-39-07.430696.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-23T21-39-07.430696.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-23T21-39-07.430696.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-23T21-39-07.430696.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-23T21-39-07.430696.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-23T21-39-07.430696.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-23T21-39-07.430696.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-01-23T21-39-07.430696.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-23T21-39-07.430696.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-01-23T21-39-07.430696.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-23T21-39-07.430696.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-23T21-39-07.430696.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-23T21-39-07.430696.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-23T21-39-07.430696.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-23T21-39-07.430696.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-23T21-39-07.430696.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-23T21-39-07.430696.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-23T21-39-07.430696.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-23T21-39-07.430696.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-23T21-39-07.430696.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-23T21-39-07.430696.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-23T21-39-07.430696.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-23T21-39-07.430696.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-23T21-39-07.430696.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-23T21-39-07.430696.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-23T21-39-07.430696.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-23T21-39-07.430696.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-23T21-39-07.430696.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-23T21-39-07.430696.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-23T21-39-07.430696.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-23T21-39-07.430696.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-23T21-39-07.430696.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-23T21-39-07.430696.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-23T21-39-07.430696.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-23T21-39-07.430696.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-23T21-39-07.430696.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-23T21-39-07.430696.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-23T21-39-07.430696.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-23T21-39-07.430696.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-23T21-39-07.430696.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-23T21-39-07.430696.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-23T21-39-07.430696.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-23T21-39-07.430696.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-23T21-39-07.430696.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-23T21-39-07.430696.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-01-23T21-39-07.430696.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-23T21-39-07.430696.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-23T21-39-07.430696.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-23T21-39-07.430696.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-01-23T21-39-07.430696.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-01-23T21-39-07.430696.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-23T21-39-07.430696.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-23T21-39-07.430696.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-23T21-39-07.430696.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-23T21-39-07.430696.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-23T21-39-07.430696.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-23T21-39-07.430696.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-23T21-39-07.430696.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-23T21-39-07.430696.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-23T21-39-07.430696.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-23T21-39-07.430696.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-23T21-39-07.430696.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-23T21-39-07.430696.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-23T21-39-07.430696.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-01-23T21-39-07.430696.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-23T21-39-07.430696.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-01-23T21-39-07.430696.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-23T21-39-07.430696.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_01_23T21_39_07.430696
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-23T21-39-07.430696.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-23T21-39-07.430696.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_01_23T21_39_07.430696
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-23T21-39-07.430696.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-23T21-39-07.430696.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_01_23T21_39_07.430696
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-23T21-39-07.430696.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-23T21-39-07.430696.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_01_23T21_39_07.430696
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-23T21-39-07.430696.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-23T21-39-07.430696.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_01_23T21_39_07.430696
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-23T21-39-07.430696.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-23T21-39-07.430696.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_01_23T21_39_07.430696
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-23T21-39-07.430696.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-23T21-39-07.430696.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_01_23T21_39_07.430696
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-23T21-39-07.430696.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-23T21-39-07.430696.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_01_23T21_39_07.430696
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-23T21-39-07.430696.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-23T21-39-07.430696.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_01_23T21_39_07.430696
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-23T21-39-07.430696.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-23T21-39-07.430696.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_01_23T21_39_07.430696
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-23T21-39-07.430696.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-23T21-39-07.430696.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_01_23T21_39_07.430696
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-23T21-39-07.430696.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-23T21-39-07.430696.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_01_23T21_39_07.430696
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-23T21-39-07.430696.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-23T21-39-07.430696.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_01_23T21_39_07.430696
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-23T21-39-07.430696.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-23T21-39-07.430696.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_01_23T21_39_07.430696
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-23T21-39-07.430696.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-23T21-39-07.430696.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_01_23T21_39_07.430696
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-23T21-39-07.430696.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-23T21-39-07.430696.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_01_23T21_39_07.430696
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-23T21-39-07.430696.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-23T21-39-07.430696.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_01_23T21_39_07.430696
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-23T21-39-07.430696.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-23T21-39-07.430696.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_01_23T21_39_07.430696
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-23T21-39-07.430696.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-23T21-39-07.430696.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_01_23T21_39_07.430696
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-23T21-39-07.430696.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-23T21-39-07.430696.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_01_23T21_39_07.430696
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-23T21-39-07.430696.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-23T21-39-07.430696.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_01_23T21_39_07.430696
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-23T21-39-07.430696.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-23T21-39-07.430696.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_01_23T21_39_07.430696
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-23T21-39-07.430696.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-23T21-39-07.430696.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_01_23T21_39_07.430696
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-23T21-39-07.430696.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-23T21-39-07.430696.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_01_23T21_39_07.430696
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-23T21-39-07.430696.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-23T21-39-07.430696.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_01_23T21_39_07.430696
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-23T21-39-07.430696.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-23T21-39-07.430696.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_01_23T21_39_07.430696
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-23T21-39-07.430696.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-23T21-39-07.430696.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_01_23T21_39_07.430696
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-23T21-39-07.430696.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-23T21-39-07.430696.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_01_23T21_39_07.430696
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-23T21-39-07.430696.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-23T21-39-07.430696.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_01_23T21_39_07.430696
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-23T21-39-07.430696.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-23T21-39-07.430696.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_01_23T21_39_07.430696
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-23T21-39-07.430696.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-23T21-39-07.430696.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_01_23T21_39_07.430696
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-23T21-39-07.430696.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-23T21-39-07.430696.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_01_23T21_39_07.430696
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-23T21-39-07.430696.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-23T21-39-07.430696.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_01_23T21_39_07.430696
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-23T21-39-07.430696.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-23T21-39-07.430696.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_01_23T21_39_07.430696
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-23T21-39-07.430696.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-23T21-39-07.430696.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_01_23T21_39_07.430696
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-01-23T21-39-07.430696.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-01-23T21-39-07.430696.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_01_23T21_39_07.430696
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-23T21-39-07.430696.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-23T21-39-07.430696.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_01_23T21_39_07.430696
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-23T21-39-07.430696.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-23T21-39-07.430696.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_01_23T21_39_07.430696
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-23T21-39-07.430696.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-23T21-39-07.430696.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_01_23T21_39_07.430696
path:
- '**/details_harness|hendrycksTest-management|5_2024-01-23T21-39-07.430696.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-01-23T21-39-07.430696.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_01_23T21_39_07.430696
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-01-23T21-39-07.430696.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-01-23T21-39-07.430696.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_01_23T21_39_07.430696
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-23T21-39-07.430696.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-23T21-39-07.430696.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_01_23T21_39_07.430696
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-23T21-39-07.430696.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-23T21-39-07.430696.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_01_23T21_39_07.430696
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-23T21-39-07.430696.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-23T21-39-07.430696.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_01_23T21_39_07.430696
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-23T21-39-07.430696.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-23T21-39-07.430696.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_01_23T21_39_07.430696
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-23T21-39-07.430696.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-23T21-39-07.430696.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_01_23T21_39_07.430696
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-23T21-39-07.430696.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-23T21-39-07.430696.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_01_23T21_39_07.430696
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-23T21-39-07.430696.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-23T21-39-07.430696.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_01_23T21_39_07.430696
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-23T21-39-07.430696.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-23T21-39-07.430696.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_01_23T21_39_07.430696
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-23T21-39-07.430696.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-23T21-39-07.430696.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_01_23T21_39_07.430696
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-23T21-39-07.430696.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-23T21-39-07.430696.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_01_23T21_39_07.430696
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-23T21-39-07.430696.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-23T21-39-07.430696.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_01_23T21_39_07.430696
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-23T21-39-07.430696.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-23T21-39-07.430696.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_01_23T21_39_07.430696
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-23T21-39-07.430696.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-23T21-39-07.430696.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_01_23T21_39_07.430696
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-01-23T21-39-07.430696.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-01-23T21-39-07.430696.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_01_23T21_39_07.430696
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-23T21-39-07.430696.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-23T21-39-07.430696.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_01_23T21_39_07.430696
path:
- '**/details_harness|hendrycksTest-virology|5_2024-01-23T21-39-07.430696.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-01-23T21-39-07.430696.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_01_23T21_39_07.430696
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-23T21-39-07.430696.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-23T21-39-07.430696.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_01_23T21_39_07.430696
path:
- '**/details_harness|truthfulqa:mc|0_2024-01-23T21-39-07.430696.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-01-23T21-39-07.430696.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_01_23T21_39_07.430696
path:
- '**/details_harness|winogrande|5_2024-01-23T21-39-07.430696.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-01-23T21-39-07.430696.parquet'
- config_name: results
data_files:
- split: 2024_01_23T21_39_07.430696
path:
- results_2024-01-23T21-39-07.430696.parquet
- split: latest
path:
- results_2024-01-23T21-39-07.430696.parquet
---
# Dataset Card for Evaluation run of macadeliccc/piccolo-math-2x7b
Dataset automatically created during the evaluation run of model [macadeliccc/piccolo-math-2x7b](https://huggingface.co/macadeliccc/piccolo-math-2x7b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_macadeliccc__piccolo-math-2x7b",
"harness_winogrande_5",
split="train")
```
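The aggregated metrics live in the `results` configuration, and every configuration also exposes a `latest` split (see the `configs` section above), so the same pattern retrieves the summary numbers:

```python
from datasets import load_dataset

# "results" config + "latest" split, both declared in the YAML configs above.
results = load_dataset(
    "open-llm-leaderboard/details_macadeliccc__piccolo-math-2x7b",
    "results",
    split="latest",
)
```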
## Latest results
These are the [latest results from run 2024-01-23T21:39:07.430696](https://huggingface.co/datasets/open-llm-leaderboard/details_macadeliccc__piccolo-math-2x7b/blob/main/results_2024-01-23T21-39-07.430696.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.64218516330683,
"acc_stderr": 0.03223781750024571,
"acc_norm": 0.6418148031090513,
"acc_norm_stderr": 0.03290014345969884,
"mc1": 0.4785801713586291,
"mc1_stderr": 0.01748743214471181,
"mc2": 0.6385532906891974,
"mc2_stderr": 0.01575881107075601
},
"harness|arc:challenge|25": {
"acc": 0.6638225255972696,
"acc_stderr": 0.013804855026205763,
"acc_norm": 0.6911262798634812,
"acc_norm_stderr": 0.013501770929344
},
"harness|hellaswag|10": {
"acc": 0.7062338179645489,
"acc_stderr": 0.004545552424153376,
"acc_norm": 0.8727345150368453,
"acc_norm_stderr": 0.003325890225529858
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.29,
"acc_stderr": 0.04560480215720684,
"acc_norm": 0.29,
"acc_norm_stderr": 0.04560480215720684
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6148148148148148,
"acc_stderr": 0.04203921040156279,
"acc_norm": 0.6148148148148148,
"acc_norm_stderr": 0.04203921040156279
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6973684210526315,
"acc_stderr": 0.03738520676119669,
"acc_norm": 0.6973684210526315,
"acc_norm_stderr": 0.03738520676119669
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.62,
"acc_stderr": 0.048783173121456316,
"acc_norm": 0.62,
"acc_norm_stderr": 0.048783173121456316
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.7094339622641509,
"acc_stderr": 0.027943219989337142,
"acc_norm": 0.7094339622641509,
"acc_norm_stderr": 0.027943219989337142
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7430555555555556,
"acc_stderr": 0.03653946969442099,
"acc_norm": 0.7430555555555556,
"acc_norm_stderr": 0.03653946969442099
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.5,
"acc_stderr": 0.050251890762960605,
"acc_norm": 0.5,
"acc_norm_stderr": 0.050251890762960605
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.48,
"acc_stderr": 0.050211673156867795,
"acc_norm": 0.48,
"acc_norm_stderr": 0.050211673156867795
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6242774566473989,
"acc_stderr": 0.036928207672648664,
"acc_norm": 0.6242774566473989,
"acc_norm_stderr": 0.036928207672648664
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.47058823529411764,
"acc_stderr": 0.049665709039785295,
"acc_norm": 0.47058823529411764,
"acc_norm_stderr": 0.049665709039785295
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.76,
"acc_stderr": 0.042923469599092816,
"acc_norm": 0.76,
"acc_norm_stderr": 0.042923469599092816
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.574468085106383,
"acc_stderr": 0.03232146916224468,
"acc_norm": 0.574468085106383,
"acc_norm_stderr": 0.03232146916224468
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.5263157894736842,
"acc_stderr": 0.046970851366478626,
"acc_norm": 0.5263157894736842,
"acc_norm_stderr": 0.046970851366478626
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5655172413793104,
"acc_stderr": 0.04130740879555498,
"acc_norm": 0.5655172413793104,
"acc_norm_stderr": 0.04130740879555498
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.4021164021164021,
"acc_stderr": 0.025253032554997685,
"acc_norm": 0.4021164021164021,
"acc_norm_stderr": 0.025253032554997685
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.3968253968253968,
"acc_stderr": 0.043758884927270605,
"acc_norm": 0.3968253968253968,
"acc_norm_stderr": 0.043758884927270605
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7645161290322581,
"acc_stderr": 0.02413763242933771,
"acc_norm": 0.7645161290322581,
"acc_norm_stderr": 0.02413763242933771
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.4975369458128079,
"acc_stderr": 0.03517945038691063,
"acc_norm": 0.4975369458128079,
"acc_norm_stderr": 0.03517945038691063
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.68,
"acc_stderr": 0.04688261722621504,
"acc_norm": 0.68,
"acc_norm_stderr": 0.04688261722621504
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7575757575757576,
"acc_stderr": 0.03346409881055953,
"acc_norm": 0.7575757575757576,
"acc_norm_stderr": 0.03346409881055953
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7777777777777778,
"acc_stderr": 0.029620227874790482,
"acc_norm": 0.7777777777777778,
"acc_norm_stderr": 0.029620227874790482
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.9119170984455959,
"acc_stderr": 0.02045374660160103,
"acc_norm": 0.9119170984455959,
"acc_norm_stderr": 0.02045374660160103
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6512820512820513,
"acc_stderr": 0.02416278028401772,
"acc_norm": 0.6512820512820513,
"acc_norm_stderr": 0.02416278028401772
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.337037037037037,
"acc_stderr": 0.028820884666253255,
"acc_norm": 0.337037037037037,
"acc_norm_stderr": 0.028820884666253255
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.634453781512605,
"acc_stderr": 0.031282177063684614,
"acc_norm": 0.634453781512605,
"acc_norm_stderr": 0.031282177063684614
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.33774834437086093,
"acc_stderr": 0.038615575462551684,
"acc_norm": 0.33774834437086093,
"acc_norm_stderr": 0.038615575462551684
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.818348623853211,
"acc_stderr": 0.016530617409266868,
"acc_norm": 0.818348623853211,
"acc_norm_stderr": 0.016530617409266868
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.4722222222222222,
"acc_stderr": 0.0340470532865388,
"acc_norm": 0.4722222222222222,
"acc_norm_stderr": 0.0340470532865388
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8480392156862745,
"acc_stderr": 0.0251956584289318,
"acc_norm": 0.8480392156862745,
"acc_norm_stderr": 0.0251956584289318
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7932489451476793,
"acc_stderr": 0.026361651668389094,
"acc_norm": 0.7932489451476793,
"acc_norm_stderr": 0.026361651668389094
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.695067264573991,
"acc_stderr": 0.030898610882477515,
"acc_norm": 0.695067264573991,
"acc_norm_stderr": 0.030898610882477515
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7633587786259542,
"acc_stderr": 0.03727673575596913,
"acc_norm": 0.7633587786259542,
"acc_norm_stderr": 0.03727673575596913
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7851239669421488,
"acc_stderr": 0.037494924487096966,
"acc_norm": 0.7851239669421488,
"acc_norm_stderr": 0.037494924487096966
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7685185185185185,
"acc_stderr": 0.04077494709252627,
"acc_norm": 0.7685185185185185,
"acc_norm_stderr": 0.04077494709252627
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7730061349693251,
"acc_stderr": 0.03291099578615769,
"acc_norm": 0.7730061349693251,
"acc_norm_stderr": 0.03291099578615769
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.5,
"acc_stderr": 0.04745789978762494,
"acc_norm": 0.5,
"acc_norm_stderr": 0.04745789978762494
},
"harness|hendrycksTest-management|5": {
"acc": 0.7864077669902912,
"acc_stderr": 0.040580420156460344,
"acc_norm": 0.7864077669902912,
"acc_norm_stderr": 0.040580420156460344
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8547008547008547,
"acc_stderr": 0.023086635086841407,
"acc_norm": 0.8547008547008547,
"acc_norm_stderr": 0.023086635086841407
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.7,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.7,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8275862068965517,
"acc_stderr": 0.013507943909371803,
"acc_norm": 0.8275862068965517,
"acc_norm_stderr": 0.013507943909371803
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7398843930635838,
"acc_stderr": 0.023618678310069367,
"acc_norm": 0.7398843930635838,
"acc_norm_stderr": 0.023618678310069367
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.38212290502793295,
"acc_stderr": 0.01625113971157077,
"acc_norm": 0.38212290502793295,
"acc_norm_stderr": 0.01625113971157077
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7156862745098039,
"acc_stderr": 0.025829163272757482,
"acc_norm": 0.7156862745098039,
"acc_norm_stderr": 0.025829163272757482
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7138263665594855,
"acc_stderr": 0.025670259242188936,
"acc_norm": 0.7138263665594855,
"acc_norm_stderr": 0.025670259242188936
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7129629629629629,
"acc_stderr": 0.025171041915309684,
"acc_norm": 0.7129629629629629,
"acc_norm_stderr": 0.025171041915309684
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.475177304964539,
"acc_stderr": 0.02979071924382972,
"acc_norm": 0.475177304964539,
"acc_norm_stderr": 0.02979071924382972
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4654498044328553,
"acc_stderr": 0.0127397115540457,
"acc_norm": 0.4654498044328553,
"acc_norm_stderr": 0.0127397115540457
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6727941176470589,
"acc_stderr": 0.028501452860396556,
"acc_norm": 0.6727941176470589,
"acc_norm_stderr": 0.028501452860396556
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6715686274509803,
"acc_stderr": 0.018999707383162673,
"acc_norm": 0.6715686274509803,
"acc_norm_stderr": 0.018999707383162673
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6636363636363637,
"acc_stderr": 0.04525393596302506,
"acc_norm": 0.6636363636363637,
"acc_norm_stderr": 0.04525393596302506
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7224489795918367,
"acc_stderr": 0.02866685779027465,
"acc_norm": 0.7224489795918367,
"acc_norm_stderr": 0.02866685779027465
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8308457711442786,
"acc_stderr": 0.02650859065623327,
"acc_norm": 0.8308457711442786,
"acc_norm_stderr": 0.02650859065623327
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.86,
"acc_stderr": 0.03487350880197769,
"acc_norm": 0.86,
"acc_norm_stderr": 0.03487350880197769
},
"harness|hendrycksTest-virology|5": {
"acc": 0.536144578313253,
"acc_stderr": 0.03882310850890594,
"acc_norm": 0.536144578313253,
"acc_norm_stderr": 0.03882310850890594
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8304093567251462,
"acc_stderr": 0.02878210810540171,
"acc_norm": 0.8304093567251462,
"acc_norm_stderr": 0.02878210810540171
},
"harness|truthfulqa:mc|0": {
"mc1": 0.4785801713586291,
"mc1_stderr": 0.01748743214471181,
"mc2": 0.6385532906891974,
"mc2_stderr": 0.01575881107075601
},
"harness|winogrande|5": {
"acc": 0.7987371744277821,
"acc_stderr": 0.01126851997157768
},
"harness|gsm8k|5": {
"acc": 0.7012888551933283,
"acc_stderr": 0.012607137125693627
}
}
```
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
zolak/twitter_dataset_81_1713161412 | ---
dataset_info:
features:
- name: id
dtype: string
- name: tweet_content
dtype: string
- name: user_name
dtype: string
- name: user_id
dtype: string
- name: created_at
dtype: string
- name: url
dtype: string
- name: favourite_count
dtype: int64
- name: scraped_at
dtype: string
- name: image_urls
dtype: string
splits:
- name: train
num_bytes: 349036
num_examples: 912
download_size: 171323
dataset_size: 349036
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
joey234/mmlu-abstract_algebra-neg | ---
dataset_info:
features:
- name: choices
sequence: string
- name: answer
dtype:
class_label:
names:
'0': A
'1': B
'2': C
'3': D
- name: question
dtype: string
splits:
- name: test
num_bytes: 13818
num_examples: 100
download_size: 7619
dataset_size: 13818
---
# Dataset Card for "mmlu-abstract_algebra-neg"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
showchen/MakiseKurisu | ---
license: apache-2.0
---
|
victor/autotrain-data-satellite-image-classification | ---
task_categories:
- image-classification
---
# AutoTrain Dataset for project: satellite-image-classification
## Dataset Description
This dataset has been automatically processed by AutoTrain for project satellite-image-classification.
### Languages
The BCP-47 code for the dataset's language is unk.
## Dataset Structure
### Data Instances
A sample from this dataset looks as follows:
```json
[
{
"image": "<256x256 CMYK PIL image>",
"target": 0
},
{
"image": "<256x256 CMYK PIL image>",
"target": 0
}
]
```
### Dataset Fields
The dataset has the following fields (also called "features"):
```json
{
"image": "Image(decode=True, id=None)",
"target": "ClassLabel(num_classes=1, names=['cloudy'], id=None)"
}
```
### Dataset Splits
This dataset is split into train and validation splits. The split sizes are as follows:
| Split name | Num samples |
| ------------ | ------------------- |
| train | 1200 |
| valid | 300 |
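A minimal loading sketch (this assumes the repo loads directly with `load_dataset`; split names follow the table above):
```python
from datasets import load_dataset

# Load both splits of the AutoTrain dataset
ds = load_dataset("victor/autotrain-data-satellite-image-classification")

image = ds["train"][0]["image"]   # a 256x256 CMYK PIL image
label = ds["train"][0]["target"]  # ClassLabel index (e.g. 0 -> "cloudy")
```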
|
ethz-spylab/competition_eval_dataset | ---
dataset_info:
features:
- name: chosen
dtype: string
- name: rejected
dtype: string
splits:
- name: train
num_bytes: 3177260
num_examples: 2312
download_size: 1769123
dataset_size: 3177260
---
# Evaluation dataset for the Trojan Competition
This dataset was used to evaluate submissions to the RLHF Trojan Detection competition co-located at IEEE SaTML 2024. For more information, visit the [official competition website](https://github.com/ethz-spylab/rlhf_trojan_competition).
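A minimal loading sketch, based on the `dataset_info` above:
```python
from datasets import load_dataset

# Each row holds a "chosen" and a "rejected" completion
eval_ds = load_dataset("ethz-spylab/competition_eval_dataset", split="train")
print(eval_ds[0]["chosen"])
print(eval_ds[0]["rejected"])
```
|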
liuyanchen1015/MULTI_VALUE_qqp_invariant_tag_amnt | ---
dataset_info:
features:
- name: question1
dtype: string
- name: question2
dtype: string
- name: label
dtype: int64
- name: idx
dtype: int64
- name: value_score
dtype: int64
splits:
- name: dev
num_bytes: 215577
num_examples: 1209
- name: test
num_bytes: 2515389
num_examples: 13862
- name: train
num_bytes: 1995047
num_examples: 11077
download_size: 2842573
dataset_size: 4726013
---
# Dataset Card for "MULTI_VALUE_qqp_invariant_tag_amnt"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
kiarash13p/Tallberg | ---
language:
- en
tags:
- geology
- mining
pretty_name: Tallberg
size_categories:
- n<1K
--- |
13GP/training | ---
license: mit
---
|
selimyagci/edos_data | ---
license: unknown
---
|
Cohere/wikipedia-22-12-de-embeddings | ---
annotations_creators:
- expert-generated
language:
- de
multilinguality:
- multilingual
size_categories: []
source_datasets: []
tags: []
task_categories:
- text-retrieval
license:
- apache-2.0
task_ids:
- document-retrieval
---
# Wikipedia (de) embedded with cohere.ai `multilingual-22-12` encoder
We encoded [Wikipedia (de)](https://de.wikipedia.org) using the [cohere.ai](https://txt.cohere.ai/multilingual/) `multilingual-22-12` embedding model.
To get an overview of how this dataset was created and pre-processed, have a look at [Cohere/wikipedia-22-12](https://huggingface.co/datasets/Cohere/wikipedia-22-12).
## Embeddings
We computed the embeddings for `title+" "+text` using our `multilingual-22-12` embedding model, a state-of-the-art model for semantic search in 100 languages. If you want to learn more about this model, have a look at [cohere.ai multilingual embedding model](https://txt.cohere.ai/multilingual/).
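For reference, embedding a new passage the same way might look like this; a minimal sketch reusing the `co.embed` call from the search example below (the title and text are hypothetical placeholders):
```python
import cohere

co = cohere.Client("<<COHERE_API_KEY>>")  # add your cohere API key from www.cohere.com

title = "Beispieltitel"  # hypothetical document title
text = "Beispieltext."   # hypothetical passage text

# Documents in this dataset were embedded as title + " " + text
doc_emb = co.embed(texts=[title + " " + text], model='multilingual-22-12').embeddings[0]
```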
## Further languages
We provide embeddings of Wikipedia in many different languages:
[ar](https://huggingface.co/datasets/Cohere/wikipedia-22-12-ar-embeddings), [de](https://huggingface.co/datasets/Cohere/wikipedia-22-12-de-embeddings), [en](https://huggingface.co/datasets/Cohere/wikipedia-22-12-en-embeddings), [es](https://huggingface.co/datasets/Cohere/wikipedia-22-12-es-embeddings), [fr](https://huggingface.co/datasets/Cohere/wikipedia-22-12-fr-embeddings), [hi](https://huggingface.co/datasets/Cohere/wikipedia-22-12-hi-embeddings), [it](https://huggingface.co/datasets/Cohere/wikipedia-22-12-it-embeddings), [ja](https://huggingface.co/datasets/Cohere/wikipedia-22-12-ja-embeddings), [ko](https://huggingface.co/datasets/Cohere/wikipedia-22-12-ko-embeddings), [simple english](https://huggingface.co/datasets/Cohere/wikipedia-22-12-simple-embeddings) and [zh](https://huggingface.co/datasets/Cohere/wikipedia-22-12-zh-embeddings).
You can find the Wikipedia datasets without embeddings at [Cohere/wikipedia-22-12](https://huggingface.co/datasets/Cohere/wikipedia-22-12).
## Loading the dataset
You can either load the dataset like this:
```python
from datasets import load_dataset
docs = load_dataset(f"Cohere/wikipedia-22-12-de-embeddings", split="train")
```
Or you can stream it without downloading it first:
```python
from datasets import load_dataset
docs = load_dataset(f"Cohere/wikipedia-22-12-de-embeddings", split="train", streaming=True)
for doc in docs:
docid = doc['id']
title = doc['title']
text = doc['text']
emb = doc['emb']
```
## Search
A full search example:
```python
#Run: pip install cohere datasets
from datasets import load_dataset
import torch
import cohere
co = cohere.Client("<<COHERE_API_KEY>>")  # Add your cohere API key from www.cohere.com
# Load at most 1000 documents + embeddings
max_docs = 1000
docs_stream = load_dataset("Cohere/wikipedia-22-12-de-embeddings", split="train", streaming=True)
docs = []
doc_embeddings = []
for doc in docs_stream:
docs.append(doc)
doc_embeddings.append(doc['emb'])
if len(docs) >= max_docs:
break
doc_embeddings = torch.tensor(doc_embeddings)
query = 'Who founded Youtube'
response = co.embed(texts=[query], model='multilingual-22-12')
query_embedding = response.embeddings
query_embedding = torch.tensor(query_embedding)
# Compute dot score between query embedding and document embeddings
dot_scores = torch.mm(query_embedding, doc_embeddings.transpose(0, 1))
top_k = torch.topk(dot_scores, k=3)
# Print results
print("Query:", query)
for doc_id in top_k.indices[0].tolist():
print(docs[doc_id]['title'])
print(docs[doc_id]['text'], "\n")
```
## Performance
You can find performance on the MIRACL dataset (a semantic search evaluation dataset) here: [miracl-en-queries-22-12#performance](https://huggingface.co/datasets/Cohere/miracl-en-queries-22-12#performance) |
adiyghub/openorca-small-1K | ---
license: mit
dataset_info:
features:
- name: id
dtype: string
- name: system_prompt
dtype: string
- name: question
dtype: string
- name: response
dtype: string
splits:
- name: train
num_bytes: 1705581.2557290248
num_examples: 1000
download_size: 942136
dataset_size: 1705581.2557290248
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Details
1,000 randomly sampled examples from OpenOrca.
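A minimal loading sketch, based on the `dataset_info` above (fields follow the OpenOrca schema):
```python
from datasets import load_dataset

# Load the 1,000-example subset
ds = load_dataset("adiyghub/openorca-small-1K", split="train")

sample = ds[0]
print(sample["system_prompt"])
print(sample["question"])
print(sample["response"])
```
|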
Boss9xy/ok234 | ---
license: apache-2.0
---
|
Anderson1992/dirceu_rabelo_globo | ---
license: openrail
---
|
zxh4546/msraction3d-2048-24frames | ---
dataset_info:
features:
- name: frame_dir
dtype: string
- name: index
dtype: int64
- name: clip
sequence:
sequence:
sequence: float64
- name: label
dtype: int64
- name: subject_name
dtype: int64
splits:
- name: train
num_bytes: 6163528156
num_examples: 4478
- name: test
num_bytes: 7372009112
num_examples: 5356
download_size: 2522652176
dataset_size: 13535537268
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
---
|
Shayan01/islamic-data | ---
license: mit
---
|
Xhaheen/dreambooth-hackathon-images-srkman-2 | ---
dataset_info:
features:
- name: image
dtype: image
splits:
- name: train
num_bytes: 4082680.0
num_examples: 20
download_size: 4081453
dataset_size: 4082680.0
---
# Dataset Card for "dreambooth-hackathon-images-srkman-2"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
joey234/mmlu-logical_fallacies-dev | ---
dataset_info:
features:
- name: question
dtype: string
- name: choices
sequence: string
- name: answer
dtype:
class_label:
names:
'0': A
'1': B
'2': C
'3': D
- name: negate_openai_prompt
struct:
- name: content
dtype: string
- name: role
dtype: string
splits:
- name: dev
num_bytes: 2992
num_examples: 5
download_size: 6735
dataset_size: 2992
---
# Dataset Card for "mmlu-logical_fallacies-dev"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
vaishali/spider-tableQA | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
dataset_info:
features:
- name: query
dtype: string
- name: question
dtype: string
- name: table_names
sequence: string
- name: tables
sequence: string
- name: answer
dtype: string
- name: source
dtype: string
- name: target
dtype: string
splits:
- name: train
num_bytes: 2203191673
num_examples: 6715
- name: validation
num_bytes: 434370435
num_examples: 985
download_size: 535322409
dataset_size: 2637562108
task_categories:
- table-question-answering
---
# Dataset Card for "spider-tableQA"
# Usage
```python
import pandas as pd
from datasets import load_dataset
spider_tableQA = load_dataset("vaishali/spider-tableQA")
for sample in spider_tableQA['train']:
question = sample['question']
sql_query = sample['query']
input_table_names = sample["table_names"]
input_tables = [pd.read_json(table, orient='split') for table in sample['tables']]
answer = pd.read_json(sample['answer'], orient='split')
# flattened input/output
input_to_model = sample["source"]
target = sample["target"]
```
# BibTeX entry and citation info
```
@inproceedings{pal-etal-2023-multitabqa,
title = "{M}ulti{T}ab{QA}: Generating Tabular Answers for Multi-Table Question Answering",
author = "Pal, Vaishali and
Yates, Andrew and
Kanoulas, Evangelos and
de Rijke, Maarten",
booktitle = "Proceedings of the 61st Annual Meeting of the Association for Computational Linguistics (Volume 1: Long Papers)",
month = jul,
year = "2023",
address = "Toronto, Canada",
publisher = "Association for Computational Linguistics",
url = "https://aclanthology.org/2023.acl-long.348",
doi = "10.18653/v1/2023.acl-long.348",
pages = "6322--6334",
abstract = "Recent advances in tabular question answering (QA) with large language models are constrained in their coverage and only answer questions over a single table. However, real-world queries are complex in nature, often over multiple tables in a relational database or web page. Single table questions do not involve common table operations such as set operations, Cartesian products (joins), or nested queries. Furthermore, multi-table operations often result in a tabular output, which necessitates table generation capabilities of tabular QA models. To fill this gap, we propose a new task of answering questions over multiple tables. Our model, MultiTabQA, not only answers questions over multiple tables, but also generalizes to generate tabular answers. To enable effective training, we build a pre-training dataset comprising of 132,645 SQL queries and tabular answers. Further, we evaluate the generated tables by introducing table-specific metrics of varying strictness assessing various levels of granularity of the table structure. MultiTabQA outperforms state-of-the-art single table QA models adapted to a multi-table QA setting by finetuning on three datasets: Spider, Atis and GeoQuery.",
}
```
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
SumomoLee/tmp | ---
license: apache-2.0
---
|
autoevaluate/autoeval-staging-eval-project-squad-6abc415f-12465657 | ---
type: predictions
tags:
- autotrain
- evaluation
datasets:
- squad
eval_info:
task: extractive_question_answering
model: deepset/deberta-v3-large-squad2
metrics: []
dataset_name: squad
dataset_config: plain_text
dataset_split: validation
col_mapping:
context: context
question: question
answers-text: answers.text
answers-answer_start: answers.answer_start
---
# Dataset Card for AutoTrain Evaluator
This repository contains model predictions generated by [AutoTrain](https://huggingface.co/autotrain) for the following task and dataset:
* Task: Question Answering
* Model: deepset/deberta-v3-large-squad2
* Dataset: squad
* Config: plain_text
* Split: validation
To run new evaluation jobs, visit Hugging Face's [automatic model evaluator](https://huggingface.co/spaces/autoevaluate/model-evaluator).
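To inspect the stored predictions locally, a minimal sketch (this assumes the repository can be read as a standard `datasets` repo; the actual file layout may differ):
```python
from datasets import load_dataset

# Load whatever prediction files AutoTrain pushed to this repository
preds = load_dataset("autoevaluate/autoeval-staging-eval-project-squad-6abc415f-12465657")
print(preds)
```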
## Contributions
Thanks to [@sjrlee](https://huggingface.co/sjrlee) for evaluating this model. |