datasetId | card |
|---|---|
open-llm-leaderboard/details_JoSw-14__LoKuS-13B | ---
pretty_name: Evaluation run of JoSw-14/LoKuS-13B
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [JoSw-14/LoKuS-13B](https://huggingface.co/JoSw-14/LoKuS-13B) on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 61 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_JoSw-14__LoKuS-13B\"\
,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\
\nThese are the [latest results from run 2023-08-30T14:44:55.625857](https://huggingface.co/datasets/open-llm-leaderboard/details_JoSw-14__LoKuS-13B/blob/main/results_2023-08-30T14%3A44%3A55.625857.json) (note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.7392017775637323,\n\
\ \"acc_stderr\": 0.02998325172953868,\n \"acc_norm\": 0.7430126648780258,\n\
\ \"acc_norm_stderr\": 0.02996682963285512,\n \"mc1\": 0.3537331701346389,\n\
\ \"mc1_stderr\": 0.016737814358846147,\n \"mc2\": 0.5139290818009655,\n\
\ \"mc2_stderr\": 0.015442752112557084\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.5418088737201365,\n \"acc_stderr\": 0.0145602203087147,\n\
\ \"acc_norm\": 0.5776450511945392,\n \"acc_norm_stderr\": 0.014434138713379986\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6050587532364071,\n\
\ \"acc_stderr\": 0.004878390226591711,\n \"acc_norm\": 0.7940649273053176,\n\
\ \"acc_norm_stderr\": 0.004035568117596522\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.38,\n \"acc_stderr\": 0.048783173121456316,\n \
\ \"acc_norm\": 0.38,\n \"acc_norm_stderr\": 0.048783173121456316\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.7333333333333333,\n\
\ \"acc_stderr\": 0.038201699145179055,\n \"acc_norm\": 0.7333333333333333,\n\
\ \"acc_norm_stderr\": 0.038201699145179055\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.7697368421052632,\n \"acc_stderr\": 0.034260594244031654,\n\
\ \"acc_norm\": 0.7697368421052632,\n \"acc_norm_stderr\": 0.034260594244031654\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.76,\n\
\ \"acc_stderr\": 0.042923469599092816,\n \"acc_norm\": 0.76,\n \
\ \"acc_norm_stderr\": 0.042923469599092816\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.8377358490566038,\n \"acc_stderr\": 0.022691482872035377,\n\
\ \"acc_norm\": 0.8377358490566038,\n \"acc_norm_stderr\": 0.022691482872035377\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.8055555555555556,\n\
\ \"acc_stderr\": 0.03309615177059006,\n \"acc_norm\": 0.8055555555555556,\n\
\ \"acc_norm_stderr\": 0.03309615177059006\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.55,\n \"acc_stderr\": 0.049999999999999996,\n \
\ \"acc_norm\": 0.55,\n \"acc_norm_stderr\": 0.049999999999999996\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"\
acc\": 0.68,\n \"acc_stderr\": 0.046882617226215034,\n \"acc_norm\"\
: 0.68,\n \"acc_norm_stderr\": 0.046882617226215034\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.58,\n \"acc_stderr\": 0.049604496374885836,\n \
\ \"acc_norm\": 0.58,\n \"acc_norm_stderr\": 0.049604496374885836\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.7456647398843931,\n\
\ \"acc_stderr\": 0.0332055644308557,\n \"acc_norm\": 0.7456647398843931,\n\
\ \"acc_norm_stderr\": 0.0332055644308557\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.5098039215686274,\n \"acc_stderr\": 0.04974229460422817,\n\
\ \"acc_norm\": 0.5098039215686274,\n \"acc_norm_stderr\": 0.04974229460422817\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.85,\n \"acc_stderr\": 0.0358870281282637,\n \"acc_norm\": 0.85,\n\
\ \"acc_norm_stderr\": 0.0358870281282637\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.7063829787234043,\n \"acc_stderr\": 0.02977164271249123,\n\
\ \"acc_norm\": 0.7063829787234043,\n \"acc_norm_stderr\": 0.02977164271249123\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.5087719298245614,\n\
\ \"acc_stderr\": 0.04702880432049615,\n \"acc_norm\": 0.5087719298245614,\n\
\ \"acc_norm_stderr\": 0.04702880432049615\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.7172413793103448,\n \"acc_stderr\": 0.03752833958003336,\n\
\ \"acc_norm\": 0.7172413793103448,\n \"acc_norm_stderr\": 0.03752833958003336\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.5079365079365079,\n \"acc_stderr\": 0.02574806587167329,\n \"\
acc_norm\": 0.5079365079365079,\n \"acc_norm_stderr\": 0.02574806587167329\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.5238095238095238,\n\
\ \"acc_stderr\": 0.04467062628403273,\n \"acc_norm\": 0.5238095238095238,\n\
\ \"acc_norm_stderr\": 0.04467062628403273\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.53,\n \"acc_stderr\": 0.050161355804659205,\n \
\ \"acc_norm\": 0.53,\n \"acc_norm_stderr\": 0.050161355804659205\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\"\
: 0.8032258064516129,\n \"acc_stderr\": 0.022616409420742025,\n \"\
acc_norm\": 0.8032258064516129,\n \"acc_norm_stderr\": 0.022616409420742025\n\
\ },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\"\
: 0.6551724137931034,\n \"acc_stderr\": 0.033442837442804574,\n \"\
acc_norm\": 0.6551724137931034,\n \"acc_norm_stderr\": 0.033442837442804574\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.7,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\"\
: 0.7,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.8787878787878788,\n \"acc_stderr\": 0.025485498373343237,\n\
\ \"acc_norm\": 0.8787878787878788,\n \"acc_norm_stderr\": 0.025485498373343237\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.8939393939393939,\n \"acc_stderr\": 0.02193804773885313,\n \"\
acc_norm\": 0.8939393939393939,\n \"acc_norm_stderr\": 0.02193804773885313\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.9067357512953368,\n \"acc_stderr\": 0.0209868545932897,\n\
\ \"acc_norm\": 0.9067357512953368,\n \"acc_norm_stderr\": 0.0209868545932897\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.7153846153846154,\n \"acc_stderr\": 0.0228783227997063,\n \
\ \"acc_norm\": 0.7153846153846154,\n \"acc_norm_stderr\": 0.0228783227997063\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.45185185185185184,\n \"acc_stderr\": 0.030343862998512626,\n \
\ \"acc_norm\": 0.45185185185185184,\n \"acc_norm_stderr\": 0.030343862998512626\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.8025210084033614,\n \"acc_stderr\": 0.02585916412205146,\n \
\ \"acc_norm\": 0.8025210084033614,\n \"acc_norm_stderr\": 0.02585916412205146\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.4966887417218543,\n \"acc_stderr\": 0.04082393379449654,\n \"\
acc_norm\": 0.4966887417218543,\n \"acc_norm_stderr\": 0.04082393379449654\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8954128440366973,\n \"acc_stderr\": 0.013120530245265594,\n \"\
acc_norm\": 0.8954128440366973,\n \"acc_norm_stderr\": 0.013120530245265594\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.6666666666666666,\n \"acc_stderr\": 0.032149521478027486,\n \"\
acc_norm\": 0.6666666666666666,\n \"acc_norm_stderr\": 0.032149521478027486\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.9264705882352942,\n \"acc_stderr\": 0.018318855850089674,\n \"\
acc_norm\": 0.9264705882352942,\n \"acc_norm_stderr\": 0.018318855850089674\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.8818565400843882,\n \"acc_stderr\": 0.021011052659878453,\n \
\ \"acc_norm\": 0.8818565400843882,\n \"acc_norm_stderr\": 0.021011052659878453\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.8026905829596412,\n\
\ \"acc_stderr\": 0.02670985334496796,\n \"acc_norm\": 0.8026905829596412,\n\
\ \"acc_norm_stderr\": 0.02670985334496796\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.8396946564885496,\n \"acc_stderr\": 0.03217829420744633,\n\
\ \"acc_norm\": 0.8396946564885496,\n \"acc_norm_stderr\": 0.03217829420744633\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.8512396694214877,\n \"acc_stderr\": 0.03248470083807195,\n \"\
acc_norm\": 0.8512396694214877,\n \"acc_norm_stderr\": 0.03248470083807195\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.8611111111111112,\n\
\ \"acc_stderr\": 0.033432700628696195,\n \"acc_norm\": 0.8611111111111112,\n\
\ \"acc_norm_stderr\": 0.033432700628696195\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.8895705521472392,\n \"acc_stderr\": 0.024624937788941318,\n\
\ \"acc_norm\": 0.8895705521472392,\n \"acc_norm_stderr\": 0.024624937788941318\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.6339285714285714,\n\
\ \"acc_stderr\": 0.04572372358737432,\n \"acc_norm\": 0.6339285714285714,\n\
\ \"acc_norm_stderr\": 0.04572372358737432\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.8446601941747572,\n \"acc_stderr\": 0.03586594738573974,\n\
\ \"acc_norm\": 0.8446601941747572,\n \"acc_norm_stderr\": 0.03586594738573974\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.9316239316239316,\n\
\ \"acc_stderr\": 0.016534627684311368,\n \"acc_norm\": 0.9316239316239316,\n\
\ \"acc_norm_stderr\": 0.016534627684311368\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.82,\n \"acc_stderr\": 0.038612291966536955,\n \
\ \"acc_norm\": 0.82,\n \"acc_norm_stderr\": 0.038612291966536955\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.876117496807152,\n\
\ \"acc_stderr\": 0.011781017100950735,\n \"acc_norm\": 0.876117496807152,\n\
\ \"acc_norm_stderr\": 0.011781017100950735\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.7890173410404624,\n \"acc_stderr\": 0.021966309947043107,\n\
\ \"acc_norm\": 0.7890173410404624,\n \"acc_norm_stderr\": 0.021966309947043107\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.6212290502793296,\n\
\ \"acc_stderr\": 0.01622353351036513,\n \"acc_norm\": 0.6212290502793296,\n\
\ \"acc_norm_stderr\": 0.01622353351036513\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.8071895424836601,\n \"acc_stderr\": 0.022589318888176696,\n\
\ \"acc_norm\": 0.8071895424836601,\n \"acc_norm_stderr\": 0.022589318888176696\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.8135048231511254,\n\
\ \"acc_stderr\": 0.022122439772480785,\n \"acc_norm\": 0.8135048231511254,\n\
\ \"acc_norm_stderr\": 0.022122439772480785\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.8302469135802469,\n \"acc_stderr\": 0.020888690414093882,\n\
\ \"acc_norm\": 0.8302469135802469,\n \"acc_norm_stderr\": 0.020888690414093882\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.6205673758865248,\n \"acc_stderr\": 0.0289473388516141,\n \
\ \"acc_norm\": 0.6205673758865248,\n \"acc_norm_stderr\": 0.0289473388516141\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.6375488917861799,\n\
\ \"acc_stderr\": 0.012277512533252499,\n \"acc_norm\": 0.6375488917861799,\n\
\ \"acc_norm_stderr\": 0.012277512533252499\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.8088235294117647,\n \"acc_stderr\": 0.023886881922440335,\n\
\ \"acc_norm\": 0.8088235294117647,\n \"acc_norm_stderr\": 0.023886881922440335\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.7875816993464052,\n \"acc_stderr\": 0.016547148636203147,\n \
\ \"acc_norm\": 0.7875816993464052,\n \"acc_norm_stderr\": 0.016547148636203147\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.8090909090909091,\n\
\ \"acc_stderr\": 0.03764425585984927,\n \"acc_norm\": 0.8090909090909091,\n\
\ \"acc_norm_stderr\": 0.03764425585984927\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.7510204081632653,\n \"acc_stderr\": 0.027682979522960227,\n\
\ \"acc_norm\": 0.7510204081632653,\n \"acc_norm_stderr\": 0.027682979522960227\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.9054726368159204,\n\
\ \"acc_stderr\": 0.02068718695153409,\n \"acc_norm\": 0.9054726368159204,\n\
\ \"acc_norm_stderr\": 0.02068718695153409\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.91,\n \"acc_stderr\": 0.028762349126466115,\n \
\ \"acc_norm\": 0.91,\n \"acc_norm_stderr\": 0.028762349126466115\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.7469879518072289,\n\
\ \"acc_stderr\": 0.033844291552331464,\n \"acc_norm\": 0.7469879518072289,\n\
\ \"acc_norm_stderr\": 0.033844291552331464\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.9064327485380117,\n \"acc_stderr\": 0.02233599323116327,\n\
\ \"acc_norm\": 0.9064327485380117,\n \"acc_norm_stderr\": 0.02233599323116327\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.3537331701346389,\n\
\ \"mc1_stderr\": 0.016737814358846147,\n \"mc2\": 0.5139290818009655,\n\
\ \"mc2_stderr\": 0.015442752112557084\n }\n}\n```"
repo_url: https://huggingface.co/JoSw-14/LoKuS-13B
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_08_30T14_44_55.625857
path:
- '**/details_harness|arc:challenge|25_2023-08-30T14:44:55.625857.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-08-30T14:44:55.625857.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_08_30T14_44_55.625857
path:
- '**/details_harness|hellaswag|10_2023-08-30T14:44:55.625857.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-08-30T14:44:55.625857.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_08_30T14_44_55.625857
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-30T14:44:55.625857.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-30T14:44:55.625857.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-30T14:44:55.625857.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-30T14:44:55.625857.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-30T14:44:55.625857.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-30T14:44:55.625857.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-30T14:44:55.625857.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-30T14:44:55.625857.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-30T14:44:55.625857.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-30T14:44:55.625857.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-30T14:44:55.625857.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-30T14:44:55.625857.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-30T14:44:55.625857.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-30T14:44:55.625857.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-30T14:44:55.625857.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-30T14:44:55.625857.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-30T14:44:55.625857.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-30T14:44:55.625857.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-30T14:44:55.625857.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-30T14:44:55.625857.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-30T14:44:55.625857.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-30T14:44:55.625857.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-30T14:44:55.625857.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-30T14:44:55.625857.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-30T14:44:55.625857.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-30T14:44:55.625857.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-30T14:44:55.625857.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-30T14:44:55.625857.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-30T14:44:55.625857.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-30T14:44:55.625857.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-30T14:44:55.625857.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-30T14:44:55.625857.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-30T14:44:55.625857.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-30T14:44:55.625857.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-30T14:44:55.625857.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-30T14:44:55.625857.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-30T14:44:55.625857.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-30T14:44:55.625857.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-30T14:44:55.625857.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-30T14:44:55.625857.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-30T14:44:55.625857.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-30T14:44:55.625857.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-30T14:44:55.625857.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-30T14:44:55.625857.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-30T14:44:55.625857.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-30T14:44:55.625857.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-30T14:44:55.625857.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-30T14:44:55.625857.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-30T14:44:55.625857.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-30T14:44:55.625857.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-30T14:44:55.625857.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-30T14:44:55.625857.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-30T14:44:55.625857.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-30T14:44:55.625857.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-30T14:44:55.625857.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-30T14:44:55.625857.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-30T14:44:55.625857.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-30T14:44:55.625857.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-30T14:44:55.625857.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-30T14:44:55.625857.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-30T14:44:55.625857.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-30T14:44:55.625857.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-30T14:44:55.625857.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-30T14:44:55.625857.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-30T14:44:55.625857.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-30T14:44:55.625857.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-30T14:44:55.625857.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-30T14:44:55.625857.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-30T14:44:55.625857.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-30T14:44:55.625857.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-30T14:44:55.625857.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-30T14:44:55.625857.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-30T14:44:55.625857.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-30T14:44:55.625857.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-30T14:44:55.625857.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-30T14:44:55.625857.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-30T14:44:55.625857.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-30T14:44:55.625857.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-30T14:44:55.625857.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-30T14:44:55.625857.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-30T14:44:55.625857.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-30T14:44:55.625857.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-30T14:44:55.625857.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-30T14:44:55.625857.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-30T14:44:55.625857.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-30T14:44:55.625857.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-30T14:44:55.625857.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-30T14:44:55.625857.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-30T14:44:55.625857.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-30T14:44:55.625857.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-30T14:44:55.625857.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-30T14:44:55.625857.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-30T14:44:55.625857.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-30T14:44:55.625857.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-30T14:44:55.625857.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-30T14:44:55.625857.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-30T14:44:55.625857.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-30T14:44:55.625857.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-30T14:44:55.625857.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-30T14:44:55.625857.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-30T14:44:55.625857.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-30T14:44:55.625857.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-30T14:44:55.625857.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-30T14:44:55.625857.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-30T14:44:55.625857.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-30T14:44:55.625857.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-30T14:44:55.625857.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-30T14:44:55.625857.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-30T14:44:55.625857.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-30T14:44:55.625857.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-30T14:44:55.625857.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-30T14:44:55.625857.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-30T14:44:55.625857.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-30T14:44:55.625857.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_08_30T14_44_55.625857
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-30T14:44:55.625857.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-30T14:44:55.625857.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_08_30T14_44_55.625857
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-30T14:44:55.625857.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-30T14:44:55.625857.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_08_30T14_44_55.625857
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-30T14:44:55.625857.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-30T14:44:55.625857.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_08_30T14_44_55.625857
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-30T14:44:55.625857.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-30T14:44:55.625857.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_08_30T14_44_55.625857
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-30T14:44:55.625857.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-30T14:44:55.625857.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_08_30T14_44_55.625857
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-30T14:44:55.625857.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-30T14:44:55.625857.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_08_30T14_44_55.625857
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-30T14:44:55.625857.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-30T14:44:55.625857.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_08_30T14_44_55.625857
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-30T14:44:55.625857.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-30T14:44:55.625857.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_08_30T14_44_55.625857
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-30T14:44:55.625857.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-30T14:44:55.625857.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_08_30T14_44_55.625857
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-30T14:44:55.625857.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-30T14:44:55.625857.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_08_30T14_44_55.625857
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-30T14:44:55.625857.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-30T14:44:55.625857.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_08_30T14_44_55.625857
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-30T14:44:55.625857.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-30T14:44:55.625857.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_08_30T14_44_55.625857
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-30T14:44:55.625857.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-30T14:44:55.625857.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_08_30T14_44_55.625857
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-30T14:44:55.625857.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-30T14:44:55.625857.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_08_30T14_44_55.625857
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-30T14:44:55.625857.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-30T14:44:55.625857.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_08_30T14_44_55.625857
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-30T14:44:55.625857.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-30T14:44:55.625857.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_08_30T14_44_55.625857
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-30T14:44:55.625857.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-30T14:44:55.625857.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_08_30T14_44_55.625857
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-30T14:44:55.625857.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-30T14:44:55.625857.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_08_30T14_44_55.625857
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-30T14:44:55.625857.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-30T14:44:55.625857.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_08_30T14_44_55.625857
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-30T14:44:55.625857.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-30T14:44:55.625857.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_08_30T14_44_55.625857
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-30T14:44:55.625857.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-30T14:44:55.625857.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_08_30T14_44_55.625857
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-30T14:44:55.625857.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-30T14:44:55.625857.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_08_30T14_44_55.625857
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-30T14:44:55.625857.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-30T14:44:55.625857.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_08_30T14_44_55.625857
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-30T14:44:55.625857.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-30T14:44:55.625857.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_08_30T14_44_55.625857
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-30T14:44:55.625857.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-30T14:44:55.625857.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_08_30T14_44_55.625857
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-30T14:44:55.625857.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-30T14:44:55.625857.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_08_30T14_44_55.625857
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-30T14:44:55.625857.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-30T14:44:55.625857.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_08_30T14_44_55.625857
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-30T14:44:55.625857.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-30T14:44:55.625857.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_08_30T14_44_55.625857
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-30T14:44:55.625857.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-30T14:44:55.625857.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_08_30T14_44_55.625857
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-30T14:44:55.625857.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-30T14:44:55.625857.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_08_30T14_44_55.625857
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-30T14:44:55.625857.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-30T14:44:55.625857.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_08_30T14_44_55.625857
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-30T14:44:55.625857.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-30T14:44:55.625857.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_08_30T14_44_55.625857
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-30T14:44:55.625857.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-30T14:44:55.625857.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_08_30T14_44_55.625857
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-30T14:44:55.625857.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-30T14:44:55.625857.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_08_30T14_44_55.625857
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-30T14:44:55.625857.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-30T14:44:55.625857.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_08_30T14_44_55.625857
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-30T14:44:55.625857.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-30T14:44:55.625857.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_08_30T14_44_55.625857
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-30T14:44:55.625857.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-30T14:44:55.625857.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_08_30T14_44_55.625857
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-30T14:44:55.625857.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-30T14:44:55.625857.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_08_30T14_44_55.625857
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-30T14:44:55.625857.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-30T14:44:55.625857.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_08_30T14_44_55.625857
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-30T14:44:55.625857.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-30T14:44:55.625857.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_08_30T14_44_55.625857
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-30T14:44:55.625857.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-30T14:44:55.625857.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_08_30T14_44_55.625857
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-30T14:44:55.625857.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-30T14:44:55.625857.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_08_30T14_44_55.625857
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-30T14:44:55.625857.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-30T14:44:55.625857.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_08_30T14_44_55.625857
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-30T14:44:55.625857.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-30T14:44:55.625857.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_08_30T14_44_55.625857
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-30T14:44:55.625857.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-30T14:44:55.625857.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_08_30T14_44_55.625857
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-30T14:44:55.625857.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-30T14:44:55.625857.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_08_30T14_44_55.625857
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-30T14:44:55.625857.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-30T14:44:55.625857.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_08_30T14_44_55.625857
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-30T14:44:55.625857.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-30T14:44:55.625857.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_08_30T14_44_55.625857
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-30T14:44:55.625857.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-30T14:44:55.625857.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_08_30T14_44_55.625857
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-30T14:44:55.625857.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-30T14:44:55.625857.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_08_30T14_44_55.625857
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-30T14:44:55.625857.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-30T14:44:55.625857.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_08_30T14_44_55.625857
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-30T14:44:55.625857.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-30T14:44:55.625857.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_08_30T14_44_55.625857
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-30T14:44:55.625857.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-30T14:44:55.625857.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_08_30T14_44_55.625857
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-30T14:44:55.625857.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-30T14:44:55.625857.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_08_30T14_44_55.625857
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-30T14:44:55.625857.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-30T14:44:55.625857.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_08_30T14_44_55.625857
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-30T14:44:55.625857.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-30T14:44:55.625857.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_08_30T14_44_55.625857
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-30T14:44:55.625857.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-30T14:44:55.625857.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_08_30T14_44_55.625857
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-30T14:44:55.625857.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-30T14:44:55.625857.parquet'
- config_name: results
data_files:
- split: 2023_08_30T14_44_55.625857
path:
- results_2023-08-30T14:44:55.625857.parquet
- split: latest
path:
- results_2023-08-30T14:44:55.625857.parquet
---
# Dataset Card for Evaluation run of JoSw-14/LoKuS-13B
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/JoSw-14/LoKuS-13B
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [JoSw-14/LoKuS-13B](https://huggingface.co/JoSw-14/LoKuS-13B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_JoSw-14__LoKuS-13B",
"harness_truthfulqa_mc_0",
split="train")
```
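For example, to pull the aggregated metrics rather than the per-sample details, you can load the "results" configuration declared in this card's header (a minimal sketch, assuming the `datasets` library is installed; the "results" config and its "latest" split are both listed in the YAML above):
```python
from datasets import load_dataset

# Aggregated metrics for the most recent run; the "results" config and
# its "latest" split are declared in the YAML header of this card.
results = load_dataset(
    "open-llm-leaderboard/details_JoSw-14__LoKuS-13B",
    "results",
    split="latest",
)
print(results[0])  # a single row holding the run's aggregated scores
```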
## Latest results
These are the [latest results from run 2023-08-30T14:44:55.625857](https://huggingface.co/datasets/open-llm-leaderboard/details_JoSw-14__LoKuS-13B/blob/main/results_2023-08-30T14%3A44%3A55.625857.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.7392017775637323,
"acc_stderr": 0.02998325172953868,
"acc_norm": 0.7430126648780258,
"acc_norm_stderr": 0.02996682963285512,
"mc1": 0.3537331701346389,
"mc1_stderr": 0.016737814358846147,
"mc2": 0.5139290818009655,
"mc2_stderr": 0.015442752112557084
},
"harness|arc:challenge|25": {
"acc": 0.5418088737201365,
"acc_stderr": 0.0145602203087147,
"acc_norm": 0.5776450511945392,
"acc_norm_stderr": 0.014434138713379986
},
"harness|hellaswag|10": {
"acc": 0.6050587532364071,
"acc_stderr": 0.004878390226591711,
"acc_norm": 0.7940649273053176,
"acc_norm_stderr": 0.004035568117596522
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.38,
"acc_stderr": 0.048783173121456316,
"acc_norm": 0.38,
"acc_norm_stderr": 0.048783173121456316
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.7333333333333333,
"acc_stderr": 0.038201699145179055,
"acc_norm": 0.7333333333333333,
"acc_norm_stderr": 0.038201699145179055
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.7697368421052632,
"acc_stderr": 0.034260594244031654,
"acc_norm": 0.7697368421052632,
"acc_norm_stderr": 0.034260594244031654
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.76,
"acc_stderr": 0.042923469599092816,
"acc_norm": 0.76,
"acc_norm_stderr": 0.042923469599092816
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.8377358490566038,
"acc_stderr": 0.022691482872035377,
"acc_norm": 0.8377358490566038,
"acc_norm_stderr": 0.022691482872035377
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.8055555555555556,
"acc_stderr": 0.03309615177059006,
"acc_norm": 0.8055555555555556,
"acc_norm_stderr": 0.03309615177059006
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.55,
"acc_stderr": 0.049999999999999996,
"acc_norm": 0.55,
"acc_norm_stderr": 0.049999999999999996
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.68,
"acc_stderr": 0.046882617226215034,
"acc_norm": 0.68,
"acc_norm_stderr": 0.046882617226215034
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.58,
"acc_stderr": 0.049604496374885836,
"acc_norm": 0.58,
"acc_norm_stderr": 0.049604496374885836
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.7456647398843931,
"acc_stderr": 0.0332055644308557,
"acc_norm": 0.7456647398843931,
"acc_norm_stderr": 0.0332055644308557
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.5098039215686274,
"acc_stderr": 0.04974229460422817,
"acc_norm": 0.5098039215686274,
"acc_norm_stderr": 0.04974229460422817
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.85,
"acc_stderr": 0.0358870281282637,
"acc_norm": 0.85,
"acc_norm_stderr": 0.0358870281282637
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.7063829787234043,
"acc_stderr": 0.02977164271249123,
"acc_norm": 0.7063829787234043,
"acc_norm_stderr": 0.02977164271249123
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.5087719298245614,
"acc_stderr": 0.04702880432049615,
"acc_norm": 0.5087719298245614,
"acc_norm_stderr": 0.04702880432049615
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.7172413793103448,
"acc_stderr": 0.03752833958003336,
"acc_norm": 0.7172413793103448,
"acc_norm_stderr": 0.03752833958003336
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.5079365079365079,
"acc_stderr": 0.02574806587167329,
"acc_norm": 0.5079365079365079,
"acc_norm_stderr": 0.02574806587167329
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.5238095238095238,
"acc_stderr": 0.04467062628403273,
"acc_norm": 0.5238095238095238,
"acc_norm_stderr": 0.04467062628403273
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.53,
"acc_stderr": 0.050161355804659205,
"acc_norm": 0.53,
"acc_norm_stderr": 0.050161355804659205
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.8032258064516129,
"acc_stderr": 0.022616409420742025,
"acc_norm": 0.8032258064516129,
"acc_norm_stderr": 0.022616409420742025
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.6551724137931034,
"acc_stderr": 0.033442837442804574,
"acc_norm": 0.6551724137931034,
"acc_norm_stderr": 0.033442837442804574
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.7,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.7,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.8787878787878788,
"acc_stderr": 0.025485498373343237,
"acc_norm": 0.8787878787878788,
"acc_norm_stderr": 0.025485498373343237
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.8939393939393939,
"acc_stderr": 0.02193804773885313,
"acc_norm": 0.8939393939393939,
"acc_norm_stderr": 0.02193804773885313
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.9067357512953368,
"acc_stderr": 0.0209868545932897,
"acc_norm": 0.9067357512953368,
"acc_norm_stderr": 0.0209868545932897
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.7153846153846154,
"acc_stderr": 0.0228783227997063,
"acc_norm": 0.7153846153846154,
"acc_norm_stderr": 0.0228783227997063
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.45185185185185184,
"acc_stderr": 0.030343862998512626,
"acc_norm": 0.45185185185185184,
"acc_norm_stderr": 0.030343862998512626
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.8025210084033614,
"acc_stderr": 0.02585916412205146,
"acc_norm": 0.8025210084033614,
"acc_norm_stderr": 0.02585916412205146
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.4966887417218543,
"acc_stderr": 0.04082393379449654,
"acc_norm": 0.4966887417218543,
"acc_norm_stderr": 0.04082393379449654
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8954128440366973,
"acc_stderr": 0.013120530245265594,
"acc_norm": 0.8954128440366973,
"acc_norm_stderr": 0.013120530245265594
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.6666666666666666,
"acc_stderr": 0.032149521478027486,
"acc_norm": 0.6666666666666666,
"acc_norm_stderr": 0.032149521478027486
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.9264705882352942,
"acc_stderr": 0.018318855850089674,
"acc_norm": 0.9264705882352942,
"acc_norm_stderr": 0.018318855850089674
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.8818565400843882,
"acc_stderr": 0.021011052659878453,
"acc_norm": 0.8818565400843882,
"acc_norm_stderr": 0.021011052659878453
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.8026905829596412,
"acc_stderr": 0.02670985334496796,
"acc_norm": 0.8026905829596412,
"acc_norm_stderr": 0.02670985334496796
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.8396946564885496,
"acc_stderr": 0.03217829420744633,
"acc_norm": 0.8396946564885496,
"acc_norm_stderr": 0.03217829420744633
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.8512396694214877,
"acc_stderr": 0.03248470083807195,
"acc_norm": 0.8512396694214877,
"acc_norm_stderr": 0.03248470083807195
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.8611111111111112,
"acc_stderr": 0.033432700628696195,
"acc_norm": 0.8611111111111112,
"acc_norm_stderr": 0.033432700628696195
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.8895705521472392,
"acc_stderr": 0.024624937788941318,
"acc_norm": 0.8895705521472392,
"acc_norm_stderr": 0.024624937788941318
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.6339285714285714,
"acc_stderr": 0.04572372358737432,
"acc_norm": 0.6339285714285714,
"acc_norm_stderr": 0.04572372358737432
},
"harness|hendrycksTest-management|5": {
"acc": 0.8446601941747572,
"acc_stderr": 0.03586594738573974,
"acc_norm": 0.8446601941747572,
"acc_norm_stderr": 0.03586594738573974
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.9316239316239316,
"acc_stderr": 0.016534627684311368,
"acc_norm": 0.9316239316239316,
"acc_norm_stderr": 0.016534627684311368
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.82,
"acc_stderr": 0.038612291966536955,
"acc_norm": 0.82,
"acc_norm_stderr": 0.038612291966536955
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.876117496807152,
"acc_stderr": 0.011781017100950735,
"acc_norm": 0.876117496807152,
"acc_norm_stderr": 0.011781017100950735
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7890173410404624,
"acc_stderr": 0.021966309947043107,
"acc_norm": 0.7890173410404624,
"acc_norm_stderr": 0.021966309947043107
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.6212290502793296,
"acc_stderr": 0.01622353351036513,
"acc_norm": 0.6212290502793296,
"acc_norm_stderr": 0.01622353351036513
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.8071895424836601,
"acc_stderr": 0.022589318888176696,
"acc_norm": 0.8071895424836601,
"acc_norm_stderr": 0.022589318888176696
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.8135048231511254,
"acc_stderr": 0.022122439772480785,
"acc_norm": 0.8135048231511254,
"acc_norm_stderr": 0.022122439772480785
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.8302469135802469,
"acc_stderr": 0.020888690414093882,
"acc_norm": 0.8302469135802469,
"acc_norm_stderr": 0.020888690414093882
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.6205673758865248,
"acc_stderr": 0.0289473388516141,
"acc_norm": 0.6205673758865248,
"acc_norm_stderr": 0.0289473388516141
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.6375488917861799,
"acc_stderr": 0.012277512533252499,
"acc_norm": 0.6375488917861799,
"acc_norm_stderr": 0.012277512533252499
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.8088235294117647,
"acc_stderr": 0.023886881922440335,
"acc_norm": 0.8088235294117647,
"acc_norm_stderr": 0.023886881922440335
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.7875816993464052,
"acc_stderr": 0.016547148636203147,
"acc_norm": 0.7875816993464052,
"acc_norm_stderr": 0.016547148636203147
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.8090909090909091,
"acc_stderr": 0.03764425585984927,
"acc_norm": 0.8090909090909091,
"acc_norm_stderr": 0.03764425585984927
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7510204081632653,
"acc_stderr": 0.027682979522960227,
"acc_norm": 0.7510204081632653,
"acc_norm_stderr": 0.027682979522960227
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.9054726368159204,
"acc_stderr": 0.02068718695153409,
"acc_norm": 0.9054726368159204,
"acc_norm_stderr": 0.02068718695153409
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.91,
"acc_stderr": 0.028762349126466115,
"acc_norm": 0.91,
"acc_norm_stderr": 0.028762349126466115
},
"harness|hendrycksTest-virology|5": {
"acc": 0.7469879518072289,
"acc_stderr": 0.033844291552331464,
"acc_norm": 0.7469879518072289,
"acc_norm_stderr": 0.033844291552331464
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.9064327485380117,
"acc_stderr": 0.02233599323116327,
"acc_norm": 0.9064327485380117,
"acc_norm_stderr": 0.02233599323116327
},
"harness|truthfulqa:mc|0": {
"mc1": 0.3537331701346389,
"mc1_stderr": 0.016737814358846147,
"mc2": 0.5139290818009655,
"mc2_stderr": 0.015442752112557084
}
}
```
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
vraman54/HealthInsuranceDataset | ---
license: apache-2.0
---
|
AIRI-NLP/quality_counter_new_4608 | ---
dataset_info:
features:
- name: context
dtype: string
- name: word
dtype: string
- name: claim
dtype: string
- name: label
dtype: int64
splits:
- name: train
num_bytes: 553222362
num_examples: 20000
- name: validation
num_bytes: 222944090
num_examples: 8000
- name: test
num_bytes: 56238328
num_examples: 2300
download_size: 26388335
dataset_size: 832404780
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
- split: test
path: data/test-*
---
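Given the schema above, a minimal sketch of loading this dataset (assuming the default config; field names follow the `dataset_info` listed here):

```python
from datasets import load_dataset

# Load the train split; per the dataset_info above, each row carries
# string "context", "word" and "claim" fields plus an integer "label".
ds = load_dataset("AIRI-NLP/quality_counter_new_4608", split="train")
print(ds.features)
print(ds[0]["claim"], ds[0]["label"])
```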
|
open-llm-leaderboard/details_elliotthwang__elliott_Llama-2-7b-hf | ---
pretty_name: Evaluation run of elliotthwang/elliott_Llama-2-7b-hf
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [elliotthwang/elliott_Llama-2-7b-hf](https://huggingface.co/elliotthwang/elliott_Llama-2-7b-hf)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 64 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 2 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_elliotthwang__elliott_Llama-2-7b-hf\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-10-26T23:30:10.386120](https://huggingface.co/datasets/open-llm-leaderboard/details_elliotthwang__elliott_Llama-2-7b-hf/blob/main/results_2023-10-26T23-30-10.386120.json) (note\
\ that there might be results for other tasks in the repo if successive evals didn't\
\ cover the same tasks; you can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.001153523489932886,\n\
\ \"em_stderr\": 0.00034761798968571027,\n \"f1\": 0.05575817953020141,\n\
\ \"f1_stderr\": 0.001306153544964195,\n \"acc\": 0.4026884110741377,\n\
\ \"acc_stderr\": 0.009681922567248534\n },\n \"harness|drop|3\": {\n\
\ \"em\": 0.001153523489932886,\n \"em_stderr\": 0.00034761798968571027,\n\
\ \"f1\": 0.05575817953020141,\n \"f1_stderr\": 0.001306153544964195\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.06899166034874905,\n \
\ \"acc_stderr\": 0.006980995834838602\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7363851617995264,\n \"acc_stderr\": 0.012382849299658464\n\
\ }\n}\n```"
repo_url: https://huggingface.co/elliotthwang/elliott_Llama-2-7b-hf
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_10_09T04_04_19.372525
path:
- '**/details_harness|arc:challenge|25_2023-10-09T04-04-19.372525.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-10-09T04-04-19.372525.parquet'
- config_name: harness_drop_3
data_files:
- split: 2023_10_26T23_30_10.386120
path:
- '**/details_harness|drop|3_2023-10-26T23-30-10.386120.parquet'
- split: latest
path:
- '**/details_harness|drop|3_2023-10-26T23-30-10.386120.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_10_26T23_30_10.386120
path:
- '**/details_harness|gsm8k|5_2023-10-26T23-30-10.386120.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-10-26T23-30-10.386120.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_10_09T04_04_19.372525
path:
- '**/details_harness|hellaswag|10_2023-10-09T04-04-19.372525.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-10-09T04-04-19.372525.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_10_09T04_04_19.372525
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-09T04-04-19.372525.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-09T04-04-19.372525.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-09T04-04-19.372525.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-09T04-04-19.372525.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-09T04-04-19.372525.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-09T04-04-19.372525.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-09T04-04-19.372525.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-09T04-04-19.372525.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-09T04-04-19.372525.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-09T04-04-19.372525.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-09T04-04-19.372525.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-09T04-04-19.372525.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-09T04-04-19.372525.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-09T04-04-19.372525.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-09T04-04-19.372525.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-09T04-04-19.372525.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-09T04-04-19.372525.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-09T04-04-19.372525.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-09T04-04-19.372525.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-09T04-04-19.372525.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-09T04-04-19.372525.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-09T04-04-19.372525.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-09T04-04-19.372525.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-09T04-04-19.372525.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-09T04-04-19.372525.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-09T04-04-19.372525.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-09T04-04-19.372525.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-09T04-04-19.372525.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-09T04-04-19.372525.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-09T04-04-19.372525.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-09T04-04-19.372525.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-09T04-04-19.372525.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-09T04-04-19.372525.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-09T04-04-19.372525.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-10-09T04-04-19.372525.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-09T04-04-19.372525.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-09T04-04-19.372525.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-09T04-04-19.372525.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-10-09T04-04-19.372525.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-10-09T04-04-19.372525.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-09T04-04-19.372525.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-09T04-04-19.372525.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-09T04-04-19.372525.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-09T04-04-19.372525.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-09T04-04-19.372525.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-09T04-04-19.372525.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-09T04-04-19.372525.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-09T04-04-19.372525.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-09T04-04-19.372525.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-09T04-04-19.372525.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-09T04-04-19.372525.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-09T04-04-19.372525.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-09T04-04-19.372525.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-10-09T04-04-19.372525.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-09T04-04-19.372525.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-10-09T04-04-19.372525.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-09T04-04-19.372525.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-09T04-04-19.372525.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-09T04-04-19.372525.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-09T04-04-19.372525.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-09T04-04-19.372525.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-09T04-04-19.372525.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-09T04-04-19.372525.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-09T04-04-19.372525.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-09T04-04-19.372525.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-09T04-04-19.372525.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-09T04-04-19.372525.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-09T04-04-19.372525.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-09T04-04-19.372525.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-09T04-04-19.372525.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-09T04-04-19.372525.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-09T04-04-19.372525.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-09T04-04-19.372525.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-09T04-04-19.372525.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-09T04-04-19.372525.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-09T04-04-19.372525.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-09T04-04-19.372525.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-09T04-04-19.372525.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-09T04-04-19.372525.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-09T04-04-19.372525.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-09T04-04-19.372525.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-09T04-04-19.372525.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-09T04-04-19.372525.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-09T04-04-19.372525.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-09T04-04-19.372525.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-09T04-04-19.372525.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-09T04-04-19.372525.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-09T04-04-19.372525.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-09T04-04-19.372525.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-09T04-04-19.372525.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-09T04-04-19.372525.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-10-09T04-04-19.372525.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-09T04-04-19.372525.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-09T04-04-19.372525.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-09T04-04-19.372525.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-10-09T04-04-19.372525.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-10-09T04-04-19.372525.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-09T04-04-19.372525.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-09T04-04-19.372525.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-09T04-04-19.372525.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-09T04-04-19.372525.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-09T04-04-19.372525.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-09T04-04-19.372525.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-09T04-04-19.372525.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-09T04-04-19.372525.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-09T04-04-19.372525.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-09T04-04-19.372525.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-09T04-04-19.372525.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-09T04-04-19.372525.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-09T04-04-19.372525.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-10-09T04-04-19.372525.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-09T04-04-19.372525.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-10-09T04-04-19.372525.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-09T04-04-19.372525.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_10_09T04_04_19.372525
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-09T04-04-19.372525.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-09T04-04-19.372525.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_10_09T04_04_19.372525
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-09T04-04-19.372525.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-09T04-04-19.372525.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_10_09T04_04_19.372525
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-09T04-04-19.372525.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-09T04-04-19.372525.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_10_09T04_04_19.372525
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-09T04-04-19.372525.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-09T04-04-19.372525.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_10_09T04_04_19.372525
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-09T04-04-19.372525.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-09T04-04-19.372525.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_10_09T04_04_19.372525
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-09T04-04-19.372525.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-09T04-04-19.372525.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_10_09T04_04_19.372525
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-09T04-04-19.372525.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-09T04-04-19.372525.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_10_09T04_04_19.372525
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-09T04-04-19.372525.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-09T04-04-19.372525.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_10_09T04_04_19.372525
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-09T04-04-19.372525.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-09T04-04-19.372525.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_10_09T04_04_19.372525
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-09T04-04-19.372525.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-09T04-04-19.372525.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_10_09T04_04_19.372525
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-09T04-04-19.372525.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-09T04-04-19.372525.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_10_09T04_04_19.372525
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-09T04-04-19.372525.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-09T04-04-19.372525.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_10_09T04_04_19.372525
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-09T04-04-19.372525.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-09T04-04-19.372525.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_10_09T04_04_19.372525
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-09T04-04-19.372525.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-09T04-04-19.372525.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_10_09T04_04_19.372525
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-09T04-04-19.372525.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-09T04-04-19.372525.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_10_09T04_04_19.372525
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-09T04-04-19.372525.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-09T04-04-19.372525.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_10_09T04_04_19.372525
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-09T04-04-19.372525.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-09T04-04-19.372525.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_10_09T04_04_19.372525
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-09T04-04-19.372525.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-09T04-04-19.372525.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_10_09T04_04_19.372525
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-09T04-04-19.372525.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-09T04-04-19.372525.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_10_09T04_04_19.372525
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-09T04-04-19.372525.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-09T04-04-19.372525.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_10_09T04_04_19.372525
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-09T04-04-19.372525.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-09T04-04-19.372525.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_10_09T04_04_19.372525
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-09T04-04-19.372525.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-09T04-04-19.372525.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_10_09T04_04_19.372525
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-09T04-04-19.372525.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-09T04-04-19.372525.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_10_09T04_04_19.372525
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-09T04-04-19.372525.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-09T04-04-19.372525.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_10_09T04_04_19.372525
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-09T04-04-19.372525.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-09T04-04-19.372525.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_10_09T04_04_19.372525
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-09T04-04-19.372525.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-09T04-04-19.372525.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_10_09T04_04_19.372525
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-09T04-04-19.372525.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-09T04-04-19.372525.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_10_09T04_04_19.372525
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-09T04-04-19.372525.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-09T04-04-19.372525.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_10_09T04_04_19.372525
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-09T04-04-19.372525.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-09T04-04-19.372525.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_10_09T04_04_19.372525
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-09T04-04-19.372525.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-09T04-04-19.372525.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_10_09T04_04_19.372525
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-09T04-04-19.372525.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-09T04-04-19.372525.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_10_09T04_04_19.372525
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-09T04-04-19.372525.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-09T04-04-19.372525.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_10_09T04_04_19.372525
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-09T04-04-19.372525.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-09T04-04-19.372525.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_10_09T04_04_19.372525
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-09T04-04-19.372525.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-09T04-04-19.372525.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_10_09T04_04_19.372525
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-10-09T04-04-19.372525.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-10-09T04-04-19.372525.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_10_09T04_04_19.372525
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-09T04-04-19.372525.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-09T04-04-19.372525.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_10_09T04_04_19.372525
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-09T04-04-19.372525.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-09T04-04-19.372525.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_10_09T04_04_19.372525
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-09T04-04-19.372525.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-09T04-04-19.372525.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_10_09T04_04_19.372525
path:
- '**/details_harness|hendrycksTest-management|5_2023-10-09T04-04-19.372525.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-10-09T04-04-19.372525.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_10_09T04_04_19.372525
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-10-09T04-04-19.372525.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-10-09T04-04-19.372525.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_10_09T04_04_19.372525
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-09T04-04-19.372525.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-09T04-04-19.372525.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_10_09T04_04_19.372525
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-09T04-04-19.372525.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-09T04-04-19.372525.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_10_09T04_04_19.372525
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-09T04-04-19.372525.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-09T04-04-19.372525.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_10_09T04_04_19.372525
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-09T04-04-19.372525.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-09T04-04-19.372525.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_10_09T04_04_19.372525
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-09T04-04-19.372525.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-09T04-04-19.372525.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_10_09T04_04_19.372525
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-09T04-04-19.372525.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-09T04-04-19.372525.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_10_09T04_04_19.372525
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-09T04-04-19.372525.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-09T04-04-19.372525.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_10_09T04_04_19.372525
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-09T04-04-19.372525.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-09T04-04-19.372525.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_10_09T04_04_19.372525
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-09T04-04-19.372525.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-09T04-04-19.372525.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_10_09T04_04_19.372525
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-09T04-04-19.372525.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-09T04-04-19.372525.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_10_09T04_04_19.372525
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-09T04-04-19.372525.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-09T04-04-19.372525.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_10_09T04_04_19.372525
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-09T04-04-19.372525.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-09T04-04-19.372525.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_10_09T04_04_19.372525
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-09T04-04-19.372525.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-09T04-04-19.372525.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_10_09T04_04_19.372525
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-10-09T04-04-19.372525.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-10-09T04-04-19.372525.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_10_09T04_04_19.372525
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-09T04-04-19.372525.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-09T04-04-19.372525.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_10_09T04_04_19.372525
path:
- '**/details_harness|hendrycksTest-virology|5_2023-10-09T04-04-19.372525.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-10-09T04-04-19.372525.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_10_09T04_04_19.372525
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-09T04-04-19.372525.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-09T04-04-19.372525.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_10_09T04_04_19.372525
path:
- '**/details_harness|truthfulqa:mc|0_2023-10-09T04-04-19.372525.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-10-09T04-04-19.372525.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_10_26T23_30_10.386120
path:
- '**/details_harness|winogrande|5_2023-10-26T23-30-10.386120.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-10-26T23-30-10.386120.parquet'
- config_name: results
data_files:
- split: 2023_10_09T04_04_19.372525
path:
- results_2023-10-09T04-04-19.372525.parquet
- split: 2023_10_26T23_30_10.386120
path:
- results_2023-10-26T23-30-10.386120.parquet
- split: latest
path:
- results_2023-10-26T23-30-10.386120.parquet
---
# Dataset Card for Evaluation run of elliotthwang/elliott_Llama-2-7b-hf
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/elliotthwang/elliott_Llama-2-7b-hf
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [elliotthwang/elliott_Llama-2-7b-hf](https://huggingface.co/elliotthwang/elliott_Llama-2-7b-hf) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_elliotthwang__elliott_Llama-2-7b-hf",
"harness_winogrande_5",
split="train")
```
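Each run's split can also be addressed directly by its timestamp, and the aggregated metrics live in the "results" configuration; a minimal sketch (split names follow the run timestamps listed in the configs above):

```python
from datasets import load_dataset

# Aggregated metrics for a specific run of this model; the "results"
# config exposes one split per run timestamp plus a "latest" alias.
results = load_dataset(
    "open-llm-leaderboard/details_elliotthwang__elliott_Llama-2-7b-hf",
    "results",
    split="2023_10_26T23_30_10.386120",
)
```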
## Latest results
These are the [latest results from run 2023-10-26T23:30:10.386120](https://huggingface.co/datasets/open-llm-leaderboard/details_elliotthwang__elliott_Llama-2-7b-hf/blob/main/results_2023-10-26T23-30-10.386120.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"em": 0.001153523489932886,
"em_stderr": 0.00034761798968571027,
"f1": 0.05575817953020141,
"f1_stderr": 0.001306153544964195,
"acc": 0.4026884110741377,
"acc_stderr": 0.009681922567248534
},
"harness|drop|3": {
"em": 0.001153523489932886,
"em_stderr": 0.00034761798968571027,
"f1": 0.05575817953020141,
"f1_stderr": 0.001306153544964195
},
"harness|gsm8k|5": {
"acc": 0.06899166034874905,
"acc_stderr": 0.006980995834838602
},
"harness|winogrande|5": {
"acc": 0.7363851617995264,
"acc_stderr": 0.012382849299658464
}
}
```
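To dig into the per-example predictions behind these aggregates, the task-level configs can be loaded the same way (a sketch; the exact detail fields vary by task and are not guaranteed here):

```python
from datasets import load_dataset

# Per-example details for the winogrande eval; "latest" always resolves
# to the most recent run of this task.
details = load_dataset(
    "open-llm-leaderboard/details_elliotthwang__elliott_Llama-2-7b-hf",
    "harness_winogrande_5",
    split="latest",
)
print(details[0])  # inspect one evaluated instance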
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
open-llm-leaderboard/details_itsliupeng__Mixtral-8x7B-v0.1-top3 | ---
pretty_name: Evaluation run of itsliupeng/Mixtral-8x7B-v0.1-top3
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [itsliupeng/Mixtral-8x7B-v0.1-top3](https://huggingface.co/itsliupeng/Mixtral-8x7B-v0.1-top3)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_itsliupeng__Mixtral-8x7B-v0.1-top3\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-12-24T20:17:11.492534](https://huggingface.co/datasets/open-llm-leaderboard/details_itsliupeng__Mixtral-8x7B-v0.1-top3/blob/main/results_2023-12-24T20-17-11.492534.json) (note\
\ that there might be results for other tasks in the repo if successive evals didn't\
\ cover the same tasks; you can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.7170056917230552,\n\
\ \"acc_stderr\": 0.03001643545450993,\n \"acc_norm\": 0.721542301890713,\n\
\ \"acc_norm_stderr\": 0.030593708007455218,\n \"mc1\": 0.32313341493268055,\n\
\ \"mc1_stderr\": 0.016371836286454607,\n \"mc2\": 0.48575008545428044,\n\
\ \"mc2_stderr\": 0.014261044394633108\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.6356655290102389,\n \"acc_stderr\": 0.014063260279882419,\n\
\ \"acc_norm\": 0.674061433447099,\n \"acc_norm_stderr\": 0.013697432466693244\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6715793666600279,\n\
\ \"acc_stderr\": 0.004686789042445367,\n \"acc_norm\": 0.8662617008564031,\n\
\ \"acc_norm_stderr\": 0.0033967530276634164\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.37,\n \"acc_stderr\": 0.04852365870939099,\n \
\ \"acc_norm\": 0.37,\n \"acc_norm_stderr\": 0.04852365870939099\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.7185185185185186,\n\
\ \"acc_stderr\": 0.03885004245800254,\n \"acc_norm\": 0.7185185185185186,\n\
\ \"acc_norm_stderr\": 0.03885004245800254\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.8157894736842105,\n \"acc_stderr\": 0.0315469804508223,\n\
\ \"acc_norm\": 0.8157894736842105,\n \"acc_norm_stderr\": 0.0315469804508223\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.75,\n\
\ \"acc_stderr\": 0.04351941398892446,\n \"acc_norm\": 0.75,\n \
\ \"acc_norm_stderr\": 0.04351941398892446\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.7735849056603774,\n \"acc_stderr\": 0.025757559893106734,\n\
\ \"acc_norm\": 0.7735849056603774,\n \"acc_norm_stderr\": 0.025757559893106734\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.8541666666666666,\n\
\ \"acc_stderr\": 0.029514245964291766,\n \"acc_norm\": 0.8541666666666666,\n\
\ \"acc_norm_stderr\": 0.029514245964291766\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.54,\n \"acc_stderr\": 0.05009082659620332,\n \
\ \"acc_norm\": 0.54,\n \"acc_norm_stderr\": 0.05009082659620332\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.6,\n \"acc_stderr\": 0.04923659639173309,\n \"acc_norm\": 0.6,\n\
\ \"acc_norm_stderr\": 0.04923659639173309\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.46,\n \"acc_stderr\": 0.05009082659620332,\n \
\ \"acc_norm\": 0.46,\n \"acc_norm_stderr\": 0.05009082659620332\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.7225433526011561,\n\
\ \"acc_stderr\": 0.03414014007044036,\n \"acc_norm\": 0.7225433526011561,\n\
\ \"acc_norm_stderr\": 0.03414014007044036\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.47058823529411764,\n \"acc_stderr\": 0.049665709039785295,\n\
\ \"acc_norm\": 0.47058823529411764,\n \"acc_norm_stderr\": 0.049665709039785295\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.84,\n \"acc_stderr\": 0.03684529491774709,\n \"acc_norm\": 0.84,\n\
\ \"acc_norm_stderr\": 0.03684529491774709\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.7191489361702128,\n \"acc_stderr\": 0.029379170464124825,\n\
\ \"acc_norm\": 0.7191489361702128,\n \"acc_norm_stderr\": 0.029379170464124825\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.631578947368421,\n\
\ \"acc_stderr\": 0.04537815354939391,\n \"acc_norm\": 0.631578947368421,\n\
\ \"acc_norm_stderr\": 0.04537815354939391\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.6482758620689655,\n \"acc_stderr\": 0.0397923663749741,\n\
\ \"acc_norm\": 0.6482758620689655,\n \"acc_norm_stderr\": 0.0397923663749741\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.4973544973544973,\n \"acc_stderr\": 0.025750949678130387,\n \"\
acc_norm\": 0.4973544973544973,\n \"acc_norm_stderr\": 0.025750949678130387\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.5873015873015873,\n\
\ \"acc_stderr\": 0.04403438954768176,\n \"acc_norm\": 0.5873015873015873,\n\
\ \"acc_norm_stderr\": 0.04403438954768176\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.49,\n \"acc_stderr\": 0.05024183937956912,\n \
\ \"acc_norm\": 0.49,\n \"acc_norm_stderr\": 0.05024183937956912\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.8548387096774194,\n\
\ \"acc_stderr\": 0.020039563628053286,\n \"acc_norm\": 0.8548387096774194,\n\
\ \"acc_norm_stderr\": 0.020039563628053286\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.625615763546798,\n \"acc_stderr\": 0.03405155380561952,\n\
\ \"acc_norm\": 0.625615763546798,\n \"acc_norm_stderr\": 0.03405155380561952\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.73,\n \"acc_stderr\": 0.044619604333847394,\n \"acc_norm\"\
: 0.73,\n \"acc_norm_stderr\": 0.044619604333847394\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.806060606060606,\n \"acc_stderr\": 0.030874145136562094,\n\
\ \"acc_norm\": 0.806060606060606,\n \"acc_norm_stderr\": 0.030874145136562094\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.8737373737373737,\n \"acc_stderr\": 0.023664359402880232,\n \"\
acc_norm\": 0.8737373737373737,\n \"acc_norm_stderr\": 0.023664359402880232\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.9430051813471503,\n \"acc_stderr\": 0.01673108529360755,\n\
\ \"acc_norm\": 0.9430051813471503,\n \"acc_norm_stderr\": 0.01673108529360755\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.7230769230769231,\n \"acc_stderr\": 0.022688042352424994,\n\
\ \"acc_norm\": 0.7230769230769231,\n \"acc_norm_stderr\": 0.022688042352424994\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.37407407407407406,\n \"acc_stderr\": 0.02950286112895529,\n \
\ \"acc_norm\": 0.37407407407407406,\n \"acc_norm_stderr\": 0.02950286112895529\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.7941176470588235,\n \"acc_stderr\": 0.02626502460827588,\n \
\ \"acc_norm\": 0.7941176470588235,\n \"acc_norm_stderr\": 0.02626502460827588\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.4768211920529801,\n \"acc_stderr\": 0.04078093859163083,\n \"\
acc_norm\": 0.4768211920529801,\n \"acc_norm_stderr\": 0.04078093859163083\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8917431192660551,\n \"acc_stderr\": 0.013321348447611743,\n \"\
acc_norm\": 0.8917431192660551,\n \"acc_norm_stderr\": 0.013321348447611743\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.6203703703703703,\n \"acc_stderr\": 0.03309682581119035,\n \"\
acc_norm\": 0.6203703703703703,\n \"acc_norm_stderr\": 0.03309682581119035\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.8578431372549019,\n \"acc_stderr\": 0.02450980392156862,\n \"\
acc_norm\": 0.8578431372549019,\n \"acc_norm_stderr\": 0.02450980392156862\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.8607594936708861,\n \"acc_stderr\": 0.022535526352692705,\n \
\ \"acc_norm\": 0.8607594936708861,\n \"acc_norm_stderr\": 0.022535526352692705\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.7713004484304933,\n\
\ \"acc_stderr\": 0.028188240046929203,\n \"acc_norm\": 0.7713004484304933,\n\
\ \"acc_norm_stderr\": 0.028188240046929203\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.816793893129771,\n \"acc_stderr\": 0.03392770926494732,\n\
\ \"acc_norm\": 0.816793893129771,\n \"acc_norm_stderr\": 0.03392770926494732\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.859504132231405,\n \"acc_stderr\": 0.03172233426002158,\n \"acc_norm\"\
: 0.859504132231405,\n \"acc_norm_stderr\": 0.03172233426002158\n },\n\
\ \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.8333333333333334,\n\
\ \"acc_stderr\": 0.036028141763926456,\n \"acc_norm\": 0.8333333333333334,\n\
\ \"acc_norm_stderr\": 0.036028141763926456\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7791411042944786,\n \"acc_stderr\": 0.03259177392742179,\n\
\ \"acc_norm\": 0.7791411042944786,\n \"acc_norm_stderr\": 0.03259177392742179\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.5267857142857143,\n\
\ \"acc_stderr\": 0.047389751192741546,\n \"acc_norm\": 0.5267857142857143,\n\
\ \"acc_norm_stderr\": 0.047389751192741546\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.8737864077669902,\n \"acc_stderr\": 0.03288180278808628,\n\
\ \"acc_norm\": 0.8737864077669902,\n \"acc_norm_stderr\": 0.03288180278808628\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.9145299145299145,\n\
\ \"acc_stderr\": 0.018315891685625856,\n \"acc_norm\": 0.9145299145299145,\n\
\ \"acc_norm_stderr\": 0.018315891685625856\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.8,\n \"acc_stderr\": 0.040201512610368445,\n \
\ \"acc_norm\": 0.8,\n \"acc_norm_stderr\": 0.040201512610368445\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8748403575989783,\n\
\ \"acc_stderr\": 0.011832954239305742,\n \"acc_norm\": 0.8748403575989783,\n\
\ \"acc_norm_stderr\": 0.011832954239305742\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.8005780346820809,\n \"acc_stderr\": 0.021511900654252552,\n\
\ \"acc_norm\": 0.8005780346820809,\n \"acc_norm_stderr\": 0.021511900654252552\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.4212290502793296,\n\
\ \"acc_stderr\": 0.016513676031179595,\n \"acc_norm\": 0.4212290502793296,\n\
\ \"acc_norm_stderr\": 0.016513676031179595\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.803921568627451,\n \"acc_stderr\": 0.022733789405447593,\n\
\ \"acc_norm\": 0.803921568627451,\n \"acc_norm_stderr\": 0.022733789405447593\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.8006430868167203,\n\
\ \"acc_stderr\": 0.022691033780549656,\n \"acc_norm\": 0.8006430868167203,\n\
\ \"acc_norm_stderr\": 0.022691033780549656\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.8580246913580247,\n \"acc_stderr\": 0.01942026010943829,\n\
\ \"acc_norm\": 0.8580246913580247,\n \"acc_norm_stderr\": 0.01942026010943829\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.5390070921985816,\n \"acc_stderr\": 0.02973659252642444,\n \
\ \"acc_norm\": 0.5390070921985816,\n \"acc_norm_stderr\": 0.02973659252642444\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.5404172099087353,\n\
\ \"acc_stderr\": 0.012728446067669952,\n \"acc_norm\": 0.5404172099087353,\n\
\ \"acc_norm_stderr\": 0.012728446067669952\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.8051470588235294,\n \"acc_stderr\": 0.02406059942348742,\n\
\ \"acc_norm\": 0.8051470588235294,\n \"acc_norm_stderr\": 0.02406059942348742\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.7973856209150327,\n \"acc_stderr\": 0.01626105528374613,\n \
\ \"acc_norm\": 0.7973856209150327,\n \"acc_norm_stderr\": 0.01626105528374613\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.7181818181818181,\n\
\ \"acc_stderr\": 0.04309118709946458,\n \"acc_norm\": 0.7181818181818181,\n\
\ \"acc_norm_stderr\": 0.04309118709946458\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.7836734693877551,\n \"acc_stderr\": 0.026358916334904014,\n\
\ \"acc_norm\": 0.7836734693877551,\n \"acc_norm_stderr\": 0.026358916334904014\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8805970149253731,\n\
\ \"acc_stderr\": 0.02292879327721974,\n \"acc_norm\": 0.8805970149253731,\n\
\ \"acc_norm_stderr\": 0.02292879327721974\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.91,\n \"acc_stderr\": 0.02876234912646612,\n \
\ \"acc_norm\": 0.91,\n \"acc_norm_stderr\": 0.02876234912646612\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5120481927710844,\n\
\ \"acc_stderr\": 0.03891364495835817,\n \"acc_norm\": 0.5120481927710844,\n\
\ \"acc_norm_stderr\": 0.03891364495835817\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8888888888888888,\n \"acc_stderr\": 0.024103384202072864,\n\
\ \"acc_norm\": 0.8888888888888888,\n \"acc_norm_stderr\": 0.024103384202072864\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.32313341493268055,\n\
\ \"mc1_stderr\": 0.016371836286454607,\n \"mc2\": 0.48575008545428044,\n\
\ \"mc2_stderr\": 0.014261044394633108\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.823993685872139,\n \"acc_stderr\": 0.010703090882320708\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.5754359363153905,\n \
\ \"acc_stderr\": 0.013614835574956375\n }\n}\n```"
repo_url: https://huggingface.co/itsliupeng/Mixtral-8x7B-v0.1-top3
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_12_24T20_17_11.492534
path:
- '**/details_harness|arc:challenge|25_2023-12-24T20-17-11.492534.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-12-24T20-17-11.492534.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_12_24T20_17_11.492534
path:
- '**/details_harness|gsm8k|5_2023-12-24T20-17-11.492534.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-12-24T20-17-11.492534.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_12_24T20_17_11.492534
path:
- '**/details_harness|hellaswag|10_2023-12-24T20-17-11.492534.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-12-24T20-17-11.492534.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_12_24T20_17_11.492534
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-24T20-17-11.492534.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-24T20-17-11.492534.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-24T20-17-11.492534.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-24T20-17-11.492534.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-24T20-17-11.492534.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-24T20-17-11.492534.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-24T20-17-11.492534.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-24T20-17-11.492534.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-24T20-17-11.492534.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-24T20-17-11.492534.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-24T20-17-11.492534.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-24T20-17-11.492534.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-24T20-17-11.492534.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-24T20-17-11.492534.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-24T20-17-11.492534.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-24T20-17-11.492534.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-24T20-17-11.492534.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-24T20-17-11.492534.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-24T20-17-11.492534.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-24T20-17-11.492534.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-24T20-17-11.492534.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-24T20-17-11.492534.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-24T20-17-11.492534.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-24T20-17-11.492534.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-24T20-17-11.492534.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-24T20-17-11.492534.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-24T20-17-11.492534.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-24T20-17-11.492534.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-24T20-17-11.492534.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-24T20-17-11.492534.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-24T20-17-11.492534.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-24T20-17-11.492534.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-24T20-17-11.492534.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-24T20-17-11.492534.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-12-24T20-17-11.492534.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-24T20-17-11.492534.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-24T20-17-11.492534.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-24T20-17-11.492534.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-12-24T20-17-11.492534.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-12-24T20-17-11.492534.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-24T20-17-11.492534.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-24T20-17-11.492534.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-24T20-17-11.492534.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-24T20-17-11.492534.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-24T20-17-11.492534.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-24T20-17-11.492534.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-24T20-17-11.492534.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-24T20-17-11.492534.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-24T20-17-11.492534.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-24T20-17-11.492534.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-24T20-17-11.492534.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-24T20-17-11.492534.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-24T20-17-11.492534.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-12-24T20-17-11.492534.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-24T20-17-11.492534.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-12-24T20-17-11.492534.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-24T20-17-11.492534.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-24T20-17-11.492534.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-24T20-17-11.492534.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-24T20-17-11.492534.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-24T20-17-11.492534.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-24T20-17-11.492534.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-24T20-17-11.492534.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-24T20-17-11.492534.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-24T20-17-11.492534.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-24T20-17-11.492534.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-24T20-17-11.492534.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-24T20-17-11.492534.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-24T20-17-11.492534.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-24T20-17-11.492534.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-24T20-17-11.492534.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-24T20-17-11.492534.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-24T20-17-11.492534.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-24T20-17-11.492534.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-24T20-17-11.492534.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-24T20-17-11.492534.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-24T20-17-11.492534.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-24T20-17-11.492534.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-24T20-17-11.492534.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-24T20-17-11.492534.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-24T20-17-11.492534.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-24T20-17-11.492534.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-24T20-17-11.492534.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-24T20-17-11.492534.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-24T20-17-11.492534.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-24T20-17-11.492534.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-24T20-17-11.492534.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-24T20-17-11.492534.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-24T20-17-11.492534.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-24T20-17-11.492534.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-24T20-17-11.492534.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-12-24T20-17-11.492534.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-24T20-17-11.492534.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-24T20-17-11.492534.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-24T20-17-11.492534.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-12-24T20-17-11.492534.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-12-24T20-17-11.492534.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-24T20-17-11.492534.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-24T20-17-11.492534.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-24T20-17-11.492534.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-24T20-17-11.492534.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-24T20-17-11.492534.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-24T20-17-11.492534.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-24T20-17-11.492534.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-24T20-17-11.492534.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-24T20-17-11.492534.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-24T20-17-11.492534.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-24T20-17-11.492534.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-24T20-17-11.492534.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-24T20-17-11.492534.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-12-24T20-17-11.492534.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-24T20-17-11.492534.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-12-24T20-17-11.492534.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-24T20-17-11.492534.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_12_24T20_17_11.492534
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-24T20-17-11.492534.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-24T20-17-11.492534.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_12_24T20_17_11.492534
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-24T20-17-11.492534.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-24T20-17-11.492534.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_12_24T20_17_11.492534
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-24T20-17-11.492534.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-24T20-17-11.492534.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_12_24T20_17_11.492534
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-24T20-17-11.492534.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-24T20-17-11.492534.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_12_24T20_17_11.492534
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-24T20-17-11.492534.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-24T20-17-11.492534.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_12_24T20_17_11.492534
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-24T20-17-11.492534.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-24T20-17-11.492534.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_12_24T20_17_11.492534
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-24T20-17-11.492534.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-24T20-17-11.492534.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_12_24T20_17_11.492534
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-24T20-17-11.492534.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-24T20-17-11.492534.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_12_24T20_17_11.492534
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-24T20-17-11.492534.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-24T20-17-11.492534.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_12_24T20_17_11.492534
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-24T20-17-11.492534.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-24T20-17-11.492534.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_12_24T20_17_11.492534
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-24T20-17-11.492534.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-24T20-17-11.492534.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_12_24T20_17_11.492534
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-24T20-17-11.492534.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-24T20-17-11.492534.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_12_24T20_17_11.492534
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-24T20-17-11.492534.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-24T20-17-11.492534.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_12_24T20_17_11.492534
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-24T20-17-11.492534.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-24T20-17-11.492534.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_12_24T20_17_11.492534
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-24T20-17-11.492534.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-24T20-17-11.492534.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_12_24T20_17_11.492534
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-24T20-17-11.492534.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-24T20-17-11.492534.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_12_24T20_17_11.492534
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-24T20-17-11.492534.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-24T20-17-11.492534.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_12_24T20_17_11.492534
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-24T20-17-11.492534.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-24T20-17-11.492534.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_12_24T20_17_11.492534
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-24T20-17-11.492534.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-24T20-17-11.492534.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_12_24T20_17_11.492534
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-24T20-17-11.492534.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-24T20-17-11.492534.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_12_24T20_17_11.492534
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-24T20-17-11.492534.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-24T20-17-11.492534.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_12_24T20_17_11.492534
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-24T20-17-11.492534.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-24T20-17-11.492534.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_12_24T20_17_11.492534
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-24T20-17-11.492534.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-24T20-17-11.492534.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_12_24T20_17_11.492534
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-24T20-17-11.492534.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-24T20-17-11.492534.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_12_24T20_17_11.492534
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-24T20-17-11.492534.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-24T20-17-11.492534.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_12_24T20_17_11.492534
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-24T20-17-11.492534.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-24T20-17-11.492534.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_12_24T20_17_11.492534
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-24T20-17-11.492534.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-24T20-17-11.492534.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_12_24T20_17_11.492534
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-24T20-17-11.492534.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-24T20-17-11.492534.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_12_24T20_17_11.492534
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-24T20-17-11.492534.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-24T20-17-11.492534.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_12_24T20_17_11.492534
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-24T20-17-11.492534.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-24T20-17-11.492534.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_12_24T20_17_11.492534
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-24T20-17-11.492534.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-24T20-17-11.492534.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_12_24T20_17_11.492534
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-24T20-17-11.492534.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-24T20-17-11.492534.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_12_24T20_17_11.492534
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-24T20-17-11.492534.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-24T20-17-11.492534.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_12_24T20_17_11.492534
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-24T20-17-11.492534.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-24T20-17-11.492534.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_12_24T20_17_11.492534
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-12-24T20-17-11.492534.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-12-24T20-17-11.492534.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_12_24T20_17_11.492534
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-24T20-17-11.492534.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-24T20-17-11.492534.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_12_24T20_17_11.492534
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-24T20-17-11.492534.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-24T20-17-11.492534.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_12_24T20_17_11.492534
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-24T20-17-11.492534.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-24T20-17-11.492534.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_12_24T20_17_11.492534
path:
- '**/details_harness|hendrycksTest-management|5_2023-12-24T20-17-11.492534.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-12-24T20-17-11.492534.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_12_24T20_17_11.492534
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-12-24T20-17-11.492534.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-12-24T20-17-11.492534.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_12_24T20_17_11.492534
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-24T20-17-11.492534.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-24T20-17-11.492534.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_12_24T20_17_11.492534
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-24T20-17-11.492534.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-24T20-17-11.492534.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_12_24T20_17_11.492534
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-24T20-17-11.492534.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-24T20-17-11.492534.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_12_24T20_17_11.492534
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-24T20-17-11.492534.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-24T20-17-11.492534.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_12_24T20_17_11.492534
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-24T20-17-11.492534.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-24T20-17-11.492534.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_12_24T20_17_11.492534
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-24T20-17-11.492534.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-24T20-17-11.492534.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_12_24T20_17_11.492534
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-24T20-17-11.492534.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-24T20-17-11.492534.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_12_24T20_17_11.492534
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-24T20-17-11.492534.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-24T20-17-11.492534.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_12_24T20_17_11.492534
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-24T20-17-11.492534.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-24T20-17-11.492534.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_12_24T20_17_11.492534
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-24T20-17-11.492534.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-24T20-17-11.492534.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_12_24T20_17_11.492534
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-24T20-17-11.492534.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-24T20-17-11.492534.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_12_24T20_17_11.492534
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-24T20-17-11.492534.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-24T20-17-11.492534.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_12_24T20_17_11.492534
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-24T20-17-11.492534.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-24T20-17-11.492534.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_12_24T20_17_11.492534
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-12-24T20-17-11.492534.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-12-24T20-17-11.492534.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_12_24T20_17_11.492534
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-24T20-17-11.492534.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-24T20-17-11.492534.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_12_24T20_17_11.492534
path:
- '**/details_harness|hendrycksTest-virology|5_2023-12-24T20-17-11.492534.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-12-24T20-17-11.492534.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_12_24T20_17_11.492534
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-24T20-17-11.492534.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-24T20-17-11.492534.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_12_24T20_17_11.492534
path:
- '**/details_harness|truthfulqa:mc|0_2023-12-24T20-17-11.492534.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-12-24T20-17-11.492534.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_12_24T20_17_11.492534
path:
- '**/details_harness|winogrande|5_2023-12-24T20-17-11.492534.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-12-24T20-17-11.492534.parquet'
- config_name: results
data_files:
- split: 2023_12_24T20_17_11.492534
path:
- results_2023-12-24T20-17-11.492534.parquet
- split: latest
path:
- results_2023-12-24T20-17-11.492534.parquet
---
# Dataset Card for Evaluation run of itsliupeng/Mixtral-8x7B-v0.1-top3
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [itsliupeng/Mixtral-8x7B-v0.1-top3](https://huggingface.co/itsliupeng/Mixtral-8x7B-v0.1-top3) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run. Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_itsliupeng__Mixtral-8x7B-v0.1-top3",
"harness_winogrande_5",
split="train")
```
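Each configuration also exposes the timestamped split of this run alongside a `latest` alias (see the `configs` listing above), so a specific evaluation can be pinned explicitly; a minimal sketch, assuming the split names declared in the configs:
```python
from datasets import load_dataset

# "latest" always mirrors the most recent run; the timestamped split
# pins this particular evaluation (2023-12-24T20:17:11.492534).
latest = load_dataset("open-llm-leaderboard/details_itsliupeng__Mixtral-8x7B-v0.1-top3",
	"harness_winogrande_5",
	split="latest")
pinned = load_dataset("open-llm-leaderboard/details_itsliupeng__Mixtral-8x7B-v0.1-top3",
	"harness_winogrande_5",
	split="2023_12_24T20_17_11.492534")
```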
## Latest results
These are the [latest results from run 2023-12-24T20:17:11.492534](https://huggingface.co/datasets/open-llm-leaderboard/details_itsliupeng__Mixtral-8x7B-v0.1-top3/blob/main/results_2023-12-24T20-17-11.492534.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each one in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.7170056917230552,
"acc_stderr": 0.03001643545450993,
"acc_norm": 0.721542301890713,
"acc_norm_stderr": 0.030593708007455218,
"mc1": 0.32313341493268055,
"mc1_stderr": 0.016371836286454607,
"mc2": 0.48575008545428044,
"mc2_stderr": 0.014261044394633108
},
"harness|arc:challenge|25": {
"acc": 0.6356655290102389,
"acc_stderr": 0.014063260279882419,
"acc_norm": 0.674061433447099,
"acc_norm_stderr": 0.013697432466693244
},
"harness|hellaswag|10": {
"acc": 0.6715793666600279,
"acc_stderr": 0.004686789042445367,
"acc_norm": 0.8662617008564031,
"acc_norm_stderr": 0.0033967530276634164
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.37,
"acc_stderr": 0.04852365870939099,
"acc_norm": 0.37,
"acc_norm_stderr": 0.04852365870939099
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.7185185185185186,
"acc_stderr": 0.03885004245800254,
"acc_norm": 0.7185185185185186,
"acc_norm_stderr": 0.03885004245800254
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.8157894736842105,
"acc_stderr": 0.0315469804508223,
"acc_norm": 0.8157894736842105,
"acc_norm_stderr": 0.0315469804508223
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.75,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.75,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.7735849056603774,
"acc_stderr": 0.025757559893106734,
"acc_norm": 0.7735849056603774,
"acc_norm_stderr": 0.025757559893106734
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.8541666666666666,
"acc_stderr": 0.029514245964291766,
"acc_norm": 0.8541666666666666,
"acc_norm_stderr": 0.029514245964291766
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.54,
"acc_stderr": 0.05009082659620332,
"acc_norm": 0.54,
"acc_norm_stderr": 0.05009082659620332
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.6,
"acc_stderr": 0.04923659639173309,
"acc_norm": 0.6,
"acc_norm_stderr": 0.04923659639173309
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.46,
"acc_stderr": 0.05009082659620332,
"acc_norm": 0.46,
"acc_norm_stderr": 0.05009082659620332
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.7225433526011561,
"acc_stderr": 0.03414014007044036,
"acc_norm": 0.7225433526011561,
"acc_norm_stderr": 0.03414014007044036
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.47058823529411764,
"acc_stderr": 0.049665709039785295,
"acc_norm": 0.47058823529411764,
"acc_norm_stderr": 0.049665709039785295
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.84,
"acc_stderr": 0.03684529491774709,
"acc_norm": 0.84,
"acc_norm_stderr": 0.03684529491774709
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.7191489361702128,
"acc_stderr": 0.029379170464124825,
"acc_norm": 0.7191489361702128,
"acc_norm_stderr": 0.029379170464124825
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.631578947368421,
"acc_stderr": 0.04537815354939391,
"acc_norm": 0.631578947368421,
"acc_norm_stderr": 0.04537815354939391
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.6482758620689655,
"acc_stderr": 0.0397923663749741,
"acc_norm": 0.6482758620689655,
"acc_norm_stderr": 0.0397923663749741
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.4973544973544973,
"acc_stderr": 0.025750949678130387,
"acc_norm": 0.4973544973544973,
"acc_norm_stderr": 0.025750949678130387
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.5873015873015873,
"acc_stderr": 0.04403438954768176,
"acc_norm": 0.5873015873015873,
"acc_norm_stderr": 0.04403438954768176
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.49,
"acc_stderr": 0.05024183937956912,
"acc_norm": 0.49,
"acc_norm_stderr": 0.05024183937956912
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.8548387096774194,
"acc_stderr": 0.020039563628053286,
"acc_norm": 0.8548387096774194,
"acc_norm_stderr": 0.020039563628053286
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.625615763546798,
"acc_stderr": 0.03405155380561952,
"acc_norm": 0.625615763546798,
"acc_norm_stderr": 0.03405155380561952
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.73,
"acc_stderr": 0.044619604333847394,
"acc_norm": 0.73,
"acc_norm_stderr": 0.044619604333847394
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.806060606060606,
"acc_stderr": 0.030874145136562094,
"acc_norm": 0.806060606060606,
"acc_norm_stderr": 0.030874145136562094
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.8737373737373737,
"acc_stderr": 0.023664359402880232,
"acc_norm": 0.8737373737373737,
"acc_norm_stderr": 0.023664359402880232
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.9430051813471503,
"acc_stderr": 0.01673108529360755,
"acc_norm": 0.9430051813471503,
"acc_norm_stderr": 0.01673108529360755
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.7230769230769231,
"acc_stderr": 0.022688042352424994,
"acc_norm": 0.7230769230769231,
"acc_norm_stderr": 0.022688042352424994
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.37407407407407406,
"acc_stderr": 0.02950286112895529,
"acc_norm": 0.37407407407407406,
"acc_norm_stderr": 0.02950286112895529
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.7941176470588235,
"acc_stderr": 0.02626502460827588,
"acc_norm": 0.7941176470588235,
"acc_norm_stderr": 0.02626502460827588
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.4768211920529801,
"acc_stderr": 0.04078093859163083,
"acc_norm": 0.4768211920529801,
"acc_norm_stderr": 0.04078093859163083
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8917431192660551,
"acc_stderr": 0.013321348447611743,
"acc_norm": 0.8917431192660551,
"acc_norm_stderr": 0.013321348447611743
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.6203703703703703,
"acc_stderr": 0.03309682581119035,
"acc_norm": 0.6203703703703703,
"acc_norm_stderr": 0.03309682581119035
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8578431372549019,
"acc_stderr": 0.02450980392156862,
"acc_norm": 0.8578431372549019,
"acc_norm_stderr": 0.02450980392156862
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.8607594936708861,
"acc_stderr": 0.022535526352692705,
"acc_norm": 0.8607594936708861,
"acc_norm_stderr": 0.022535526352692705
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.7713004484304933,
"acc_stderr": 0.028188240046929203,
"acc_norm": 0.7713004484304933,
"acc_norm_stderr": 0.028188240046929203
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.816793893129771,
"acc_stderr": 0.03392770926494732,
"acc_norm": 0.816793893129771,
"acc_norm_stderr": 0.03392770926494732
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.859504132231405,
"acc_stderr": 0.03172233426002158,
"acc_norm": 0.859504132231405,
"acc_norm_stderr": 0.03172233426002158
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.8333333333333334,
"acc_stderr": 0.036028141763926456,
"acc_norm": 0.8333333333333334,
"acc_norm_stderr": 0.036028141763926456
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7791411042944786,
"acc_stderr": 0.03259177392742179,
"acc_norm": 0.7791411042944786,
"acc_norm_stderr": 0.03259177392742179
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.5267857142857143,
"acc_stderr": 0.047389751192741546,
"acc_norm": 0.5267857142857143,
"acc_norm_stderr": 0.047389751192741546
},
"harness|hendrycksTest-management|5": {
"acc": 0.8737864077669902,
"acc_stderr": 0.03288180278808628,
"acc_norm": 0.8737864077669902,
"acc_norm_stderr": 0.03288180278808628
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.9145299145299145,
"acc_stderr": 0.018315891685625856,
"acc_norm": 0.9145299145299145,
"acc_norm_stderr": 0.018315891685625856
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.8,
"acc_stderr": 0.040201512610368445,
"acc_norm": 0.8,
"acc_norm_stderr": 0.040201512610368445
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8748403575989783,
"acc_stderr": 0.011832954239305742,
"acc_norm": 0.8748403575989783,
"acc_norm_stderr": 0.011832954239305742
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.8005780346820809,
"acc_stderr": 0.021511900654252552,
"acc_norm": 0.8005780346820809,
"acc_norm_stderr": 0.021511900654252552
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.4212290502793296,
"acc_stderr": 0.016513676031179595,
"acc_norm": 0.4212290502793296,
"acc_norm_stderr": 0.016513676031179595
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.803921568627451,
"acc_stderr": 0.022733789405447593,
"acc_norm": 0.803921568627451,
"acc_norm_stderr": 0.022733789405447593
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.8006430868167203,
"acc_stderr": 0.022691033780549656,
"acc_norm": 0.8006430868167203,
"acc_norm_stderr": 0.022691033780549656
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.8580246913580247,
"acc_stderr": 0.01942026010943829,
"acc_norm": 0.8580246913580247,
"acc_norm_stderr": 0.01942026010943829
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.5390070921985816,
"acc_stderr": 0.02973659252642444,
"acc_norm": 0.5390070921985816,
"acc_norm_stderr": 0.02973659252642444
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.5404172099087353,
"acc_stderr": 0.012728446067669952,
"acc_norm": 0.5404172099087353,
"acc_norm_stderr": 0.012728446067669952
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.8051470588235294,
"acc_stderr": 0.02406059942348742,
"acc_norm": 0.8051470588235294,
"acc_norm_stderr": 0.02406059942348742
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.7973856209150327,
"acc_stderr": 0.01626105528374613,
"acc_norm": 0.7973856209150327,
"acc_norm_stderr": 0.01626105528374613
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.7181818181818181,
"acc_stderr": 0.04309118709946458,
"acc_norm": 0.7181818181818181,
"acc_norm_stderr": 0.04309118709946458
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7836734693877551,
"acc_stderr": 0.026358916334904014,
"acc_norm": 0.7836734693877551,
"acc_norm_stderr": 0.026358916334904014
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8805970149253731,
"acc_stderr": 0.02292879327721974,
"acc_norm": 0.8805970149253731,
"acc_norm_stderr": 0.02292879327721974
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.91,
"acc_stderr": 0.02876234912646612,
"acc_norm": 0.91,
"acc_norm_stderr": 0.02876234912646612
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5120481927710844,
"acc_stderr": 0.03891364495835817,
"acc_norm": 0.5120481927710844,
"acc_norm_stderr": 0.03891364495835817
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8888888888888888,
"acc_stderr": 0.024103384202072864,
"acc_norm": 0.8888888888888888,
"acc_norm_stderr": 0.024103384202072864
},
"harness|truthfulqa:mc|0": {
"mc1": 0.32313341493268055,
"mc1_stderr": 0.016371836286454607,
"mc2": 0.48575008545428044,
"mc2_stderr": 0.014261044394633108
},
"harness|winogrande|5": {
"acc": 0.823993685872139,
"acc_stderr": 0.010703090882320708
},
"harness|gsm8k|5": {
"acc": 0.5754359363153905,
"acc_stderr": 0.013614835574956375
}
}
```
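The same aggregated numbers are stored in the `results` configuration declared in the front matter; a minimal sketch of loading them programmatically, assuming the `latest` split defined above:
```python
from datasets import load_dataset

# The "results" config holds the aggregated metrics shown above;
# its "latest" split always points at the most recent run.
results = load_dataset("open-llm-leaderboard/details_itsliupeng__Mixtral-8x7B-v0.1-top3",
	"results",
	split="latest")
```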
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
liuyanchen1015/MULTI_VALUE_stsb_yall | ---
dataset_info:
features:
- name: sentence1
dtype: string
- name: sentence2
dtype: string
- name: score
dtype: float64
- name: idx
dtype: int64
- name: value_score
dtype: int64
splits:
- name: dev
num_bytes: 29645
num_examples: 143
- name: test
num_bytes: 18259
num_examples: 140
- name: train
num_bytes: 13702
num_examples: 97
download_size: 44125
dataset_size: 61606
---
# Dataset Card for "MULTI_VALUE_stsb_yall"
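The `dataset_info` above describes paired sentences with a similarity `score`; a minimal loading sketch based on those declared splits:
```python
from datasets import load_dataset

# Splits declared above: dev (143), test (140), train (97 examples).
ds = load_dataset("liuyanchen1015/MULTI_VALUE_stsb_yall")
row = ds["dev"][0]
# Fields: sentence1, sentence2, score (float64), idx, value_score (int64).
print(row["sentence1"], row["sentence2"], row["score"])
```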
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Back-up/so_tay_sv_test | ---
dataset_info:
features:
- name: Questions
dtype: string
- name: Answers
dtype: string
splits:
- name: train
num_bytes: 26492.31818181818
num_examples: 36
download_size: 20842
dataset_size: 26492.31818181818
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "so_tay_sv_test"
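Per the `configs` block above, the default configuration maps the `train` split to files matching `data/train-*`; a minimal loading sketch:
```python
from datasets import load_dataset

# Single declared split: train (36 question/answer pairs).
ds = load_dataset("Back-up/so_tay_sv_test", split="train")
print(ds[0]["Questions"], "->", ds[0]["Answers"])
```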
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
open-llm-leaderboard/details_Devio__test-9k-fn | ---
pretty_name: Evaluation run of Devio/test-9k-fn
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [Devio/test-9k-fn](https://huggingface.co/Devio/test-9k-fn) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 61 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run. Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Devio__test-9k-fn\"\
,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\
\nThese are the [latest results from run 2023-10-04T07:45:21.870360](https://huggingface.co/datasets/open-llm-leaderboard/details_Devio__test-9k-fn/blob/main/results_2023-10-04T07-45-21.870360.json) (note\
\ that there might be results for other tasks in the repo if successive evals didn't\
\ cover the same tasks; you can find each one in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.29934767850730726,\n\
\ \"acc_stderr\": 0.033158996935405735,\n \"acc_norm\": 0.3034282752932227,\n\
\ \"acc_norm_stderr\": 0.03315857474708369,\n \"mc1\": 0.23378212974296206,\n\
\ \"mc1_stderr\": 0.014816195991931583,\n \"mc2\": 0.3914546223201993,\n\
\ \"mc2_stderr\": 0.013969580332280395\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.35665529010238906,\n \"acc_stderr\": 0.013998056902620192,\n\
\ \"acc_norm\": 0.4087030716723549,\n \"acc_norm_stderr\": 0.014365750345427008\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.5057757418840868,\n\
\ \"acc_stderr\": 0.004989448490164429,\n \"acc_norm\": 0.6944831706831308,\n\
\ \"acc_norm_stderr\": 0.004596845936356623\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.23,\n \"acc_stderr\": 0.04229525846816506,\n \
\ \"acc_norm\": 0.23,\n \"acc_norm_stderr\": 0.04229525846816506\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.28888888888888886,\n\
\ \"acc_stderr\": 0.0391545063041425,\n \"acc_norm\": 0.28888888888888886,\n\
\ \"acc_norm_stderr\": 0.0391545063041425\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.3092105263157895,\n \"acc_stderr\": 0.037610708698674805,\n\
\ \"acc_norm\": 0.3092105263157895,\n \"acc_norm_stderr\": 0.037610708698674805\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.34,\n\
\ \"acc_stderr\": 0.04760952285695235,\n \"acc_norm\": 0.34,\n \
\ \"acc_norm_stderr\": 0.04760952285695235\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.3018867924528302,\n \"acc_stderr\": 0.028254200344438655,\n\
\ \"acc_norm\": 0.3018867924528302,\n \"acc_norm_stderr\": 0.028254200344438655\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.2986111111111111,\n\
\ \"acc_stderr\": 0.03827052357950756,\n \"acc_norm\": 0.2986111111111111,\n\
\ \"acc_norm_stderr\": 0.03827052357950756\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.32,\n \"acc_stderr\": 0.046882617226215034,\n \
\ \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.046882617226215034\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"\
acc\": 0.32,\n \"acc_stderr\": 0.046882617226215034,\n \"acc_norm\"\
: 0.32,\n \"acc_norm_stderr\": 0.046882617226215034\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.27,\n \"acc_stderr\": 0.044619604333847394,\n \
\ \"acc_norm\": 0.27,\n \"acc_norm_stderr\": 0.044619604333847394\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.2774566473988439,\n\
\ \"acc_stderr\": 0.03414014007044037,\n \"acc_norm\": 0.2774566473988439,\n\
\ \"acc_norm_stderr\": 0.03414014007044037\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.19607843137254902,\n \"acc_stderr\": 0.03950581861179963,\n\
\ \"acc_norm\": 0.19607843137254902,\n \"acc_norm_stderr\": 0.03950581861179963\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\": 0.31,\n\
\ \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.32340425531914896,\n \"acc_stderr\": 0.030579442773610337,\n\
\ \"acc_norm\": 0.32340425531914896,\n \"acc_norm_stderr\": 0.030579442773610337\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.20175438596491227,\n\
\ \"acc_stderr\": 0.037752050135836386,\n \"acc_norm\": 0.20175438596491227,\n\
\ \"acc_norm_stderr\": 0.037752050135836386\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.3103448275862069,\n \"acc_stderr\": 0.03855289616378947,\n\
\ \"acc_norm\": 0.3103448275862069,\n \"acc_norm_stderr\": 0.03855289616378947\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.22486772486772486,\n \"acc_stderr\": 0.02150209607822914,\n \"\
acc_norm\": 0.22486772486772486,\n \"acc_norm_stderr\": 0.02150209607822914\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.3253968253968254,\n\
\ \"acc_stderr\": 0.041905964388711366,\n \"acc_norm\": 0.3253968253968254,\n\
\ \"acc_norm_stderr\": 0.041905964388711366\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.32,\n \"acc_stderr\": 0.04688261722621504,\n \
\ \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.04688261722621504\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.2709677419354839,\n\
\ \"acc_stderr\": 0.025284416114900156,\n \"acc_norm\": 0.2709677419354839,\n\
\ \"acc_norm_stderr\": 0.025284416114900156\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.18719211822660098,\n \"acc_stderr\": 0.027444924966882618,\n\
\ \"acc_norm\": 0.18719211822660098,\n \"acc_norm_stderr\": 0.027444924966882618\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.33,\n \"acc_stderr\": 0.04725815626252605,\n \"acc_norm\"\
: 0.33,\n \"acc_norm_stderr\": 0.04725815626252605\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.21212121212121213,\n \"acc_stderr\": 0.03192271569548299,\n\
\ \"acc_norm\": 0.21212121212121213,\n \"acc_norm_stderr\": 0.03192271569548299\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.3181818181818182,\n \"acc_stderr\": 0.03318477333845331,\n \"\
acc_norm\": 0.3181818181818182,\n \"acc_norm_stderr\": 0.03318477333845331\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.35751295336787564,\n \"acc_stderr\": 0.03458816042181005,\n\
\ \"acc_norm\": 0.35751295336787564,\n \"acc_norm_stderr\": 0.03458816042181005\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.3717948717948718,\n \"acc_stderr\": 0.024503472557110936,\n\
\ \"acc_norm\": 0.3717948717948718,\n \"acc_norm_stderr\": 0.024503472557110936\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.28888888888888886,\n \"acc_stderr\": 0.027634907264178544,\n \
\ \"acc_norm\": 0.28888888888888886,\n \"acc_norm_stderr\": 0.027634907264178544\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.2815126050420168,\n \"acc_stderr\": 0.029213549414372163,\n\
\ \"acc_norm\": 0.2815126050420168,\n \"acc_norm_stderr\": 0.029213549414372163\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.2781456953642384,\n \"acc_stderr\": 0.03658603262763743,\n \"\
acc_norm\": 0.2781456953642384,\n \"acc_norm_stderr\": 0.03658603262763743\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.3211009174311927,\n \"acc_stderr\": 0.020018149772733747,\n \"\
acc_norm\": 0.3211009174311927,\n \"acc_norm_stderr\": 0.020018149772733747\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.3611111111111111,\n \"acc_stderr\": 0.032757734861009996,\n \"\
acc_norm\": 0.3611111111111111,\n \"acc_norm_stderr\": 0.032757734861009996\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.2647058823529412,\n \"acc_stderr\": 0.030964517926923403,\n \"\
acc_norm\": 0.2647058823529412,\n \"acc_norm_stderr\": 0.030964517926923403\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.3333333333333333,\n \"acc_stderr\": 0.030685820596610795,\n \
\ \"acc_norm\": 0.3333333333333333,\n \"acc_norm_stderr\": 0.030685820596610795\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.34080717488789236,\n\
\ \"acc_stderr\": 0.031811497470553604,\n \"acc_norm\": 0.34080717488789236,\n\
\ \"acc_norm_stderr\": 0.031811497470553604\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.3282442748091603,\n \"acc_stderr\": 0.04118438565806298,\n\
\ \"acc_norm\": 0.3282442748091603,\n \"acc_norm_stderr\": 0.04118438565806298\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.2644628099173554,\n \"acc_stderr\": 0.04026187527591205,\n \"\
acc_norm\": 0.2644628099173554,\n \"acc_norm_stderr\": 0.04026187527591205\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.25925925925925924,\n\
\ \"acc_stderr\": 0.042365112580946336,\n \"acc_norm\": 0.25925925925925924,\n\
\ \"acc_norm_stderr\": 0.042365112580946336\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.25153374233128833,\n \"acc_stderr\": 0.034089978868575295,\n\
\ \"acc_norm\": 0.25153374233128833,\n \"acc_norm_stderr\": 0.034089978868575295\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.36607142857142855,\n\
\ \"acc_stderr\": 0.04572372358737431,\n \"acc_norm\": 0.36607142857142855,\n\
\ \"acc_norm_stderr\": 0.04572372358737431\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.23300970873786409,\n \"acc_stderr\": 0.041858325989283136,\n\
\ \"acc_norm\": 0.23300970873786409,\n \"acc_norm_stderr\": 0.041858325989283136\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.3974358974358974,\n\
\ \"acc_stderr\": 0.032059534537892925,\n \"acc_norm\": 0.3974358974358974,\n\
\ \"acc_norm_stderr\": 0.032059534537892925\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \
\ \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.2554278416347382,\n\
\ \"acc_stderr\": 0.015594955384455772,\n \"acc_norm\": 0.2554278416347382,\n\
\ \"acc_norm_stderr\": 0.015594955384455772\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.2774566473988439,\n \"acc_stderr\": 0.024105712607754307,\n\
\ \"acc_norm\": 0.2774566473988439,\n \"acc_norm_stderr\": 0.024105712607754307\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.23798882681564246,\n\
\ \"acc_stderr\": 0.014242630070574877,\n \"acc_norm\": 0.23798882681564246,\n\
\ \"acc_norm_stderr\": 0.014242630070574877\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.39215686274509803,\n \"acc_stderr\": 0.02795604616542451,\n\
\ \"acc_norm\": 0.39215686274509803,\n \"acc_norm_stderr\": 0.02795604616542451\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.2604501607717042,\n\
\ \"acc_stderr\": 0.02492672322484555,\n \"acc_norm\": 0.2604501607717042,\n\
\ \"acc_norm_stderr\": 0.02492672322484555\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.25925925925925924,\n \"acc_stderr\": 0.02438366553103545,\n\
\ \"acc_norm\": 0.25925925925925924,\n \"acc_norm_stderr\": 0.02438366553103545\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.2801418439716312,\n \"acc_stderr\": 0.026789172351140242,\n \
\ \"acc_norm\": 0.2801418439716312,\n \"acc_norm_stderr\": 0.026789172351140242\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.2666232073011734,\n\
\ \"acc_stderr\": 0.011293836031612131,\n \"acc_norm\": 0.2666232073011734,\n\
\ \"acc_norm_stderr\": 0.011293836031612131\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.31985294117647056,\n \"acc_stderr\": 0.02833295951403124,\n\
\ \"acc_norm\": 0.31985294117647056,\n \"acc_norm_stderr\": 0.02833295951403124\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.2549019607843137,\n \"acc_stderr\": 0.017630827375148383,\n \
\ \"acc_norm\": 0.2549019607843137,\n \"acc_norm_stderr\": 0.017630827375148383\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.2818181818181818,\n\
\ \"acc_stderr\": 0.04309118709946458,\n \"acc_norm\": 0.2818181818181818,\n\
\ \"acc_norm_stderr\": 0.04309118709946458\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.3306122448979592,\n \"acc_stderr\": 0.030116426296540592,\n\
\ \"acc_norm\": 0.3306122448979592,\n \"acc_norm_stderr\": 0.030116426296540592\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.31840796019900497,\n\
\ \"acc_stderr\": 0.03294118479054096,\n \"acc_norm\": 0.31840796019900497,\n\
\ \"acc_norm_stderr\": 0.03294118479054096\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.35,\n \"acc_stderr\": 0.0479372485441102,\n \
\ \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.0479372485441102\n },\n\
\ \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.30120481927710846,\n\
\ \"acc_stderr\": 0.0357160923005348,\n \"acc_norm\": 0.30120481927710846,\n\
\ \"acc_norm_stderr\": 0.0357160923005348\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.32748538011695905,\n \"acc_stderr\": 0.035993357714560276,\n\
\ \"acc_norm\": 0.32748538011695905,\n \"acc_norm_stderr\": 0.035993357714560276\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.23378212974296206,\n\
\ \"mc1_stderr\": 0.014816195991931583,\n \"mc2\": 0.3914546223201993,\n\
\ \"mc2_stderr\": 0.013969580332280395\n }\n}\n```"
repo_url: https://huggingface.co/Devio/test-9k-fn
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_10_04T07_45_21.870360
path:
- '**/details_harness|arc:challenge|25_2023-10-04T07-45-21.870360.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-10-04T07-45-21.870360.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_10_04T07_45_21.870360
path:
- '**/details_harness|hellaswag|10_2023-10-04T07-45-21.870360.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-10-04T07-45-21.870360.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_10_04T07_45_21.870360
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-04T07-45-21.870360.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-04T07-45-21.870360.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-04T07-45-21.870360.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-04T07-45-21.870360.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-04T07-45-21.870360.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-04T07-45-21.870360.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-04T07-45-21.870360.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-04T07-45-21.870360.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-04T07-45-21.870360.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-04T07-45-21.870360.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-04T07-45-21.870360.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-04T07-45-21.870360.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-04T07-45-21.870360.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-04T07-45-21.870360.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-04T07-45-21.870360.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-04T07-45-21.870360.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-04T07-45-21.870360.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-04T07-45-21.870360.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-04T07-45-21.870360.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-04T07-45-21.870360.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-04T07-45-21.870360.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-04T07-45-21.870360.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-04T07-45-21.870360.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-04T07-45-21.870360.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-04T07-45-21.870360.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-04T07-45-21.870360.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-04T07-45-21.870360.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-04T07-45-21.870360.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-04T07-45-21.870360.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-04T07-45-21.870360.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-04T07-45-21.870360.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-04T07-45-21.870360.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-04T07-45-21.870360.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-04T07-45-21.870360.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-10-04T07-45-21.870360.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-04T07-45-21.870360.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-04T07-45-21.870360.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-04T07-45-21.870360.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-10-04T07-45-21.870360.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-10-04T07-45-21.870360.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-04T07-45-21.870360.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-04T07-45-21.870360.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-04T07-45-21.870360.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-04T07-45-21.870360.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-04T07-45-21.870360.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-04T07-45-21.870360.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-04T07-45-21.870360.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-04T07-45-21.870360.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-04T07-45-21.870360.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-04T07-45-21.870360.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-04T07-45-21.870360.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-04T07-45-21.870360.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-04T07-45-21.870360.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-10-04T07-45-21.870360.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-04T07-45-21.870360.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-10-04T07-45-21.870360.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-04T07-45-21.870360.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-04T07-45-21.870360.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-04T07-45-21.870360.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-04T07-45-21.870360.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-04T07-45-21.870360.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-04T07-45-21.870360.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-04T07-45-21.870360.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-04T07-45-21.870360.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-04T07-45-21.870360.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-04T07-45-21.870360.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-04T07-45-21.870360.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-04T07-45-21.870360.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-04T07-45-21.870360.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-04T07-45-21.870360.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-04T07-45-21.870360.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-04T07-45-21.870360.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-04T07-45-21.870360.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-04T07-45-21.870360.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-04T07-45-21.870360.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-04T07-45-21.870360.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-04T07-45-21.870360.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-04T07-45-21.870360.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-04T07-45-21.870360.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-04T07-45-21.870360.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-04T07-45-21.870360.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-04T07-45-21.870360.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-04T07-45-21.870360.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-04T07-45-21.870360.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-04T07-45-21.870360.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-04T07-45-21.870360.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-04T07-45-21.870360.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-04T07-45-21.870360.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-04T07-45-21.870360.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-04T07-45-21.870360.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-04T07-45-21.870360.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-10-04T07-45-21.870360.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-04T07-45-21.870360.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-04T07-45-21.870360.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-04T07-45-21.870360.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-10-04T07-45-21.870360.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-10-04T07-45-21.870360.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-04T07-45-21.870360.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-04T07-45-21.870360.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-04T07-45-21.870360.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-04T07-45-21.870360.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-04T07-45-21.870360.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-04T07-45-21.870360.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-04T07-45-21.870360.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-04T07-45-21.870360.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-04T07-45-21.870360.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-04T07-45-21.870360.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-04T07-45-21.870360.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-04T07-45-21.870360.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-04T07-45-21.870360.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-10-04T07-45-21.870360.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-04T07-45-21.870360.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-10-04T07-45-21.870360.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-04T07-45-21.870360.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_10_04T07_45_21.870360
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-04T07-45-21.870360.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-04T07-45-21.870360.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_10_04T07_45_21.870360
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-04T07-45-21.870360.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-04T07-45-21.870360.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_10_04T07_45_21.870360
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-04T07-45-21.870360.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-04T07-45-21.870360.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_10_04T07_45_21.870360
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-04T07-45-21.870360.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-04T07-45-21.870360.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_10_04T07_45_21.870360
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-04T07-45-21.870360.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-04T07-45-21.870360.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_10_04T07_45_21.870360
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-04T07-45-21.870360.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-04T07-45-21.870360.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_10_04T07_45_21.870360
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-04T07-45-21.870360.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-04T07-45-21.870360.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_10_04T07_45_21.870360
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-04T07-45-21.870360.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-04T07-45-21.870360.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_10_04T07_45_21.870360
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-04T07-45-21.870360.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-04T07-45-21.870360.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_10_04T07_45_21.870360
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-04T07-45-21.870360.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-04T07-45-21.870360.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_10_04T07_45_21.870360
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-04T07-45-21.870360.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-04T07-45-21.870360.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_10_04T07_45_21.870360
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-04T07-45-21.870360.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-04T07-45-21.870360.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_10_04T07_45_21.870360
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-04T07-45-21.870360.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-04T07-45-21.870360.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_10_04T07_45_21.870360
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-04T07-45-21.870360.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-04T07-45-21.870360.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_10_04T07_45_21.870360
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-04T07-45-21.870360.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-04T07-45-21.870360.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_10_04T07_45_21.870360
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-04T07-45-21.870360.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-04T07-45-21.870360.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_10_04T07_45_21.870360
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-04T07-45-21.870360.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-04T07-45-21.870360.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_10_04T07_45_21.870360
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-04T07-45-21.870360.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-04T07-45-21.870360.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_10_04T07_45_21.870360
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-04T07-45-21.870360.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-04T07-45-21.870360.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_10_04T07_45_21.870360
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-04T07-45-21.870360.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-04T07-45-21.870360.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_10_04T07_45_21.870360
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-04T07-45-21.870360.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-04T07-45-21.870360.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_10_04T07_45_21.870360
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-04T07-45-21.870360.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-04T07-45-21.870360.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_10_04T07_45_21.870360
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-04T07-45-21.870360.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-04T07-45-21.870360.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_10_04T07_45_21.870360
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-04T07-45-21.870360.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-04T07-45-21.870360.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_10_04T07_45_21.870360
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-04T07-45-21.870360.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-04T07-45-21.870360.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_10_04T07_45_21.870360
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-04T07-45-21.870360.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-04T07-45-21.870360.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_10_04T07_45_21.870360
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-04T07-45-21.870360.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-04T07-45-21.870360.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_10_04T07_45_21.870360
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-04T07-45-21.870360.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-04T07-45-21.870360.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_10_04T07_45_21.870360
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-04T07-45-21.870360.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-04T07-45-21.870360.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_10_04T07_45_21.870360
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-04T07-45-21.870360.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-04T07-45-21.870360.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_10_04T07_45_21.870360
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-04T07-45-21.870360.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-04T07-45-21.870360.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_10_04T07_45_21.870360
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-04T07-45-21.870360.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-04T07-45-21.870360.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_10_04T07_45_21.870360
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-04T07-45-21.870360.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-04T07-45-21.870360.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_10_04T07_45_21.870360
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-04T07-45-21.870360.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-04T07-45-21.870360.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_10_04T07_45_21.870360
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-10-04T07-45-21.870360.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-10-04T07-45-21.870360.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_10_04T07_45_21.870360
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-04T07-45-21.870360.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-04T07-45-21.870360.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_10_04T07_45_21.870360
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-04T07-45-21.870360.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-04T07-45-21.870360.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_10_04T07_45_21.870360
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-04T07-45-21.870360.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-04T07-45-21.870360.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_10_04T07_45_21.870360
path:
- '**/details_harness|hendrycksTest-management|5_2023-10-04T07-45-21.870360.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-10-04T07-45-21.870360.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_10_04T07_45_21.870360
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-10-04T07-45-21.870360.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-10-04T07-45-21.870360.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_10_04T07_45_21.870360
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-04T07-45-21.870360.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-04T07-45-21.870360.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_10_04T07_45_21.870360
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-04T07-45-21.870360.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-04T07-45-21.870360.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_10_04T07_45_21.870360
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-04T07-45-21.870360.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-04T07-45-21.870360.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_10_04T07_45_21.870360
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-04T07-45-21.870360.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-04T07-45-21.870360.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_10_04T07_45_21.870360
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-04T07-45-21.870360.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-04T07-45-21.870360.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_10_04T07_45_21.870360
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-04T07-45-21.870360.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-04T07-45-21.870360.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_10_04T07_45_21.870360
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-04T07-45-21.870360.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-04T07-45-21.870360.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_10_04T07_45_21.870360
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-04T07-45-21.870360.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-04T07-45-21.870360.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_10_04T07_45_21.870360
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-04T07-45-21.870360.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-04T07-45-21.870360.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_10_04T07_45_21.870360
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-04T07-45-21.870360.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-04T07-45-21.870360.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_10_04T07_45_21.870360
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-04T07-45-21.870360.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-04T07-45-21.870360.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_10_04T07_45_21.870360
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-04T07-45-21.870360.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-04T07-45-21.870360.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_10_04T07_45_21.870360
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-04T07-45-21.870360.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-04T07-45-21.870360.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_10_04T07_45_21.870360
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-10-04T07-45-21.870360.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-10-04T07-45-21.870360.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_10_04T07_45_21.870360
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-04T07-45-21.870360.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-04T07-45-21.870360.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_10_04T07_45_21.870360
path:
- '**/details_harness|hendrycksTest-virology|5_2023-10-04T07-45-21.870360.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-10-04T07-45-21.870360.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_10_04T07_45_21.870360
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-04T07-45-21.870360.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-04T07-45-21.870360.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_10_04T07_45_21.870360
path:
- '**/details_harness|truthfulqa:mc|0_2023-10-04T07-45-21.870360.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-10-04T07-45-21.870360.parquet'
- config_name: results
data_files:
- split: 2023_10_04T07_45_21.870360
path:
- results_2023-10-04T07-45-21.870360.parquet
- split: latest
path:
- results_2023-10-04T07-45-21.870360.parquet
---
# Dataset Card for Evaluation run of Devio/test-9k-fn
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/Devio/test-9k-fn
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [Devio/test-9k-fn](https://huggingface.co/Devio/test-9k-fn) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can, for instance, do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_Devio__test-9k-fn",
"harness_truthfulqa_mc_0",
split="train")
```
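If you only need the aggregated metrics, you can also list the available configurations and load the `results` configuration directly. A minimal sketch, assuming the config and split names follow the `configs` list above:
```python
from datasets import get_dataset_config_names, load_dataset

# List every configuration: one per evaluated task, plus "results".
configs = get_dataset_config_names("open-llm-leaderboard/details_Devio__test-9k-fn")
print(len(configs), configs[:5])

# "latest" always points to the most recent run for a given configuration.
results = load_dataset("open-llm-leaderboard/details_Devio__test-9k-fn",
	"results",
	split="latest")
print(results[0])
```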
## Latest results
These are the [latest results from run 2023-10-04T07:45:21.870360](https://huggingface.co/datasets/open-llm-leaderboard/details_Devio__test-9k-fn/blob/main/results_2023-10-04T07-45-21.870360.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.29934767850730726,
"acc_stderr": 0.033158996935405735,
"acc_norm": 0.3034282752932227,
"acc_norm_stderr": 0.03315857474708369,
"mc1": 0.23378212974296206,
"mc1_stderr": 0.014816195991931583,
"mc2": 0.3914546223201993,
"mc2_stderr": 0.013969580332280395
},
"harness|arc:challenge|25": {
"acc": 0.35665529010238906,
"acc_stderr": 0.013998056902620192,
"acc_norm": 0.4087030716723549,
"acc_norm_stderr": 0.014365750345427008
},
"harness|hellaswag|10": {
"acc": 0.5057757418840868,
"acc_stderr": 0.004989448490164429,
"acc_norm": 0.6944831706831308,
"acc_norm_stderr": 0.004596845936356623
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.23,
"acc_stderr": 0.04229525846816506,
"acc_norm": 0.23,
"acc_norm_stderr": 0.04229525846816506
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.28888888888888886,
"acc_stderr": 0.0391545063041425,
"acc_norm": 0.28888888888888886,
"acc_norm_stderr": 0.0391545063041425
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.3092105263157895,
"acc_stderr": 0.037610708698674805,
"acc_norm": 0.3092105263157895,
"acc_norm_stderr": 0.037610708698674805
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695235,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695235
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.3018867924528302,
"acc_stderr": 0.028254200344438655,
"acc_norm": 0.3018867924528302,
"acc_norm_stderr": 0.028254200344438655
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.2986111111111111,
"acc_stderr": 0.03827052357950756,
"acc_norm": 0.2986111111111111,
"acc_norm_stderr": 0.03827052357950756
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.32,
"acc_stderr": 0.046882617226215034,
"acc_norm": 0.32,
"acc_norm_stderr": 0.046882617226215034
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.32,
"acc_stderr": 0.046882617226215034,
"acc_norm": 0.32,
"acc_norm_stderr": 0.046882617226215034
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.27,
"acc_stderr": 0.044619604333847394,
"acc_norm": 0.27,
"acc_norm_stderr": 0.044619604333847394
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.2774566473988439,
"acc_stderr": 0.03414014007044037,
"acc_norm": 0.2774566473988439,
"acc_norm_stderr": 0.03414014007044037
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.19607843137254902,
"acc_stderr": 0.03950581861179963,
"acc_norm": 0.19607843137254902,
"acc_norm_stderr": 0.03950581861179963
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.32340425531914896,
"acc_stderr": 0.030579442773610337,
"acc_norm": 0.32340425531914896,
"acc_norm_stderr": 0.030579442773610337
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.20175438596491227,
"acc_stderr": 0.037752050135836386,
"acc_norm": 0.20175438596491227,
"acc_norm_stderr": 0.037752050135836386
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.3103448275862069,
"acc_stderr": 0.03855289616378947,
"acc_norm": 0.3103448275862069,
"acc_norm_stderr": 0.03855289616378947
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.22486772486772486,
"acc_stderr": 0.02150209607822914,
"acc_norm": 0.22486772486772486,
"acc_norm_stderr": 0.02150209607822914
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.3253968253968254,
"acc_stderr": 0.041905964388711366,
"acc_norm": 0.3253968253968254,
"acc_norm_stderr": 0.041905964388711366
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.32,
"acc_stderr": 0.04688261722621504,
"acc_norm": 0.32,
"acc_norm_stderr": 0.04688261722621504
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.2709677419354839,
"acc_stderr": 0.025284416114900156,
"acc_norm": 0.2709677419354839,
"acc_norm_stderr": 0.025284416114900156
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.18719211822660098,
"acc_stderr": 0.027444924966882618,
"acc_norm": 0.18719211822660098,
"acc_norm_stderr": 0.027444924966882618
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.33,
"acc_stderr": 0.04725815626252605,
"acc_norm": 0.33,
"acc_norm_stderr": 0.04725815626252605
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.21212121212121213,
"acc_stderr": 0.03192271569548299,
"acc_norm": 0.21212121212121213,
"acc_norm_stderr": 0.03192271569548299
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.3181818181818182,
"acc_stderr": 0.03318477333845331,
"acc_norm": 0.3181818181818182,
"acc_norm_stderr": 0.03318477333845331
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.35751295336787564,
"acc_stderr": 0.03458816042181005,
"acc_norm": 0.35751295336787564,
"acc_norm_stderr": 0.03458816042181005
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.3717948717948718,
"acc_stderr": 0.024503472557110936,
"acc_norm": 0.3717948717948718,
"acc_norm_stderr": 0.024503472557110936
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.28888888888888886,
"acc_stderr": 0.027634907264178544,
"acc_norm": 0.28888888888888886,
"acc_norm_stderr": 0.027634907264178544
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.2815126050420168,
"acc_stderr": 0.029213549414372163,
"acc_norm": 0.2815126050420168,
"acc_norm_stderr": 0.029213549414372163
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.2781456953642384,
"acc_stderr": 0.03658603262763743,
"acc_norm": 0.2781456953642384,
"acc_norm_stderr": 0.03658603262763743
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.3211009174311927,
"acc_stderr": 0.020018149772733747,
"acc_norm": 0.3211009174311927,
"acc_norm_stderr": 0.020018149772733747
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.3611111111111111,
"acc_stderr": 0.032757734861009996,
"acc_norm": 0.3611111111111111,
"acc_norm_stderr": 0.032757734861009996
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.2647058823529412,
"acc_stderr": 0.030964517926923403,
"acc_norm": 0.2647058823529412,
"acc_norm_stderr": 0.030964517926923403
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.3333333333333333,
"acc_stderr": 0.030685820596610795,
"acc_norm": 0.3333333333333333,
"acc_norm_stderr": 0.030685820596610795
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.34080717488789236,
"acc_stderr": 0.031811497470553604,
"acc_norm": 0.34080717488789236,
"acc_norm_stderr": 0.031811497470553604
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.3282442748091603,
"acc_stderr": 0.04118438565806298,
"acc_norm": 0.3282442748091603,
"acc_norm_stderr": 0.04118438565806298
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.2644628099173554,
"acc_stderr": 0.04026187527591205,
"acc_norm": 0.2644628099173554,
"acc_norm_stderr": 0.04026187527591205
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.25925925925925924,
"acc_stderr": 0.042365112580946336,
"acc_norm": 0.25925925925925924,
"acc_norm_stderr": 0.042365112580946336
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.25153374233128833,
"acc_stderr": 0.034089978868575295,
"acc_norm": 0.25153374233128833,
"acc_norm_stderr": 0.034089978868575295
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.36607142857142855,
"acc_stderr": 0.04572372358737431,
"acc_norm": 0.36607142857142855,
"acc_norm_stderr": 0.04572372358737431
},
"harness|hendrycksTest-management|5": {
"acc": 0.23300970873786409,
"acc_stderr": 0.041858325989283136,
"acc_norm": 0.23300970873786409,
"acc_norm_stderr": 0.041858325989283136
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.3974358974358974,
"acc_stderr": 0.032059534537892925,
"acc_norm": 0.3974358974358974,
"acc_norm_stderr": 0.032059534537892925
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.2554278416347382,
"acc_stderr": 0.015594955384455772,
"acc_norm": 0.2554278416347382,
"acc_norm_stderr": 0.015594955384455772
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.2774566473988439,
"acc_stderr": 0.024105712607754307,
"acc_norm": 0.2774566473988439,
"acc_norm_stderr": 0.024105712607754307
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.23798882681564246,
"acc_stderr": 0.014242630070574877,
"acc_norm": 0.23798882681564246,
"acc_norm_stderr": 0.014242630070574877
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.39215686274509803,
"acc_stderr": 0.02795604616542451,
"acc_norm": 0.39215686274509803,
"acc_norm_stderr": 0.02795604616542451
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.2604501607717042,
"acc_stderr": 0.02492672322484555,
"acc_norm": 0.2604501607717042,
"acc_norm_stderr": 0.02492672322484555
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.25925925925925924,
"acc_stderr": 0.02438366553103545,
"acc_norm": 0.25925925925925924,
"acc_norm_stderr": 0.02438366553103545
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.2801418439716312,
"acc_stderr": 0.026789172351140242,
"acc_norm": 0.2801418439716312,
"acc_norm_stderr": 0.026789172351140242
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.2666232073011734,
"acc_stderr": 0.011293836031612131,
"acc_norm": 0.2666232073011734,
"acc_norm_stderr": 0.011293836031612131
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.31985294117647056,
"acc_stderr": 0.02833295951403124,
"acc_norm": 0.31985294117647056,
"acc_norm_stderr": 0.02833295951403124
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.2549019607843137,
"acc_stderr": 0.017630827375148383,
"acc_norm": 0.2549019607843137,
"acc_norm_stderr": 0.017630827375148383
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.2818181818181818,
"acc_stderr": 0.04309118709946458,
"acc_norm": 0.2818181818181818,
"acc_norm_stderr": 0.04309118709946458
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.3306122448979592,
"acc_stderr": 0.030116426296540592,
"acc_norm": 0.3306122448979592,
"acc_norm_stderr": 0.030116426296540592
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.31840796019900497,
"acc_stderr": 0.03294118479054096,
"acc_norm": 0.31840796019900497,
"acc_norm_stderr": 0.03294118479054096
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.35,
"acc_stderr": 0.0479372485441102,
"acc_norm": 0.35,
"acc_norm_stderr": 0.0479372485441102
},
"harness|hendrycksTest-virology|5": {
"acc": 0.30120481927710846,
"acc_stderr": 0.0357160923005348,
"acc_norm": 0.30120481927710846,
"acc_norm_stderr": 0.0357160923005348
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.32748538011695905,
"acc_stderr": 0.035993357714560276,
"acc_norm": 0.32748538011695905,
"acc_norm_stderr": 0.035993357714560276
},
"harness|truthfulqa:mc|0": {
"mc1": 0.23378212974296206,
"mc1_stderr": 0.014816195991931583,
"mc2": 0.3914546223201993,
"mc2_stderr": 0.013969580332280395
}
}
```
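The top-level "all" block appears to be a plain average of the per-task metrics. A minimal sketch of that aggregation over a truncated, hand-pasted excerpt of the dictionary above; with every per-task entry included, the mean should land near the reported 0.2993:
```python
# Truncated excerpt of the dictionary printed above (hypothetical variable name).
latest = {
    "all": {"acc": 0.29934767850730726},
    "harness|arc:challenge|25": {"acc": 0.35665529010238906},
    "harness|hellaswag|10": {"acc": 0.5057757418840868},
    "harness|hendrycksTest-abstract_algebra|5": {"acc": 0.23},
}

# Average "acc" over every per-task entry, skipping the "all" aggregate itself
# and entries (like truthfulqa:mc) that only report mc1/mc2.
per_task = {k: v for k, v in latest.items() if k != "all" and "acc" in v}
print(sum(v["acc"] for v in per_task.values()) / len(per_task))
```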
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
zhili1990/KS_Ashare_announce_NER | ---
dataset_info:
features:
- name: text
dtype: string
- name: entity_list
list:
- name: type
sequence: string
- name: argument
dtype: string
- name: start
dtype: int64
- name: end
dtype: int64
- name: kstext
dtype: string
splits:
- name: train
num_bytes: 6730713
num_examples: 9277
download_size: 2177402
dataset_size: 6730713
---
# Dataset Card for "KS_Ashare_announce_NER"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
ovior/twitter_dataset_1713199355 | ---
dataset_info:
features:
- name: id
dtype: string
- name: tweet_content
dtype: string
- name: user_name
dtype: string
- name: user_id
dtype: string
- name: created_at
dtype: string
- name: url
dtype: string
- name: favourite_count
dtype: int64
- name: scraped_at
dtype: string
- name: image_urls
dtype: string
splits:
- name: train
num_bytes: 2415483
num_examples: 7226
download_size: 1381962
dataset_size: 2415483
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
CyberHarem/prince_of_wales_azurlane | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of prince_of_wales/プリンス・オブ・ウェールズ/威尔士亲王 (Azur Lane)
This is the dataset of prince_of_wales/プリンス・オブ・ウェールズ/威尔士亲王 (Azur Lane), containing 241 images and their tags.
The core tags of this character are `blonde_hair, red_eyes, breasts, braid, short_hair, large_breasts, bangs, earrings, crown_braid, hair_ornament`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:--------------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 241 | 315.30 MiB | [Download](https://huggingface.co/datasets/CyberHarem/prince_of_wales_azurlane/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 241 | 180.19 MiB | [Download](https://huggingface.co/datasets/CyberHarem/prince_of_wales_azurlane/resolve/main/dataset-800.zip) | IMG+TXT | Dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 549 | 363.99 MiB | [Download](https://huggingface.co/datasets/CyberHarem/prince_of_wales_azurlane/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 241 | 277.96 MiB | [Download](https://huggingface.co/datasets/CyberHarem/prince_of_wales_azurlane/resolve/main/dataset-1200.zip) | IMG+TXT | Dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 549 | 513.10 MiB | [Download](https://huggingface.co/datasets/CyberHarem/prince_of_wales_azurlane/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for loading with [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html). If you need it, just run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/prince_of_wales_azurlane',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
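The other packages from the table above can be fetched the same way. A minimal sketch for the `800` IMG+TXT package, with the filename taken from its download link:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download

# download the 800px IMG+TXT archive listed in the package table
zip_file = hf_hub_download(
    repo_id='CyberHarem/prince_of_wales_azurlane',
    repo_type='dataset',
    filename='dataset-800.zip',
)

# extract the paired image/tag-text files to your directory
img_txt_dir = 'dataset_800'
os.makedirs(img_txt_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
    zf.extractall(img_txt_dir)
```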
## List of Clusters
List of tag clustering results; some outfits may be mined from these clusters.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 9 |  |  |  |  |  | 1girl, solo, pleated_skirt, jewelry, white_thighhighs, black_skirt, french_braid, holding_sword, white_gloves, zettai_ryouiki, looking_at_viewer, simple_background, white_background, red_cape |
| 1 | 6 |  |  |  |  |  | 1girl, blue_sky, day, looking_at_viewer, outdoors, solo, white_bikini, bare_shoulders, blush, closed_mouth, cloud, hair_flower, smile, swept_bangs, cleavage, collarbone, covered_nipples, navel, ocean, red_choker, beach, bracelet, mismatched_bikini, water |
| 2 | 8 |  |  |  |  |  | 1girl, hair_flower, looking_at_viewer, solo, swept_bangs, white_bikini, cleavage, bracelet, navel, simple_background, smile, bare_shoulders, collarbone, red_choker, bikini_top_only, closed_mouth, sunglasses, white_background, blush, french_braid, medium_hair, sarong |
| 3 | 5 |  |  |  |  |  | 1girl, bare_shoulders, solo, closed_mouth, covered_nipples, hair_flower, holding_eyewear, looking_at_viewer, looking_back, sideboob, simple_background, smile, sunglasses, white_background, white_bikini, cowboy_shot, eyewear_removed, from_behind, red_bikini, red_choker, swept_bangs, bracelet, butt_crack, mismatched_bikini, official_alternate_costume |
| 4 | 6 |  |  |  |  |  | 1boy, 1girl, blush, hair_flower, hetero, paizuri, penis, solo_focus, choker, bikini, bracelet, collarbone, huge_breasts, looking_at_viewer, swept_bangs, bar_censor, breasts_squeezed_together, closed_mouth, cum, long_hair, nipples, sweat |
| 5 | 14 |  |  |  |  |  | bare_shoulders, looking_at_viewer, race_queen, 1girl, cleavage, solo, black_gloves, criss-cross_halter, dress, elbow_gloves, smile, white_thighhighs, covered_navel, holding, parted_lips, checkered_flag, detached_sleeves, long_hair, skindentation, swept_bangs, blush, official_alternate_costume, ponytail, sitting |
| 6 | 5 |  |  |  |  |  | 1girl, blush, looking_at_viewer, navel, nipples, sweat, armpits, arms_up, solo, spread_legs, anus, arms_behind_head, completely_nude, female_pubic_hair, jewelry, mosaic_censoring, open_mouth, anal_hair, armpit_hair, ass, beach, blurry_background, boots, collarbone, french_braid, gaping, medium_hair, mole_on_breast, outdoors, palm_tree, pussy_juice, red_footwear, sky, smile, spread_pussy, squatting, teeth, thighs, tongue_out |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | solo | pleated_skirt | jewelry | white_thighhighs | black_skirt | french_braid | holding_sword | white_gloves | zettai_ryouiki | looking_at_viewer | simple_background | white_background | red_cape | blue_sky | day | outdoors | white_bikini | bare_shoulders | blush | closed_mouth | cloud | hair_flower | smile | swept_bangs | cleavage | collarbone | covered_nipples | navel | ocean | red_choker | beach | bracelet | mismatched_bikini | water | bikini_top_only | sunglasses | medium_hair | sarong | holding_eyewear | looking_back | sideboob | cowboy_shot | eyewear_removed | from_behind | red_bikini | butt_crack | official_alternate_costume | 1boy | hetero | paizuri | penis | solo_focus | choker | bikini | huge_breasts | bar_censor | breasts_squeezed_together | cum | long_hair | nipples | sweat | race_queen | black_gloves | criss-cross_halter | dress | elbow_gloves | covered_navel | holding | parted_lips | checkered_flag | detached_sleeves | skindentation | ponytail | sitting | armpits | arms_up | spread_legs | anus | arms_behind_head | completely_nude | female_pubic_hair | mosaic_censoring | open_mouth | anal_hair | armpit_hair | ass | blurry_background | boots | gaping | mole_on_breast | palm_tree | pussy_juice | red_footwear | sky | spread_pussy | squatting | teeth | thighs | tongue_out |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:-------|:----------------|:----------|:-------------------|:--------------|:---------------|:----------------|:---------------|:-----------------|:--------------------|:--------------------|:-------------------|:-----------|:-----------|:------|:-----------|:---------------|:-----------------|:--------|:---------------|:--------|:--------------|:--------|:--------------|:-----------|:-------------|:------------------|:--------|:--------|:-------------|:--------|:-----------|:--------------------|:--------|:------------------|:-------------|:--------------|:---------|:------------------|:---------------|:-----------|:--------------|:------------------|:--------------|:-------------|:-------------|:-----------------------------|:-------|:---------|:----------|:--------|:-------------|:---------|:---------|:---------------|:-------------|:----------------------------|:------|:------------|:----------|:--------|:-------------|:---------------|:---------------------|:--------|:---------------|:----------------|:----------|:--------------|:-----------------|:-------------------|:----------------|:-----------|:----------|:----------|:----------|:--------------|:-------|:-------------------|:------------------|:--------------------|:-------------------|:-------------|:------------|:--------------|:------|:--------------------|:--------|:---------|:-----------------|:------------|:--------------|:---------------|:------|:---------------|:------------|:--------|:---------|:-------------|
| 0 | 9 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 1 | 6 |  |  |  |  |  | X | X | | | | | | | | | X | | | | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 2 | 8 |  |  |  |  |  | X | X | | | | | X | | | | X | X | X | | | | | X | X | X | X | | X | X | X | X | X | | X | | X | | X | | | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 3 | 5 |  |  |  |  |  | X | X | | | | | | | | | X | X | X | | | | | X | X | | X | | X | X | X | | | X | | | X | | X | X | | | X | | | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 4 | 6 |  |  |  |  |  | X | | | | | | | | | | X | | | | | | | | | X | X | | X | | X | | X | | | | | | X | | | | | | | | | | | | | | | | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 5 | 14 |  |  |  |  |  | X | X | | | X | | | | | | X | | | | | | | | X | X | | | | X | X | X | | | | | | | | | | | | | | | | | | | | | | X | | | | | | | | | | | | X | | | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | |
| 6 | 5 |  |  |  |  |  | X | X | | X | | | X | | | | X | | | | | | X | | | X | | | | X | | | X | | X | | | X | | | | | | X | | | | | | | | | | | | | | | | | | | | | | | X | X | | | | | | | | | | | | | | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X |
|
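The leaderboard detail cards that follow all expose their runs through the split layout declared in their `configs` section: one split per timestamped run plus a `latest` alias. As a minimal sketch (assuming the `datasets` library is installed and the Hub repo is reachable), any config from the AA051610/FT card below can be loaded the same way as in its embedded example:

```python
from datasets import load_dataset

# "latest" always points at the most recent timestamped run;
# individual runs are also exposed as splits named after their
# timestamp (e.g. "2024_01_06T05_05_54.283989" in the card below).
data = load_dataset(
    "open-llm-leaderboard/details_AA051610__FT",
    "harness_gsm8k_5",
    split="latest",
)
print(data)
```

The same pattern applies to every `harness_*` config listed in the YAML below.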
open-llm-leaderboard/details_AA051610__FT | ---
pretty_name: Evaluation run of AA051610/FT
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [AA051610/FT](https://huggingface.co/AA051610/FT) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of\
\ the evaluated tasks.\n\nThe dataset has been created from 4 runs. Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_AA051610__FT\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-01-06T05:05:54.283989](https://huggingface.co/datasets/open-llm-leaderboard/details_AA051610__FT/blob/main/results_2024-01-06T05-05-54.283989.json) (note
\ that there might be results for other tasks in the repo if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6938684291636229,\n\
\ \"acc_stderr\": 0.03064246100232873,\n \"acc_norm\": 0.6979884336114688,\n\
\ \"acc_norm_stderr\": 0.03123750719199374,\n \"mc1\": 0.42717258261933905,\n\
\ \"mc1_stderr\": 0.017316834410963933,\n \"mc2\": 0.5988138522091946,\n\
\ \"mc2_stderr\": 0.015356725964661566\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.6015358361774744,\n \"acc_stderr\": 0.01430694605273556,\n\
\ \"acc_norm\": 0.6305460750853242,\n \"acc_norm_stderr\": 0.014104578366491888\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6243776140211114,\n\
\ \"acc_stderr\": 0.004832934529120794,\n \"acc_norm\": 0.8278231428002389,\n\
\ \"acc_norm_stderr\": 0.003767625141611702\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.35,\n \"acc_stderr\": 0.047937248544110196,\n \
\ \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.047937248544110196\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6518518518518519,\n\
\ \"acc_stderr\": 0.041153246103369526,\n \"acc_norm\": 0.6518518518518519,\n\
\ \"acc_norm_stderr\": 0.041153246103369526\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.8092105263157895,\n \"acc_stderr\": 0.031975658210325,\n\
\ \"acc_norm\": 0.8092105263157895,\n \"acc_norm_stderr\": 0.031975658210325\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.71,\n\
\ \"acc_stderr\": 0.04560480215720684,\n \"acc_norm\": 0.71,\n \
\ \"acc_norm_stderr\": 0.04560480215720684\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.7547169811320755,\n \"acc_stderr\": 0.026480357179895702,\n\
\ \"acc_norm\": 0.7547169811320755,\n \"acc_norm_stderr\": 0.026480357179895702\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7916666666666666,\n\
\ \"acc_stderr\": 0.03396116205845335,\n \"acc_norm\": 0.7916666666666666,\n\
\ \"acc_norm_stderr\": 0.03396116205845335\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.53,\n \"acc_stderr\": 0.05016135580465919,\n \
\ \"acc_norm\": 0.53,\n \"acc_norm_stderr\": 0.05016135580465919\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.58,\n \"acc_stderr\": 0.049604496374885836,\n \"acc_norm\": 0.58,\n\
\ \"acc_norm_stderr\": 0.049604496374885836\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.42,\n \"acc_stderr\": 0.049604496374885836,\n \
\ \"acc_norm\": 0.42,\n \"acc_norm_stderr\": 0.049604496374885836\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6705202312138728,\n\
\ \"acc_stderr\": 0.03583901754736412,\n \"acc_norm\": 0.6705202312138728,\n\
\ \"acc_norm_stderr\": 0.03583901754736412\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.39215686274509803,\n \"acc_stderr\": 0.04858083574266345,\n\
\ \"acc_norm\": 0.39215686274509803,\n \"acc_norm_stderr\": 0.04858083574266345\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.8,\n \"acc_stderr\": 0.04020151261036845,\n \"acc_norm\": 0.8,\n\
\ \"acc_norm_stderr\": 0.04020151261036845\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.7319148936170212,\n \"acc_stderr\": 0.028957342788342343,\n\
\ \"acc_norm\": 0.7319148936170212,\n \"acc_norm_stderr\": 0.028957342788342343\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.5,\n\
\ \"acc_stderr\": 0.047036043419179864,\n \"acc_norm\": 0.5,\n \
\ \"acc_norm_stderr\": 0.047036043419179864\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.6896551724137931,\n \"acc_stderr\": 0.038552896163789485,\n\
\ \"acc_norm\": 0.6896551724137931,\n \"acc_norm_stderr\": 0.038552896163789485\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.5343915343915344,\n \"acc_stderr\": 0.025690321762493848,\n \"\
acc_norm\": 0.5343915343915344,\n \"acc_norm_stderr\": 0.025690321762493848\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.46825396825396826,\n\
\ \"acc_stderr\": 0.04463112720677171,\n \"acc_norm\": 0.46825396825396826,\n\
\ \"acc_norm_stderr\": 0.04463112720677171\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.43,\n \"acc_stderr\": 0.049756985195624284,\n \
\ \"acc_norm\": 0.43,\n \"acc_norm_stderr\": 0.049756985195624284\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\"\
: 0.8290322580645161,\n \"acc_stderr\": 0.021417242936321582,\n \"\
acc_norm\": 0.8290322580645161,\n \"acc_norm_stderr\": 0.021417242936321582\n\
\ },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\"\
: 0.5566502463054187,\n \"acc_stderr\": 0.03495334582162934,\n \"\
acc_norm\": 0.5566502463054187,\n \"acc_norm_stderr\": 0.03495334582162934\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.74,\n \"acc_stderr\": 0.0440844002276808,\n \"acc_norm\"\
: 0.74,\n \"acc_norm_stderr\": 0.0440844002276808\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7878787878787878,\n \"acc_stderr\": 0.031922715695483016,\n\
\ \"acc_norm\": 0.7878787878787878,\n \"acc_norm_stderr\": 0.031922715695483016\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.8434343434343434,\n \"acc_stderr\": 0.025890520358141454,\n \"\
acc_norm\": 0.8434343434343434,\n \"acc_norm_stderr\": 0.025890520358141454\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.917098445595855,\n \"acc_stderr\": 0.01989934131572178,\n\
\ \"acc_norm\": 0.917098445595855,\n \"acc_norm_stderr\": 0.01989934131572178\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.7487179487179487,\n \"acc_stderr\": 0.02199201666237056,\n \
\ \"acc_norm\": 0.7487179487179487,\n \"acc_norm_stderr\": 0.02199201666237056\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.35555555555555557,\n \"acc_stderr\": 0.029185714949857392,\n \
\ \"acc_norm\": 0.35555555555555557,\n \"acc_norm_stderr\": 0.029185714949857392\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.7521008403361344,\n \"acc_stderr\": 0.028047967224176896,\n\
\ \"acc_norm\": 0.7521008403361344,\n \"acc_norm_stderr\": 0.028047967224176896\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.423841059602649,\n \"acc_stderr\": 0.04034846678603396,\n \"acc_norm\"\
: 0.423841059602649,\n \"acc_norm_stderr\": 0.04034846678603396\n },\n\
\ \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.8807339449541285,\n\
\ \"acc_stderr\": 0.01389572929258895,\n \"acc_norm\": 0.8807339449541285,\n\
\ \"acc_norm_stderr\": 0.01389572929258895\n },\n \"harness|hendrycksTest-high_school_statistics|5\"\
: {\n \"acc\": 0.5694444444444444,\n \"acc_stderr\": 0.03376922151252335,\n\
\ \"acc_norm\": 0.5694444444444444,\n \"acc_norm_stderr\": 0.03376922151252335\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.8823529411764706,\n \"acc_stderr\": 0.022613286601132012,\n \"\
acc_norm\": 0.8823529411764706,\n \"acc_norm_stderr\": 0.022613286601132012\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.890295358649789,\n \"acc_stderr\": 0.020343400734868837,\n \
\ \"acc_norm\": 0.890295358649789,\n \"acc_norm_stderr\": 0.020343400734868837\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.7533632286995515,\n\
\ \"acc_stderr\": 0.028930413120910888,\n \"acc_norm\": 0.7533632286995515,\n\
\ \"acc_norm_stderr\": 0.028930413120910888\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.8396946564885496,\n \"acc_stderr\": 0.03217829420744631,\n\
\ \"acc_norm\": 0.8396946564885496,\n \"acc_norm_stderr\": 0.03217829420744631\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.8181818181818182,\n \"acc_stderr\": 0.03520893951097652,\n \"\
acc_norm\": 0.8181818181818182,\n \"acc_norm_stderr\": 0.03520893951097652\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.8148148148148148,\n\
\ \"acc_stderr\": 0.03755265865037181,\n \"acc_norm\": 0.8148148148148148,\n\
\ \"acc_norm_stderr\": 0.03755265865037181\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7852760736196319,\n \"acc_stderr\": 0.03226219377286775,\n\
\ \"acc_norm\": 0.7852760736196319,\n \"acc_norm_stderr\": 0.03226219377286775\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.5446428571428571,\n\
\ \"acc_stderr\": 0.04726835553719097,\n \"acc_norm\": 0.5446428571428571,\n\
\ \"acc_norm_stderr\": 0.04726835553719097\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.8155339805825242,\n \"acc_stderr\": 0.03840423627288276,\n\
\ \"acc_norm\": 0.8155339805825242,\n \"acc_norm_stderr\": 0.03840423627288276\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.905982905982906,\n\
\ \"acc_stderr\": 0.01911989279892498,\n \"acc_norm\": 0.905982905982906,\n\
\ \"acc_norm_stderr\": 0.01911989279892498\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.84,\n \"acc_stderr\": 0.03684529491774709,\n \
\ \"acc_norm\": 0.84,\n \"acc_norm_stderr\": 0.03684529491774709\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8722860791826309,\n\
\ \"acc_stderr\": 0.011935626313999878,\n \"acc_norm\": 0.8722860791826309,\n\
\ \"acc_norm_stderr\": 0.011935626313999878\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.7861271676300579,\n \"acc_stderr\": 0.022075709251757177,\n\
\ \"acc_norm\": 0.7861271676300579,\n \"acc_norm_stderr\": 0.022075709251757177\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.38994413407821227,\n\
\ \"acc_stderr\": 0.01631237662921307,\n \"acc_norm\": 0.38994413407821227,\n\
\ \"acc_norm_stderr\": 0.01631237662921307\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.7712418300653595,\n \"acc_stderr\": 0.02405102973991225,\n\
\ \"acc_norm\": 0.7712418300653595,\n \"acc_norm_stderr\": 0.02405102973991225\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7684887459807074,\n\
\ \"acc_stderr\": 0.023956532766639133,\n \"acc_norm\": 0.7684887459807074,\n\
\ \"acc_norm_stderr\": 0.023956532766639133\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.7839506172839507,\n \"acc_stderr\": 0.022899162918445796,\n\
\ \"acc_norm\": 0.7839506172839507,\n \"acc_norm_stderr\": 0.022899162918445796\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.5390070921985816,\n \"acc_stderr\": 0.02973659252642444,\n \
\ \"acc_norm\": 0.5390070921985816,\n \"acc_norm_stderr\": 0.02973659252642444\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.5332464146023468,\n\
\ \"acc_stderr\": 0.012741974333897213,\n \"acc_norm\": 0.5332464146023468,\n\
\ \"acc_norm_stderr\": 0.012741974333897213\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.7463235294117647,\n \"acc_stderr\": 0.026431329870789524,\n\
\ \"acc_norm\": 0.7463235294117647,\n \"acc_norm_stderr\": 0.026431329870789524\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.7450980392156863,\n \"acc_stderr\": 0.017630827375148383,\n \
\ \"acc_norm\": 0.7450980392156863,\n \"acc_norm_stderr\": 0.017630827375148383\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.7,\n\
\ \"acc_stderr\": 0.04389311454644287,\n \"acc_norm\": 0.7,\n \
\ \"acc_norm_stderr\": 0.04389311454644287\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.7591836734693878,\n \"acc_stderr\": 0.02737294220178816,\n\
\ \"acc_norm\": 0.7591836734693878,\n \"acc_norm_stderr\": 0.02737294220178816\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8855721393034826,\n\
\ \"acc_stderr\": 0.022509345325101706,\n \"acc_norm\": 0.8855721393034826,\n\
\ \"acc_norm_stderr\": 0.022509345325101706\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.92,\n \"acc_stderr\": 0.0272659924344291,\n \
\ \"acc_norm\": 0.92,\n \"acc_norm_stderr\": 0.0272659924344291\n },\n\
\ \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5421686746987951,\n\
\ \"acc_stderr\": 0.0387862677100236,\n \"acc_norm\": 0.5421686746987951,\n\
\ \"acc_norm_stderr\": 0.0387862677100236\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8713450292397661,\n \"acc_stderr\": 0.025679342723276915,\n\
\ \"acc_norm\": 0.8713450292397661,\n \"acc_norm_stderr\": 0.025679342723276915\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.42717258261933905,\n\
\ \"mc1_stderr\": 0.017316834410963933,\n \"mc2\": 0.5988138522091946,\n\
\ \"mc2_stderr\": 0.015356725964661566\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7963693764798737,\n \"acc_stderr\": 0.011317798781626922\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.5807429871114481,\n \
\ \"acc_stderr\": 0.013591720959042115\n }\n}\n```"
repo_url: https://huggingface.co/AA051610/FT
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_01_06T02_53_50.876104
path:
- '**/details_harness|arc:challenge|25_2024-01-06T02-53-50.876104.parquet'
- split: 2024_01_06T02_58_54.903140
path:
- '**/details_harness|arc:challenge|25_2024-01-06T02-58-54.903140.parquet'
- split: 2024_01_06T05_04_05.292805
path:
- '**/details_harness|arc:challenge|25_2024-01-06T05-04-05.292805.parquet'
- split: 2024_01_06T05_05_54.283989
path:
- '**/details_harness|arc:challenge|25_2024-01-06T05-05-54.283989.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-01-06T05-05-54.283989.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_01_06T02_53_50.876104
path:
- '**/details_harness|gsm8k|5_2024-01-06T02-53-50.876104.parquet'
- split: 2024_01_06T02_58_54.903140
path:
- '**/details_harness|gsm8k|5_2024-01-06T02-58-54.903140.parquet'
- split: 2024_01_06T05_04_05.292805
path:
- '**/details_harness|gsm8k|5_2024-01-06T05-04-05.292805.parquet'
- split: 2024_01_06T05_05_54.283989
path:
- '**/details_harness|gsm8k|5_2024-01-06T05-05-54.283989.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-01-06T05-05-54.283989.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_01_06T02_53_50.876104
path:
- '**/details_harness|hellaswag|10_2024-01-06T02-53-50.876104.parquet'
- split: 2024_01_06T02_58_54.903140
path:
- '**/details_harness|hellaswag|10_2024-01-06T02-58-54.903140.parquet'
- split: 2024_01_06T05_04_05.292805
path:
- '**/details_harness|hellaswag|10_2024-01-06T05-04-05.292805.parquet'
- split: 2024_01_06T05_05_54.283989
path:
- '**/details_harness|hellaswag|10_2024-01-06T05-05-54.283989.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-01-06T05-05-54.283989.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_01_06T02_53_50.876104
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-06T02-53-50.876104.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-06T02-53-50.876104.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-06T02-53-50.876104.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-06T02-53-50.876104.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-06T02-53-50.876104.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-06T02-53-50.876104.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-06T02-53-50.876104.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-06T02-53-50.876104.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-06T02-53-50.876104.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-06T02-53-50.876104.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-06T02-53-50.876104.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-06T02-53-50.876104.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-06T02-53-50.876104.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-06T02-53-50.876104.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-06T02-53-50.876104.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-06T02-53-50.876104.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-06T02-53-50.876104.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-06T02-53-50.876104.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-06T02-53-50.876104.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-06T02-53-50.876104.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-06T02-53-50.876104.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-06T02-53-50.876104.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-06T02-53-50.876104.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-06T02-53-50.876104.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-06T02-53-50.876104.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-06T02-53-50.876104.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-06T02-53-50.876104.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-06T02-53-50.876104.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-06T02-53-50.876104.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-06T02-53-50.876104.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-06T02-53-50.876104.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-06T02-53-50.876104.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-06T02-53-50.876104.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-06T02-53-50.876104.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-01-06T02-53-50.876104.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-06T02-53-50.876104.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-06T02-53-50.876104.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-06T02-53-50.876104.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-01-06T02-53-50.876104.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-01-06T02-53-50.876104.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-06T02-53-50.876104.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-06T02-53-50.876104.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-06T02-53-50.876104.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-06T02-53-50.876104.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-06T02-53-50.876104.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-06T02-53-50.876104.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-06T02-53-50.876104.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-06T02-53-50.876104.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-06T02-53-50.876104.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-06T02-53-50.876104.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-06T02-53-50.876104.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-06T02-53-50.876104.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-06T02-53-50.876104.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-01-06T02-53-50.876104.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-06T02-53-50.876104.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-01-06T02-53-50.876104.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-06T02-53-50.876104.parquet'
- split: 2024_01_06T02_58_54.903140
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-06T02-58-54.903140.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-06T02-58-54.903140.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-06T02-58-54.903140.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-06T02-58-54.903140.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-06T02-58-54.903140.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-06T02-58-54.903140.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-06T02-58-54.903140.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-06T02-58-54.903140.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-06T02-58-54.903140.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-06T02-58-54.903140.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-06T02-58-54.903140.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-06T02-58-54.903140.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-06T02-58-54.903140.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-06T02-58-54.903140.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-06T02-58-54.903140.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-06T02-58-54.903140.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-06T02-58-54.903140.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-06T02-58-54.903140.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-06T02-58-54.903140.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-06T02-58-54.903140.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-06T02-58-54.903140.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-06T02-58-54.903140.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-06T02-58-54.903140.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-06T02-58-54.903140.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-06T02-58-54.903140.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-06T02-58-54.903140.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-06T02-58-54.903140.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-06T02-58-54.903140.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-06T02-58-54.903140.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-06T02-58-54.903140.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-06T02-58-54.903140.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-06T02-58-54.903140.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-06T02-58-54.903140.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-06T02-58-54.903140.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-01-06T02-58-54.903140.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-06T02-58-54.903140.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-06T02-58-54.903140.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-06T02-58-54.903140.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-01-06T02-58-54.903140.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-01-06T02-58-54.903140.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-06T02-58-54.903140.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-06T02-58-54.903140.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-06T02-58-54.903140.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-06T02-58-54.903140.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-06T02-58-54.903140.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-06T02-58-54.903140.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-06T02-58-54.903140.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-06T02-58-54.903140.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-06T02-58-54.903140.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-06T02-58-54.903140.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-06T02-58-54.903140.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-06T02-58-54.903140.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-06T02-58-54.903140.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-01-06T02-58-54.903140.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-06T02-58-54.903140.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-01-06T02-58-54.903140.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-06T02-58-54.903140.parquet'
- split: 2024_01_06T05_04_05.292805
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-06T05-04-05.292805.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-06T05-04-05.292805.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-06T05-04-05.292805.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-06T05-04-05.292805.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-06T05-04-05.292805.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-06T05-04-05.292805.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-06T05-04-05.292805.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-06T05-04-05.292805.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-06T05-04-05.292805.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-06T05-04-05.292805.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-06T05-04-05.292805.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-06T05-04-05.292805.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-06T05-04-05.292805.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-06T05-04-05.292805.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-06T05-04-05.292805.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-06T05-04-05.292805.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-06T05-04-05.292805.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-06T05-04-05.292805.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-06T05-04-05.292805.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-06T05-04-05.292805.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-06T05-04-05.292805.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-06T05-04-05.292805.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-06T05-04-05.292805.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-06T05-04-05.292805.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-06T05-04-05.292805.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-06T05-04-05.292805.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-06T05-04-05.292805.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-06T05-04-05.292805.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-06T05-04-05.292805.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-06T05-04-05.292805.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-06T05-04-05.292805.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-06T05-04-05.292805.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-06T05-04-05.292805.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-06T05-04-05.292805.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-01-06T05-04-05.292805.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-06T05-04-05.292805.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-06T05-04-05.292805.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-06T05-04-05.292805.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-01-06T05-04-05.292805.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-01-06T05-04-05.292805.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-06T05-04-05.292805.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-06T05-04-05.292805.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-06T05-04-05.292805.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-06T05-04-05.292805.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-06T05-04-05.292805.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-06T05-04-05.292805.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-06T05-04-05.292805.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-06T05-04-05.292805.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-06T05-04-05.292805.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-06T05-04-05.292805.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-06T05-04-05.292805.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-06T05-04-05.292805.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-06T05-04-05.292805.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-01-06T05-04-05.292805.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-06T05-04-05.292805.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-01-06T05-04-05.292805.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-06T05-04-05.292805.parquet'
- split: 2024_01_06T05_05_54.283989
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-06T05-05-54.283989.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-06T05-05-54.283989.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-06T05-05-54.283989.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-06T05-05-54.283989.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-06T05-05-54.283989.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-06T05-05-54.283989.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-06T05-05-54.283989.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-06T05-05-54.283989.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-06T05-05-54.283989.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-06T05-05-54.283989.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-06T05-05-54.283989.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-06T05-05-54.283989.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-06T05-05-54.283989.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-06T05-05-54.283989.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-06T05-05-54.283989.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-06T05-05-54.283989.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-06T05-05-54.283989.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-06T05-05-54.283989.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-06T05-05-54.283989.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-06T05-05-54.283989.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-06T05-05-54.283989.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-06T05-05-54.283989.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-06T05-05-54.283989.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-06T05-05-54.283989.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-06T05-05-54.283989.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-06T05-05-54.283989.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-06T05-05-54.283989.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-06T05-05-54.283989.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-06T05-05-54.283989.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-06T05-05-54.283989.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-06T05-05-54.283989.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-06T05-05-54.283989.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-06T05-05-54.283989.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-06T05-05-54.283989.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-01-06T05-05-54.283989.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-06T05-05-54.283989.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-06T05-05-54.283989.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-06T05-05-54.283989.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-01-06T05-05-54.283989.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-01-06T05-05-54.283989.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-06T05-05-54.283989.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-06T05-05-54.283989.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-06T05-05-54.283989.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-06T05-05-54.283989.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-06T05-05-54.283989.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-06T05-05-54.283989.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-06T05-05-54.283989.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-06T05-05-54.283989.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-06T05-05-54.283989.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-06T05-05-54.283989.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-06T05-05-54.283989.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-06T05-05-54.283989.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-06T05-05-54.283989.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-01-06T05-05-54.283989.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-06T05-05-54.283989.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-01-06T05-05-54.283989.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-06T05-05-54.283989.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-06T05-05-54.283989.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-06T05-05-54.283989.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-06T05-05-54.283989.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-06T05-05-54.283989.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-06T05-05-54.283989.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-06T05-05-54.283989.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-06T05-05-54.283989.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-06T05-05-54.283989.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-06T05-05-54.283989.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-06T05-05-54.283989.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-06T05-05-54.283989.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-06T05-05-54.283989.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-06T05-05-54.283989.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-06T05-05-54.283989.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-06T05-05-54.283989.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-06T05-05-54.283989.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-06T05-05-54.283989.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-06T05-05-54.283989.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-06T05-05-54.283989.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-06T05-05-54.283989.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-06T05-05-54.283989.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-06T05-05-54.283989.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-06T05-05-54.283989.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-06T05-05-54.283989.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-06T05-05-54.283989.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-06T05-05-54.283989.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-06T05-05-54.283989.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-06T05-05-54.283989.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-06T05-05-54.283989.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-06T05-05-54.283989.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-06T05-05-54.283989.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-06T05-05-54.283989.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-06T05-05-54.283989.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-06T05-05-54.283989.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-01-06T05-05-54.283989.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-06T05-05-54.283989.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-06T05-05-54.283989.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-06T05-05-54.283989.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-01-06T05-05-54.283989.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-01-06T05-05-54.283989.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-06T05-05-54.283989.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-06T05-05-54.283989.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-06T05-05-54.283989.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-06T05-05-54.283989.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-06T05-05-54.283989.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-06T05-05-54.283989.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-06T05-05-54.283989.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-06T05-05-54.283989.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-06T05-05-54.283989.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-06T05-05-54.283989.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-06T05-05-54.283989.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-06T05-05-54.283989.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-06T05-05-54.283989.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-01-06T05-05-54.283989.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-06T05-05-54.283989.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-01-06T05-05-54.283989.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-06T05-05-54.283989.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_01_06T02_53_50.876104
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-06T02-53-50.876104.parquet'
- split: 2024_01_06T02_58_54.903140
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-06T02-58-54.903140.parquet'
- split: 2024_01_06T05_04_05.292805
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-06T05-04-05.292805.parquet'
- split: 2024_01_06T05_05_54.283989
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-06T05-05-54.283989.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-06T05-05-54.283989.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_01_06T02_53_50.876104
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-06T02-53-50.876104.parquet'
- split: 2024_01_06T02_58_54.903140
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-06T02-58-54.903140.parquet'
- split: 2024_01_06T05_04_05.292805
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-06T05-04-05.292805.parquet'
- split: 2024_01_06T05_05_54.283989
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-06T05-05-54.283989.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-06T05-05-54.283989.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_01_06T02_53_50.876104
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-06T02-53-50.876104.parquet'
- split: 2024_01_06T02_58_54.903140
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-06T02-58-54.903140.parquet'
- split: 2024_01_06T05_04_05.292805
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-06T05-04-05.292805.parquet'
- split: 2024_01_06T05_05_54.283989
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-06T05-05-54.283989.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-06T05-05-54.283989.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_01_06T02_53_50.876104
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-06T02-53-50.876104.parquet'
- split: 2024_01_06T02_58_54.903140
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-06T02-58-54.903140.parquet'
- split: 2024_01_06T05_04_05.292805
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-06T05-04-05.292805.parquet'
- split: 2024_01_06T05_05_54.283989
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-06T05-05-54.283989.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-06T05-05-54.283989.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_01_06T02_53_50.876104
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-06T02-53-50.876104.parquet'
- split: 2024_01_06T02_58_54.903140
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-06T02-58-54.903140.parquet'
- split: 2024_01_06T05_04_05.292805
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-06T05-04-05.292805.parquet'
- split: 2024_01_06T05_05_54.283989
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-06T05-05-54.283989.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-06T05-05-54.283989.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_01_06T02_53_50.876104
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-06T02-53-50.876104.parquet'
- split: 2024_01_06T02_58_54.903140
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-06T02-58-54.903140.parquet'
- split: 2024_01_06T05_04_05.292805
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-06T05-04-05.292805.parquet'
- split: 2024_01_06T05_05_54.283989
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-06T05-05-54.283989.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-06T05-05-54.283989.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_01_06T02_53_50.876104
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-06T02-53-50.876104.parquet'
- split: 2024_01_06T02_58_54.903140
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-06T02-58-54.903140.parquet'
- split: 2024_01_06T05_04_05.292805
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-06T05-04-05.292805.parquet'
- split: 2024_01_06T05_05_54.283989
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-06T05-05-54.283989.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-06T05-05-54.283989.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_01_06T02_53_50.876104
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-06T02-53-50.876104.parquet'
- split: 2024_01_06T02_58_54.903140
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-06T02-58-54.903140.parquet'
- split: 2024_01_06T05_04_05.292805
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-06T05-04-05.292805.parquet'
- split: 2024_01_06T05_05_54.283989
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-06T05-05-54.283989.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-06T05-05-54.283989.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_01_06T02_53_50.876104
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-06T02-53-50.876104.parquet'
- split: 2024_01_06T02_58_54.903140
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-06T02-58-54.903140.parquet'
- split: 2024_01_06T05_04_05.292805
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-06T05-04-05.292805.parquet'
- split: 2024_01_06T05_05_54.283989
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-06T05-05-54.283989.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-06T05-05-54.283989.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_01_06T02_53_50.876104
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-06T02-53-50.876104.parquet'
- split: 2024_01_06T02_58_54.903140
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-06T02-58-54.903140.parquet'
- split: 2024_01_06T05_04_05.292805
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-06T05-04-05.292805.parquet'
- split: 2024_01_06T05_05_54.283989
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-06T05-05-54.283989.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-06T05-05-54.283989.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_01_06T02_53_50.876104
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-06T02-53-50.876104.parquet'
- split: 2024_01_06T02_58_54.903140
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-06T02-58-54.903140.parquet'
- split: 2024_01_06T05_04_05.292805
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-06T05-04-05.292805.parquet'
- split: 2024_01_06T05_05_54.283989
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-06T05-05-54.283989.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-06T05-05-54.283989.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_01_06T02_53_50.876104
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-06T02-53-50.876104.parquet'
- split: 2024_01_06T02_58_54.903140
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-06T02-58-54.903140.parquet'
- split: 2024_01_06T05_04_05.292805
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-06T05-04-05.292805.parquet'
- split: 2024_01_06T05_05_54.283989
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-06T05-05-54.283989.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-06T05-05-54.283989.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_01_06T02_53_50.876104
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-06T02-53-50.876104.parquet'
- split: 2024_01_06T02_58_54.903140
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-06T02-58-54.903140.parquet'
- split: 2024_01_06T05_04_05.292805
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-06T05-04-05.292805.parquet'
- split: 2024_01_06T05_05_54.283989
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-06T05-05-54.283989.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-06T05-05-54.283989.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_01_06T02_53_50.876104
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-06T02-53-50.876104.parquet'
- split: 2024_01_06T02_58_54.903140
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-06T02-58-54.903140.parquet'
- split: 2024_01_06T05_04_05.292805
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-06T05-04-05.292805.parquet'
- split: 2024_01_06T05_05_54.283989
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-06T05-05-54.283989.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-06T05-05-54.283989.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_01_06T02_53_50.876104
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-06T02-53-50.876104.parquet'
- split: 2024_01_06T02_58_54.903140
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-06T02-58-54.903140.parquet'
- split: 2024_01_06T05_04_05.292805
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-06T05-04-05.292805.parquet'
- split: 2024_01_06T05_05_54.283989
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-06T05-05-54.283989.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-06T05-05-54.283989.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_01_06T02_53_50.876104
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-06T02-53-50.876104.parquet'
- split: 2024_01_06T02_58_54.903140
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-06T02-58-54.903140.parquet'
- split: 2024_01_06T05_04_05.292805
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-06T05-04-05.292805.parquet'
- split: 2024_01_06T05_05_54.283989
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-06T05-05-54.283989.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-06T05-05-54.283989.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_01_06T02_53_50.876104
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-06T02-53-50.876104.parquet'
- split: 2024_01_06T02_58_54.903140
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-06T02-58-54.903140.parquet'
- split: 2024_01_06T05_04_05.292805
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-06T05-04-05.292805.parquet'
- split: 2024_01_06T05_05_54.283989
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-06T05-05-54.283989.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-06T05-05-54.283989.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_01_06T02_53_50.876104
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-06T02-53-50.876104.parquet'
- split: 2024_01_06T02_58_54.903140
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-06T02-58-54.903140.parquet'
- split: 2024_01_06T05_04_05.292805
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-06T05-04-05.292805.parquet'
- split: 2024_01_06T05_05_54.283989
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-06T05-05-54.283989.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-06T05-05-54.283989.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_01_06T02_53_50.876104
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-06T02-53-50.876104.parquet'
- split: 2024_01_06T02_58_54.903140
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-06T02-58-54.903140.parquet'
- split: 2024_01_06T05_04_05.292805
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-06T05-04-05.292805.parquet'
- split: 2024_01_06T05_05_54.283989
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-06T05-05-54.283989.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-06T05-05-54.283989.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_01_06T02_53_50.876104
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-06T02-53-50.876104.parquet'
- split: 2024_01_06T02_58_54.903140
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-06T02-58-54.903140.parquet'
- split: 2024_01_06T05_04_05.292805
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-06T05-04-05.292805.parquet'
- split: 2024_01_06T05_05_54.283989
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-06T05-05-54.283989.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-06T05-05-54.283989.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_01_06T02_53_50.876104
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-06T02-53-50.876104.parquet'
- split: 2024_01_06T02_58_54.903140
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-06T02-58-54.903140.parquet'
- split: 2024_01_06T05_04_05.292805
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-06T05-04-05.292805.parquet'
- split: 2024_01_06T05_05_54.283989
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-06T05-05-54.283989.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-06T05-05-54.283989.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_01_06T02_53_50.876104
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-06T02-53-50.876104.parquet'
- split: 2024_01_06T02_58_54.903140
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-06T02-58-54.903140.parquet'
- split: 2024_01_06T05_04_05.292805
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-06T05-04-05.292805.parquet'
- split: 2024_01_06T05_05_54.283989
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-06T05-05-54.283989.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-06T05-05-54.283989.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_01_06T02_53_50.876104
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-06T02-53-50.876104.parquet'
- split: 2024_01_06T02_58_54.903140
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-06T02-58-54.903140.parquet'
- split: 2024_01_06T05_04_05.292805
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-06T05-04-05.292805.parquet'
- split: 2024_01_06T05_05_54.283989
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-06T05-05-54.283989.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-06T05-05-54.283989.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_01_06T02_53_50.876104
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-06T02-53-50.876104.parquet'
- split: 2024_01_06T02_58_54.903140
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-06T02-58-54.903140.parquet'
- split: 2024_01_06T05_04_05.292805
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-06T05-04-05.292805.parquet'
- split: 2024_01_06T05_05_54.283989
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-06T05-05-54.283989.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-06T05-05-54.283989.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_01_06T02_53_50.876104
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-06T02-53-50.876104.parquet'
- split: 2024_01_06T02_58_54.903140
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-06T02-58-54.903140.parquet'
- split: 2024_01_06T05_04_05.292805
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-06T05-04-05.292805.parquet'
- split: 2024_01_06T05_05_54.283989
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-06T05-05-54.283989.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-06T05-05-54.283989.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_01_06T02_53_50.876104
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-06T02-53-50.876104.parquet'
- split: 2024_01_06T02_58_54.903140
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-06T02-58-54.903140.parquet'
- split: 2024_01_06T05_04_05.292805
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-06T05-04-05.292805.parquet'
- split: 2024_01_06T05_05_54.283989
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-06T05-05-54.283989.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-06T05-05-54.283989.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_01_06T02_53_50.876104
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-06T02-53-50.876104.parquet'
- split: 2024_01_06T02_58_54.903140
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-06T02-58-54.903140.parquet'
- split: 2024_01_06T05_04_05.292805
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-06T05-04-05.292805.parquet'
- split: 2024_01_06T05_05_54.283989
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-06T05-05-54.283989.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-06T05-05-54.283989.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_01_06T02_53_50.876104
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-06T02-53-50.876104.parquet'
- split: 2024_01_06T02_58_54.903140
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-06T02-58-54.903140.parquet'
- split: 2024_01_06T05_04_05.292805
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-06T05-04-05.292805.parquet'
- split: 2024_01_06T05_05_54.283989
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-06T05-05-54.283989.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-06T05-05-54.283989.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_01_06T02_53_50.876104
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-06T02-53-50.876104.parquet'
- split: 2024_01_06T02_58_54.903140
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-06T02-58-54.903140.parquet'
- split: 2024_01_06T05_04_05.292805
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-06T05-04-05.292805.parquet'
- split: 2024_01_06T05_05_54.283989
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-06T05-05-54.283989.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-06T05-05-54.283989.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_01_06T02_53_50.876104
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-06T02-53-50.876104.parquet'
- split: 2024_01_06T02_58_54.903140
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-06T02-58-54.903140.parquet'
- split: 2024_01_06T05_04_05.292805
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-06T05-04-05.292805.parquet'
- split: 2024_01_06T05_05_54.283989
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-06T05-05-54.283989.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-06T05-05-54.283989.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_01_06T02_53_50.876104
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-06T02-53-50.876104.parquet'
- split: 2024_01_06T02_58_54.903140
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-06T02-58-54.903140.parquet'
- split: 2024_01_06T05_04_05.292805
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-06T05-04-05.292805.parquet'
- split: 2024_01_06T05_05_54.283989
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-06T05-05-54.283989.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-06T05-05-54.283989.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_01_06T02_53_50.876104
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-06T02-53-50.876104.parquet'
- split: 2024_01_06T02_58_54.903140
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-06T02-58-54.903140.parquet'
- split: 2024_01_06T05_04_05.292805
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-06T05-04-05.292805.parquet'
- split: 2024_01_06T05_05_54.283989
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-06T05-05-54.283989.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-06T05-05-54.283989.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_01_06T02_53_50.876104
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-06T02-53-50.876104.parquet'
- split: 2024_01_06T02_58_54.903140
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-06T02-58-54.903140.parquet'
- split: 2024_01_06T05_04_05.292805
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-06T05-04-05.292805.parquet'
- split: 2024_01_06T05_05_54.283989
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-06T05-05-54.283989.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-06T05-05-54.283989.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_01_06T02_53_50.876104
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-06T02-53-50.876104.parquet'
- split: 2024_01_06T02_58_54.903140
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-06T02-58-54.903140.parquet'
- split: 2024_01_06T05_04_05.292805
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-06T05-04-05.292805.parquet'
- split: 2024_01_06T05_05_54.283989
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-06T05-05-54.283989.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-06T05-05-54.283989.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_01_06T02_53_50.876104
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-01-06T02-53-50.876104.parquet'
- split: 2024_01_06T02_58_54.903140
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-01-06T02-58-54.903140.parquet'
- split: 2024_01_06T05_04_05.292805
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-01-06T05-04-05.292805.parquet'
- split: 2024_01_06T05_05_54.283989
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-01-06T05-05-54.283989.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-01-06T05-05-54.283989.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_01_06T02_53_50.876104
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-06T02-53-50.876104.parquet'
- split: 2024_01_06T02_58_54.903140
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-06T02-58-54.903140.parquet'
- split: 2024_01_06T05_04_05.292805
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-06T05-04-05.292805.parquet'
- split: 2024_01_06T05_05_54.283989
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-06T05-05-54.283989.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-06T05-05-54.283989.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_01_06T02_53_50.876104
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-06T02-53-50.876104.parquet'
- split: 2024_01_06T02_58_54.903140
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-06T02-58-54.903140.parquet'
- split: 2024_01_06T05_04_05.292805
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-06T05-04-05.292805.parquet'
- split: 2024_01_06T05_05_54.283989
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-06T05-05-54.283989.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-06T05-05-54.283989.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_01_06T02_53_50.876104
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-06T02-53-50.876104.parquet'
- split: 2024_01_06T02_58_54.903140
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-06T02-58-54.903140.parquet'
- split: 2024_01_06T05_04_05.292805
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-06T05-04-05.292805.parquet'
- split: 2024_01_06T05_05_54.283989
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-06T05-05-54.283989.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-06T05-05-54.283989.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_01_06T02_53_50.876104
path:
- '**/details_harness|hendrycksTest-management|5_2024-01-06T02-53-50.876104.parquet'
- split: 2024_01_06T02_58_54.903140
path:
- '**/details_harness|hendrycksTest-management|5_2024-01-06T02-58-54.903140.parquet'
- split: 2024_01_06T05_04_05.292805
path:
- '**/details_harness|hendrycksTest-management|5_2024-01-06T05-04-05.292805.parquet'
- split: 2024_01_06T05_05_54.283989
path:
- '**/details_harness|hendrycksTest-management|5_2024-01-06T05-05-54.283989.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-01-06T05-05-54.283989.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_01_06T02_53_50.876104
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-01-06T02-53-50.876104.parquet'
- split: 2024_01_06T02_58_54.903140
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-01-06T02-58-54.903140.parquet'
- split: 2024_01_06T05_04_05.292805
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-01-06T05-04-05.292805.parquet'
- split: 2024_01_06T05_05_54.283989
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-01-06T05-05-54.283989.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-01-06T05-05-54.283989.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_01_06T02_53_50.876104
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-06T02-53-50.876104.parquet'
- split: 2024_01_06T02_58_54.903140
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-06T02-58-54.903140.parquet'
- split: 2024_01_06T05_04_05.292805
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-06T05-04-05.292805.parquet'
- split: 2024_01_06T05_05_54.283989
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-06T05-05-54.283989.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-06T05-05-54.283989.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_01_06T02_53_50.876104
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-06T02-53-50.876104.parquet'
- split: 2024_01_06T02_58_54.903140
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-06T02-58-54.903140.parquet'
- split: 2024_01_06T05_04_05.292805
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-06T05-04-05.292805.parquet'
- split: 2024_01_06T05_05_54.283989
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-06T05-05-54.283989.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-06T05-05-54.283989.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_01_06T02_53_50.876104
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-06T02-53-50.876104.parquet'
- split: 2024_01_06T02_58_54.903140
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-06T02-58-54.903140.parquet'
- split: 2024_01_06T05_04_05.292805
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-06T05-04-05.292805.parquet'
- split: 2024_01_06T05_05_54.283989
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-06T05-05-54.283989.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-06T05-05-54.283989.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_01_06T02_53_50.876104
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-06T02-53-50.876104.parquet'
- split: 2024_01_06T02_58_54.903140
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-06T02-58-54.903140.parquet'
- split: 2024_01_06T05_04_05.292805
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-06T05-04-05.292805.parquet'
- split: 2024_01_06T05_05_54.283989
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-06T05-05-54.283989.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-06T05-05-54.283989.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_01_06T02_53_50.876104
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-06T02-53-50.876104.parquet'
- split: 2024_01_06T02_58_54.903140
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-06T02-58-54.903140.parquet'
- split: 2024_01_06T05_04_05.292805
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-06T05-04-05.292805.parquet'
- split: 2024_01_06T05_05_54.283989
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-06T05-05-54.283989.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-06T05-05-54.283989.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_01_06T02_53_50.876104
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-06T02-53-50.876104.parquet'
- split: 2024_01_06T02_58_54.903140
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-06T02-58-54.903140.parquet'
- split: 2024_01_06T05_04_05.292805
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-06T05-04-05.292805.parquet'
- split: 2024_01_06T05_05_54.283989
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-06T05-05-54.283989.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-06T05-05-54.283989.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_01_06T02_53_50.876104
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-06T02-53-50.876104.parquet'
- split: 2024_01_06T02_58_54.903140
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-06T02-58-54.903140.parquet'
- split: 2024_01_06T05_04_05.292805
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-06T05-04-05.292805.parquet'
- split: 2024_01_06T05_05_54.283989
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-06T05-05-54.283989.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-06T05-05-54.283989.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_01_06T02_53_50.876104
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-06T02-53-50.876104.parquet'
- split: 2024_01_06T02_58_54.903140
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-06T02-58-54.903140.parquet'
- split: 2024_01_06T05_04_05.292805
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-06T05-04-05.292805.parquet'
- split: 2024_01_06T05_05_54.283989
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-06T05-05-54.283989.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-06T05-05-54.283989.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_01_06T02_53_50.876104
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-06T02-53-50.876104.parquet'
- split: 2024_01_06T02_58_54.903140
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-06T02-58-54.903140.parquet'
- split: 2024_01_06T05_04_05.292805
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-06T05-04-05.292805.parquet'
- split: 2024_01_06T05_05_54.283989
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-06T05-05-54.283989.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-06T05-05-54.283989.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_01_06T02_53_50.876104
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-06T02-53-50.876104.parquet'
- split: 2024_01_06T02_58_54.903140
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-06T02-58-54.903140.parquet'
- split: 2024_01_06T05_04_05.292805
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-06T05-04-05.292805.parquet'
- split: 2024_01_06T05_05_54.283989
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-06T05-05-54.283989.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-06T05-05-54.283989.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_01_06T02_53_50.876104
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-06T02-53-50.876104.parquet'
- split: 2024_01_06T02_58_54.903140
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-06T02-58-54.903140.parquet'
- split: 2024_01_06T05_04_05.292805
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-06T05-04-05.292805.parquet'
- split: 2024_01_06T05_05_54.283989
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-06T05-05-54.283989.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-06T05-05-54.283989.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_01_06T02_53_50.876104
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-06T02-53-50.876104.parquet'
- split: 2024_01_06T02_58_54.903140
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-06T02-58-54.903140.parquet'
- split: 2024_01_06T05_04_05.292805
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-06T05-04-05.292805.parquet'
- split: 2024_01_06T05_05_54.283989
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-06T05-05-54.283989.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-06T05-05-54.283989.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_01_06T02_53_50.876104
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-06T02-53-50.876104.parquet'
- split: 2024_01_06T02_58_54.903140
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-06T02-58-54.903140.parquet'
- split: 2024_01_06T05_04_05.292805
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-06T05-04-05.292805.parquet'
- split: 2024_01_06T05_05_54.283989
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-06T05-05-54.283989.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-06T05-05-54.283989.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_01_06T02_53_50.876104
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-01-06T02-53-50.876104.parquet'
- split: 2024_01_06T02_58_54.903140
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-01-06T02-58-54.903140.parquet'
- split: 2024_01_06T05_04_05.292805
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-01-06T05-04-05.292805.parquet'
- split: 2024_01_06T05_05_54.283989
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-01-06T05-05-54.283989.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-01-06T05-05-54.283989.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_01_06T02_53_50.876104
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-06T02-53-50.876104.parquet'
- split: 2024_01_06T02_58_54.903140
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-06T02-58-54.903140.parquet'
- split: 2024_01_06T05_04_05.292805
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-06T05-04-05.292805.parquet'
- split: 2024_01_06T05_05_54.283989
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-06T05-05-54.283989.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-06T05-05-54.283989.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_01_06T02_53_50.876104
path:
- '**/details_harness|hendrycksTest-virology|5_2024-01-06T02-53-50.876104.parquet'
- split: 2024_01_06T02_58_54.903140
path:
- '**/details_harness|hendrycksTest-virology|5_2024-01-06T02-58-54.903140.parquet'
- split: 2024_01_06T05_04_05.292805
path:
- '**/details_harness|hendrycksTest-virology|5_2024-01-06T05-04-05.292805.parquet'
- split: 2024_01_06T05_05_54.283989
path:
- '**/details_harness|hendrycksTest-virology|5_2024-01-06T05-05-54.283989.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-01-06T05-05-54.283989.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_01_06T02_53_50.876104
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-06T02-53-50.876104.parquet'
- split: 2024_01_06T02_58_54.903140
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-06T02-58-54.903140.parquet'
- split: 2024_01_06T05_04_05.292805
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-06T05-04-05.292805.parquet'
- split: 2024_01_06T05_05_54.283989
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-06T05-05-54.283989.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-06T05-05-54.283989.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_01_06T02_53_50.876104
path:
- '**/details_harness|truthfulqa:mc|0_2024-01-06T02-53-50.876104.parquet'
- split: 2024_01_06T02_58_54.903140
path:
- '**/details_harness|truthfulqa:mc|0_2024-01-06T02-58-54.903140.parquet'
- split: 2024_01_06T05_04_05.292805
path:
- '**/details_harness|truthfulqa:mc|0_2024-01-06T05-04-05.292805.parquet'
- split: 2024_01_06T05_05_54.283989
path:
- '**/details_harness|truthfulqa:mc|0_2024-01-06T05-05-54.283989.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-01-06T05-05-54.283989.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_01_06T02_53_50.876104
path:
- '**/details_harness|winogrande|5_2024-01-06T02-53-50.876104.parquet'
- split: 2024_01_06T02_58_54.903140
path:
- '**/details_harness|winogrande|5_2024-01-06T02-58-54.903140.parquet'
- split: 2024_01_06T05_04_05.292805
path:
- '**/details_harness|winogrande|5_2024-01-06T05-04-05.292805.parquet'
- split: 2024_01_06T05_05_54.283989
path:
- '**/details_harness|winogrande|5_2024-01-06T05-05-54.283989.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-01-06T05-05-54.283989.parquet'
- config_name: results
data_files:
- split: 2024_01_06T02_53_50.876104
path:
- results_2024-01-06T02-53-50.876104.parquet
- split: 2024_01_06T02_58_54.903140
path:
- results_2024-01-06T02-58-54.903140.parquet
- split: 2024_01_06T05_04_05.292805
path:
- results_2024-01-06T05-04-05.292805.parquet
- split: 2024_01_06T05_05_54.283989
path:
- results_2024-01-06T05-05-54.283989.parquet
- split: latest
path:
- results_2024-01-06T05-05-54.283989.parquet
---
# Dataset Card for Evaluation run of AA051610/FT
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [AA051610/FT](https://huggingface.co/AA051610/FT) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 4 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_AA051610__FT",
"harness_winogrande_5",
split="train")
```
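Each run is also exposed under its timestamped split name, and the `latest` split always resolves to the most recent run. As a small variation on the snippet above (config and split names are taken from the `configs` listing in this card's metadata):
```python
from datasets import load_dataset

# "latest" always points at the most recent evaluation run; a timestamped
# split such as "2024_01_06T05_05_54.283989" pins one specific run instead.
data = load_dataset("open-llm-leaderboard/details_AA051610__FT",
                    "harness_winogrande_5",
                    split="latest")
```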
## Latest results
These are the [latest results from run 2024-01-06T05:05:54.283989](https://huggingface.co/datasets/open-llm-leaderboard/details_AA051610__FT/blob/main/results_2024-01-06T05-05-54.283989.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6938684291636229,
"acc_stderr": 0.03064246100232873,
"acc_norm": 0.6979884336114688,
"acc_norm_stderr": 0.03123750719199374,
"mc1": 0.42717258261933905,
"mc1_stderr": 0.017316834410963933,
"mc2": 0.5988138522091946,
"mc2_stderr": 0.015356725964661566
},
"harness|arc:challenge|25": {
"acc": 0.6015358361774744,
"acc_stderr": 0.01430694605273556,
"acc_norm": 0.6305460750853242,
"acc_norm_stderr": 0.014104578366491888
},
"harness|hellaswag|10": {
"acc": 0.6243776140211114,
"acc_stderr": 0.004832934529120794,
"acc_norm": 0.8278231428002389,
"acc_norm_stderr": 0.003767625141611702
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.35,
"acc_stderr": 0.047937248544110196,
"acc_norm": 0.35,
"acc_norm_stderr": 0.047937248544110196
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6518518518518519,
"acc_stderr": 0.041153246103369526,
"acc_norm": 0.6518518518518519,
"acc_norm_stderr": 0.041153246103369526
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.8092105263157895,
"acc_stderr": 0.031975658210325,
"acc_norm": 0.8092105263157895,
"acc_norm_stderr": 0.031975658210325
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.71,
"acc_stderr": 0.04560480215720684,
"acc_norm": 0.71,
"acc_norm_stderr": 0.04560480215720684
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.7547169811320755,
"acc_stderr": 0.026480357179895702,
"acc_norm": 0.7547169811320755,
"acc_norm_stderr": 0.026480357179895702
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7916666666666666,
"acc_stderr": 0.03396116205845335,
"acc_norm": 0.7916666666666666,
"acc_norm_stderr": 0.03396116205845335
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.53,
"acc_stderr": 0.05016135580465919,
"acc_norm": 0.53,
"acc_norm_stderr": 0.05016135580465919
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.58,
"acc_stderr": 0.049604496374885836,
"acc_norm": 0.58,
"acc_norm_stderr": 0.049604496374885836
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.42,
"acc_stderr": 0.049604496374885836,
"acc_norm": 0.42,
"acc_norm_stderr": 0.049604496374885836
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6705202312138728,
"acc_stderr": 0.03583901754736412,
"acc_norm": 0.6705202312138728,
"acc_norm_stderr": 0.03583901754736412
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.39215686274509803,
"acc_stderr": 0.04858083574266345,
"acc_norm": 0.39215686274509803,
"acc_norm_stderr": 0.04858083574266345
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.8,
"acc_stderr": 0.04020151261036845,
"acc_norm": 0.8,
"acc_norm_stderr": 0.04020151261036845
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.7319148936170212,
"acc_stderr": 0.028957342788342343,
"acc_norm": 0.7319148936170212,
"acc_norm_stderr": 0.028957342788342343
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.5,
"acc_stderr": 0.047036043419179864,
"acc_norm": 0.5,
"acc_norm_stderr": 0.047036043419179864
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.6896551724137931,
"acc_stderr": 0.038552896163789485,
"acc_norm": 0.6896551724137931,
"acc_norm_stderr": 0.038552896163789485
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.5343915343915344,
"acc_stderr": 0.025690321762493848,
"acc_norm": 0.5343915343915344,
"acc_norm_stderr": 0.025690321762493848
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.46825396825396826,
"acc_stderr": 0.04463112720677171,
"acc_norm": 0.46825396825396826,
"acc_norm_stderr": 0.04463112720677171
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.43,
"acc_stderr": 0.049756985195624284,
"acc_norm": 0.43,
"acc_norm_stderr": 0.049756985195624284
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.8290322580645161,
"acc_stderr": 0.021417242936321582,
"acc_norm": 0.8290322580645161,
"acc_norm_stderr": 0.021417242936321582
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5566502463054187,
"acc_stderr": 0.03495334582162934,
"acc_norm": 0.5566502463054187,
"acc_norm_stderr": 0.03495334582162934
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.74,
"acc_stderr": 0.0440844002276808,
"acc_norm": 0.74,
"acc_norm_stderr": 0.0440844002276808
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7878787878787878,
"acc_stderr": 0.031922715695483016,
"acc_norm": 0.7878787878787878,
"acc_norm_stderr": 0.031922715695483016
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.8434343434343434,
"acc_stderr": 0.025890520358141454,
"acc_norm": 0.8434343434343434,
"acc_norm_stderr": 0.025890520358141454
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.917098445595855,
"acc_stderr": 0.01989934131572178,
"acc_norm": 0.917098445595855,
"acc_norm_stderr": 0.01989934131572178
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.7487179487179487,
"acc_stderr": 0.02199201666237056,
"acc_norm": 0.7487179487179487,
"acc_norm_stderr": 0.02199201666237056
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.35555555555555557,
"acc_stderr": 0.029185714949857392,
"acc_norm": 0.35555555555555557,
"acc_norm_stderr": 0.029185714949857392
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.7521008403361344,
"acc_stderr": 0.028047967224176896,
"acc_norm": 0.7521008403361344,
"acc_norm_stderr": 0.028047967224176896
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.423841059602649,
"acc_stderr": 0.04034846678603396,
"acc_norm": 0.423841059602649,
"acc_norm_stderr": 0.04034846678603396
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8807339449541285,
"acc_stderr": 0.01389572929258895,
"acc_norm": 0.8807339449541285,
"acc_norm_stderr": 0.01389572929258895
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5694444444444444,
"acc_stderr": 0.03376922151252335,
"acc_norm": 0.5694444444444444,
"acc_norm_stderr": 0.03376922151252335
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8823529411764706,
"acc_stderr": 0.022613286601132012,
"acc_norm": 0.8823529411764706,
"acc_norm_stderr": 0.022613286601132012
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.890295358649789,
"acc_stderr": 0.020343400734868837,
"acc_norm": 0.890295358649789,
"acc_norm_stderr": 0.020343400734868837
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.7533632286995515,
"acc_stderr": 0.028930413120910888,
"acc_norm": 0.7533632286995515,
"acc_norm_stderr": 0.028930413120910888
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.8396946564885496,
"acc_stderr": 0.03217829420744631,
"acc_norm": 0.8396946564885496,
"acc_norm_stderr": 0.03217829420744631
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.8181818181818182,
"acc_stderr": 0.03520893951097652,
"acc_norm": 0.8181818181818182,
"acc_norm_stderr": 0.03520893951097652
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.8148148148148148,
"acc_stderr": 0.03755265865037181,
"acc_norm": 0.8148148148148148,
"acc_norm_stderr": 0.03755265865037181
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7852760736196319,
"acc_stderr": 0.03226219377286775,
"acc_norm": 0.7852760736196319,
"acc_norm_stderr": 0.03226219377286775
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.5446428571428571,
"acc_stderr": 0.04726835553719097,
"acc_norm": 0.5446428571428571,
"acc_norm_stderr": 0.04726835553719097
},
"harness|hendrycksTest-management|5": {
"acc": 0.8155339805825242,
"acc_stderr": 0.03840423627288276,
"acc_norm": 0.8155339805825242,
"acc_norm_stderr": 0.03840423627288276
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.905982905982906,
"acc_stderr": 0.01911989279892498,
"acc_norm": 0.905982905982906,
"acc_norm_stderr": 0.01911989279892498
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.84,
"acc_stderr": 0.03684529491774709,
"acc_norm": 0.84,
"acc_norm_stderr": 0.03684529491774709
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8722860791826309,
"acc_stderr": 0.011935626313999878,
"acc_norm": 0.8722860791826309,
"acc_norm_stderr": 0.011935626313999878
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7861271676300579,
"acc_stderr": 0.022075709251757177,
"acc_norm": 0.7861271676300579,
"acc_norm_stderr": 0.022075709251757177
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.38994413407821227,
"acc_stderr": 0.01631237662921307,
"acc_norm": 0.38994413407821227,
"acc_norm_stderr": 0.01631237662921307
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7712418300653595,
"acc_stderr": 0.02405102973991225,
"acc_norm": 0.7712418300653595,
"acc_norm_stderr": 0.02405102973991225
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7684887459807074,
"acc_stderr": 0.023956532766639133,
"acc_norm": 0.7684887459807074,
"acc_norm_stderr": 0.023956532766639133
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7839506172839507,
"acc_stderr": 0.022899162918445796,
"acc_norm": 0.7839506172839507,
"acc_norm_stderr": 0.022899162918445796
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.5390070921985816,
"acc_stderr": 0.02973659252642444,
"acc_norm": 0.5390070921985816,
"acc_norm_stderr": 0.02973659252642444
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.5332464146023468,
"acc_stderr": 0.012741974333897213,
"acc_norm": 0.5332464146023468,
"acc_norm_stderr": 0.012741974333897213
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.7463235294117647,
"acc_stderr": 0.026431329870789524,
"acc_norm": 0.7463235294117647,
"acc_norm_stderr": 0.026431329870789524
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.7450980392156863,
"acc_stderr": 0.017630827375148383,
"acc_norm": 0.7450980392156863,
"acc_norm_stderr": 0.017630827375148383
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.7,
"acc_stderr": 0.04389311454644287,
"acc_norm": 0.7,
"acc_norm_stderr": 0.04389311454644287
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7591836734693878,
"acc_stderr": 0.02737294220178816,
"acc_norm": 0.7591836734693878,
"acc_norm_stderr": 0.02737294220178816
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8855721393034826,
"acc_stderr": 0.022509345325101706,
"acc_norm": 0.8855721393034826,
"acc_norm_stderr": 0.022509345325101706
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.92,
"acc_stderr": 0.0272659924344291,
"acc_norm": 0.92,
"acc_norm_stderr": 0.0272659924344291
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5421686746987951,
"acc_stderr": 0.0387862677100236,
"acc_norm": 0.5421686746987951,
"acc_norm_stderr": 0.0387862677100236
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8713450292397661,
"acc_stderr": 0.025679342723276915,
"acc_norm": 0.8713450292397661,
"acc_norm_stderr": 0.025679342723276915
},
"harness|truthfulqa:mc|0": {
"mc1": 0.42717258261933905,
"mc1_stderr": 0.017316834410963933,
"mc2": 0.5988138522091946,
"mc2_stderr": 0.015356725964661566
},
"harness|winogrande|5": {
"acc": 0.7963693764798737,
"acc_stderr": 0.011317798781626922
},
"harness|gsm8k|5": {
"acc": 0.5807429871114481,
"acc_stderr": 0.013591720959042115
}
}
```
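The aggregated numbers above are also stored in the `results` configuration declared in this card's metadata, so they can be fetched programmatically instead of being copied from this page. A minimal sketch (the printed record layout is an assumption; it should mirror the JSON shown above):
```python
from datasets import load_dataset

# The "results" config holds one aggregated-results record per run;
# "latest" selects the newest of the four timestamped runs.
results = load_dataset("open-llm-leaderboard/details_AA051610__FT",
                       "results",
                       split="latest")
print(results[0])
```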
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
autoevaluate/autoeval-eval-futin__feed-top_vi-71f14a-2175469965 | ---
type: predictions
tags:
- autotrain
- evaluation
datasets:
- futin/feed
eval_info:
task: text_zero_shot_classification
model: facebook/opt-350m
metrics: []
dataset_name: futin/feed
dataset_config: top_vi
dataset_split: test
col_mapping:
text: text
classes: classes
target: target
---
# Dataset Card for AutoTrain Evaluator
This repository contains model predictions generated by [AutoTrain](https://huggingface.co/autotrain) for the following task and dataset:
* Task: Zero-Shot Text Classification
* Model: facebook/opt-350m
* Dataset: futin/feed
* Config: top_vi
* Split: test
To run new evaluation jobs, visit Hugging Face's [automatic model evaluator](https://huggingface.co/spaces/autoevaluate/model-evaluator).
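The raw prediction files can also be pulled down for local inspection. Since this card does not document the file layout inside the repository, the sketch below only fetches a snapshot of the repo:
```python
from huggingface_hub import snapshot_download

# repo_type="dataset" is required because this is a dataset repository,
# not a model; the returned path is a local cache directory.
path = snapshot_download(
    repo_id="autoevaluate/autoeval-eval-futin__feed-top_vi-71f14a-2175469965",
    repo_type="dataset",
)
print(path)
```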
## Contributions
Thanks to [@futin](https://huggingface.co/futin) for evaluating this model. |
bilal01/stamp-verification-test | ---
dataset_info:
features:
- name: pixel_values
dtype: image
- name: label
dtype: image
splits:
- name: train
num_bytes: 99500873.0
num_examples: 5
download_size: 0
dataset_size: 99500873.0
---
# Dataset Card for "stamp-verification-test"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
shivam9980/headline-data | ---
dataset_info:
features:
- name: content
dtype: string
- name: headline
dtype: string
splits:
- name: train
num_bytes: 4101409
num_examples: 7000
download_size: 2515670
dataset_size: 4101409
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
shenyulu/easyclimate-tutorial | ---
license: openrail
---
|
RafaelBlue/Ranozera2 | ---
license: openrail
---
|
zolak/twitter_dataset_81_1713090986 | ---
dataset_info:
features:
- name: id
dtype: string
- name: tweet_content
dtype: string
- name: user_name
dtype: string
- name: user_id
dtype: string
- name: created_at
dtype: string
- name: url
dtype: string
- name: favourite_count
dtype: int64
- name: scraped_at
dtype: string
- name: image_urls
dtype: string
splits:
- name: train
num_bytes: 2709118
num_examples: 6589
download_size: 1362719
dataset_size: 2709118
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
HydraLM/weliveinasociety | ---
dataset_info:
features:
- name: conversations
list:
- name: input
dtype: string
- name: instruction
dtype: string
- name: response
dtype: string
- name: conversation_id
dtype: int64
- name: dataset_id
dtype: string
- name: cluster_text
dtype: string
- name: embedding
sequence: float64
splits:
- name: train
num_bytes: 1172872178
num_examples: 74998
download_size: 635977130
dataset_size: 1172872178
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "weliveinasociety"
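Given the nested `conversations` feature described in the `dataset_info` block above, individual turns can be inspected like this (a hedged sketch: field names come from the schema, and streaming is used to avoid downloading the full ~636 MB):
```python
from datasets import load_dataset

# Stream a single record; "conversations" is a list of
# {input, instruction, response} dicts per the dataset_info schema.
ds = load_dataset("HydraLM/weliveinasociety", split="train", streaming=True)
record = next(iter(ds))
print(record["conversations"][0]["instruction"])
```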
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
erenfazlioglu/turkiyecamiler50k | ---
license: apache-2.0
---
# A dataset containing the coordinates and addresses of 50,000 mosques in Türkiye. Use without citing the source is prohibited. |
open-llm-leaderboard/details_deepnight-research__llama-2-70B-inst | ---
pretty_name: Evaluation run of deepnight-research/llama-2-70B-inst
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [deepnight-research/llama-2-70B-inst](https://huggingface.co/deepnight-research/llama-2-70B-inst)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 61 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_deepnight-research__llama-2-70B-inst\"\
,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\
\nThese are the [latest results from run 2023-08-10T00:26:50.478989](https://huggingface.co/datasets/open-llm-leaderboard/details_deepnight-research__llama-2-70B-inst/blob/main/results_2023-08-10T00%3A26%3A50.478989.json)\
\ (note that there might be results for other tasks in the repo if successive evals\
\ didn't cover the same tasks. You can find each in the results and the \"latest\" split\
\ for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.7050740464217434,\n\
\ \"acc_stderr\": 0.03085018588043536,\n \"acc_norm\": 0.7087855823993987,\n\
\ \"acc_norm_stderr\": 0.03081992944181276,\n \"mc1\": 0.44430844553243576,\n\
\ \"mc1_stderr\": 0.017394586250743173,\n \"mc2\": 0.6224972679005382,\n\
\ \"mc2_stderr\": 0.014880875055625352\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.6732081911262798,\n \"acc_stderr\": 0.013706665975587333,\n\
\ \"acc_norm\": 0.7107508532423208,\n \"acc_norm_stderr\": 0.013250012579393441\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6974706233817964,\n\
\ \"acc_stderr\": 0.00458414401465495,\n \"acc_norm\": 0.8789085839474209,\n\
\ \"acc_norm_stderr\": 0.0032556675321152857\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.33,\n \"acc_stderr\": 0.04725815626252605,\n \
\ \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.04725815626252605\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6518518518518519,\n\
\ \"acc_stderr\": 0.041153246103369526,\n \"acc_norm\": 0.6518518518518519,\n\
\ \"acc_norm_stderr\": 0.041153246103369526\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.8421052631578947,\n \"acc_stderr\": 0.029674167520101453,\n\
\ \"acc_norm\": 0.8421052631578947,\n \"acc_norm_stderr\": 0.029674167520101453\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.74,\n\
\ \"acc_stderr\": 0.044084400227680794,\n \"acc_norm\": 0.74,\n \
\ \"acc_norm_stderr\": 0.044084400227680794\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.7320754716981132,\n \"acc_stderr\": 0.027257260322494845,\n\
\ \"acc_norm\": 0.7320754716981132,\n \"acc_norm_stderr\": 0.027257260322494845\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.8402777777777778,\n\
\ \"acc_stderr\": 0.030635578972093274,\n \"acc_norm\": 0.8402777777777778,\n\
\ \"acc_norm_stderr\": 0.030635578972093274\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.49,\n \"acc_stderr\": 0.05024183937956912,\n \
\ \"acc_norm\": 0.49,\n \"acc_norm_stderr\": 0.05024183937956912\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.6,\n \"acc_stderr\": 0.04923659639173309,\n \"acc_norm\": 0.6,\n\
\ \"acc_norm_stderr\": 0.04923659639173309\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.42,\n \"acc_stderr\": 0.049604496374885836,\n \
\ \"acc_norm\": 0.42,\n \"acc_norm_stderr\": 0.049604496374885836\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6589595375722543,\n\
\ \"acc_stderr\": 0.036146654241808254,\n \"acc_norm\": 0.6589595375722543,\n\
\ \"acc_norm_stderr\": 0.036146654241808254\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.4215686274509804,\n \"acc_stderr\": 0.04913595201274498,\n\
\ \"acc_norm\": 0.4215686274509804,\n \"acc_norm_stderr\": 0.04913595201274498\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.77,\n \"acc_stderr\": 0.04229525846816507,\n \"acc_norm\": 0.77,\n\
\ \"acc_norm_stderr\": 0.04229525846816507\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.7063829787234043,\n \"acc_stderr\": 0.029771642712491227,\n\
\ \"acc_norm\": 0.7063829787234043,\n \"acc_norm_stderr\": 0.029771642712491227\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.4649122807017544,\n\
\ \"acc_stderr\": 0.04692008381368909,\n \"acc_norm\": 0.4649122807017544,\n\
\ \"acc_norm_stderr\": 0.04692008381368909\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.6482758620689655,\n \"acc_stderr\": 0.0397923663749741,\n\
\ \"acc_norm\": 0.6482758620689655,\n \"acc_norm_stderr\": 0.0397923663749741\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.46825396825396826,\n \"acc_stderr\": 0.025699352832131792,\n \"\
acc_norm\": 0.46825396825396826,\n \"acc_norm_stderr\": 0.025699352832131792\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.46825396825396826,\n\
\ \"acc_stderr\": 0.04463112720677173,\n \"acc_norm\": 0.46825396825396826,\n\
\ \"acc_norm_stderr\": 0.04463112720677173\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.47,\n \"acc_stderr\": 0.05016135580465919,\n \
\ \"acc_norm\": 0.47,\n \"acc_norm_stderr\": 0.05016135580465919\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.8096774193548387,\n\
\ \"acc_stderr\": 0.02233170761182307,\n \"acc_norm\": 0.8096774193548387,\n\
\ \"acc_norm_stderr\": 0.02233170761182307\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.5615763546798029,\n \"acc_stderr\": 0.03491207857486519,\n\
\ \"acc_norm\": 0.5615763546798029,\n \"acc_norm_stderr\": 0.03491207857486519\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.79,\n \"acc_stderr\": 0.040936018074033256,\n \"acc_norm\"\
: 0.79,\n \"acc_norm_stderr\": 0.040936018074033256\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.8424242424242424,\n \"acc_stderr\": 0.02845038880528436,\n\
\ \"acc_norm\": 0.8424242424242424,\n \"acc_norm_stderr\": 0.02845038880528436\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.8737373737373737,\n \"acc_stderr\": 0.023664359402880242,\n \"\
acc_norm\": 0.8737373737373737,\n \"acc_norm_stderr\": 0.023664359402880242\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.9378238341968912,\n \"acc_stderr\": 0.017426974154240528,\n\
\ \"acc_norm\": 0.9378238341968912,\n \"acc_norm_stderr\": 0.017426974154240528\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.7102564102564103,\n \"acc_stderr\": 0.023000628243687968,\n\
\ \"acc_norm\": 0.7102564102564103,\n \"acc_norm_stderr\": 0.023000628243687968\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.31851851851851853,\n \"acc_stderr\": 0.028406533090608463,\n \
\ \"acc_norm\": 0.31851851851851853,\n \"acc_norm_stderr\": 0.028406533090608463\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.7647058823529411,\n \"acc_stderr\": 0.02755361446786381,\n \
\ \"acc_norm\": 0.7647058823529411,\n \"acc_norm_stderr\": 0.02755361446786381\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.47019867549668876,\n \"acc_stderr\": 0.04075224992216979,\n \"\
acc_norm\": 0.47019867549668876,\n \"acc_norm_stderr\": 0.04075224992216979\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.9027522935779817,\n \"acc_stderr\": 0.012703533408540366,\n \"\
acc_norm\": 0.9027522935779817,\n \"acc_norm_stderr\": 0.012703533408540366\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.6018518518518519,\n \"acc_stderr\": 0.033384734032074016,\n \"\
acc_norm\": 0.6018518518518519,\n \"acc_norm_stderr\": 0.033384734032074016\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.9264705882352942,\n \"acc_stderr\": 0.01831885585008968,\n \"\
acc_norm\": 0.9264705882352942,\n \"acc_norm_stderr\": 0.01831885585008968\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.8945147679324894,\n \"acc_stderr\": 0.01999556072375854,\n \
\ \"acc_norm\": 0.8945147679324894,\n \"acc_norm_stderr\": 0.01999556072375854\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.7937219730941704,\n\
\ \"acc_stderr\": 0.02715715047956382,\n \"acc_norm\": 0.7937219730941704,\n\
\ \"acc_norm_stderr\": 0.02715715047956382\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.8625954198473282,\n \"acc_stderr\": 0.030194823996804475,\n\
\ \"acc_norm\": 0.8625954198473282,\n \"acc_norm_stderr\": 0.030194823996804475\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.859504132231405,\n \"acc_stderr\": 0.03172233426002157,\n \"acc_norm\"\
: 0.859504132231405,\n \"acc_norm_stderr\": 0.03172233426002157\n },\n\
\ \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.8240740740740741,\n\
\ \"acc_stderr\": 0.036809181416738807,\n \"acc_norm\": 0.8240740740740741,\n\
\ \"acc_norm_stderr\": 0.036809181416738807\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.803680981595092,\n \"acc_stderr\": 0.031207970394709218,\n\
\ \"acc_norm\": 0.803680981595092,\n \"acc_norm_stderr\": 0.031207970394709218\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.5089285714285714,\n\
\ \"acc_stderr\": 0.04745033255489122,\n \"acc_norm\": 0.5089285714285714,\n\
\ \"acc_norm_stderr\": 0.04745033255489122\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.8252427184466019,\n \"acc_stderr\": 0.0376017800602662,\n\
\ \"acc_norm\": 0.8252427184466019,\n \"acc_norm_stderr\": 0.0376017800602662\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.9017094017094017,\n\
\ \"acc_stderr\": 0.019503444900757567,\n \"acc_norm\": 0.9017094017094017,\n\
\ \"acc_norm_stderr\": 0.019503444900757567\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.71,\n \"acc_stderr\": 0.04560480215720684,\n \
\ \"acc_norm\": 0.71,\n \"acc_norm_stderr\": 0.04560480215720684\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8684546615581098,\n\
\ \"acc_stderr\": 0.01208670521425043,\n \"acc_norm\": 0.8684546615581098,\n\
\ \"acc_norm_stderr\": 0.01208670521425043\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.7803468208092486,\n \"acc_stderr\": 0.022289638852617893,\n\
\ \"acc_norm\": 0.7803468208092486,\n \"acc_norm_stderr\": 0.022289638852617893\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.6044692737430167,\n\
\ \"acc_stderr\": 0.01635341541007577,\n \"acc_norm\": 0.6044692737430167,\n\
\ \"acc_norm_stderr\": 0.01635341541007577\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.7679738562091504,\n \"acc_stderr\": 0.024170840879340873,\n\
\ \"acc_norm\": 0.7679738562091504,\n \"acc_norm_stderr\": 0.024170840879340873\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7781350482315113,\n\
\ \"acc_stderr\": 0.02359885829286305,\n \"acc_norm\": 0.7781350482315113,\n\
\ \"acc_norm_stderr\": 0.02359885829286305\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.8333333333333334,\n \"acc_stderr\": 0.020736358408060006,\n\
\ \"acc_norm\": 0.8333333333333334,\n \"acc_norm_stderr\": 0.020736358408060006\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.574468085106383,\n \"acc_stderr\": 0.029494827600144366,\n \
\ \"acc_norm\": 0.574468085106383,\n \"acc_norm_stderr\": 0.029494827600144366\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.5521512385919165,\n\
\ \"acc_stderr\": 0.012700582404768235,\n \"acc_norm\": 0.5521512385919165,\n\
\ \"acc_norm_stderr\": 0.012700582404768235\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.7389705882352942,\n \"acc_stderr\": 0.02667925227010314,\n\
\ \"acc_norm\": 0.7389705882352942,\n \"acc_norm_stderr\": 0.02667925227010314\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.7647058823529411,\n \"acc_stderr\": 0.01716058723504635,\n \
\ \"acc_norm\": 0.7647058823529411,\n \"acc_norm_stderr\": 0.01716058723504635\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.7454545454545455,\n\
\ \"acc_stderr\": 0.041723430387053825,\n \"acc_norm\": 0.7454545454545455,\n\
\ \"acc_norm_stderr\": 0.041723430387053825\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.8204081632653061,\n \"acc_stderr\": 0.024573293589585637,\n\
\ \"acc_norm\": 0.8204081632653061,\n \"acc_norm_stderr\": 0.024573293589585637\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8756218905472637,\n\
\ \"acc_stderr\": 0.023335401790166327,\n \"acc_norm\": 0.8756218905472637,\n\
\ \"acc_norm_stderr\": 0.023335401790166327\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.9,\n \"acc_stderr\": 0.030151134457776334,\n \
\ \"acc_norm\": 0.9,\n \"acc_norm_stderr\": 0.030151134457776334\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5301204819277109,\n\
\ \"acc_stderr\": 0.03885425420866767,\n \"acc_norm\": 0.5301204819277109,\n\
\ \"acc_norm_stderr\": 0.03885425420866767\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8771929824561403,\n \"acc_stderr\": 0.02517298435015575,\n\
\ \"acc_norm\": 0.8771929824561403,\n \"acc_norm_stderr\": 0.02517298435015575\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.44430844553243576,\n\
\ \"mc1_stderr\": 0.017394586250743173,\n \"mc2\": 0.6224972679005382,\n\
\ \"mc2_stderr\": 0.014880875055625352\n }\n}\n```"
repo_url: https://huggingface.co/deepnight-research/llama-2-70B-inst
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_08_10T00_26_50.478989
path:
- '**/details_harness|arc:challenge|25_2023-08-10T00:26:50.478989.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-08-10T00:26:50.478989.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_08_10T00_26_50.478989
path:
- '**/details_harness|hellaswag|10_2023-08-10T00:26:50.478989.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-08-10T00:26:50.478989.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_08_10T00_26_50.478989
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-10T00:26:50.478989.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-10T00:26:50.478989.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-10T00:26:50.478989.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-10T00:26:50.478989.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-10T00:26:50.478989.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-10T00:26:50.478989.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-10T00:26:50.478989.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-10T00:26:50.478989.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-10T00:26:50.478989.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-10T00:26:50.478989.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-10T00:26:50.478989.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-10T00:26:50.478989.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-10T00:26:50.478989.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-10T00:26:50.478989.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-10T00:26:50.478989.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-10T00:26:50.478989.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-10T00:26:50.478989.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-10T00:26:50.478989.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-10T00:26:50.478989.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-10T00:26:50.478989.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-10T00:26:50.478989.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-10T00:26:50.478989.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-10T00:26:50.478989.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-10T00:26:50.478989.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-10T00:26:50.478989.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-10T00:26:50.478989.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-10T00:26:50.478989.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-10T00:26:50.478989.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-10T00:26:50.478989.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-10T00:26:50.478989.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-10T00:26:50.478989.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-10T00:26:50.478989.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-10T00:26:50.478989.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-10T00:26:50.478989.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-10T00:26:50.478989.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-10T00:26:50.478989.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-10T00:26:50.478989.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-10T00:26:50.478989.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-10T00:26:50.478989.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-10T00:26:50.478989.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-10T00:26:50.478989.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-10T00:26:50.478989.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-10T00:26:50.478989.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-10T00:26:50.478989.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-10T00:26:50.478989.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-10T00:26:50.478989.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-10T00:26:50.478989.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-10T00:26:50.478989.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-10T00:26:50.478989.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-10T00:26:50.478989.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-10T00:26:50.478989.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-10T00:26:50.478989.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-10T00:26:50.478989.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-10T00:26:50.478989.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-10T00:26:50.478989.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-10T00:26:50.478989.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-10T00:26:50.478989.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-10T00:26:50.478989.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-10T00:26:50.478989.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-10T00:26:50.478989.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-10T00:26:50.478989.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-10T00:26:50.478989.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-10T00:26:50.478989.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-10T00:26:50.478989.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-10T00:26:50.478989.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-10T00:26:50.478989.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-10T00:26:50.478989.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-10T00:26:50.478989.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-10T00:26:50.478989.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-10T00:26:50.478989.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-10T00:26:50.478989.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-10T00:26:50.478989.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-10T00:26:50.478989.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-10T00:26:50.478989.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-10T00:26:50.478989.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-10T00:26:50.478989.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-10T00:26:50.478989.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-10T00:26:50.478989.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-10T00:26:50.478989.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-10T00:26:50.478989.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-10T00:26:50.478989.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-10T00:26:50.478989.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-10T00:26:50.478989.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-10T00:26:50.478989.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-10T00:26:50.478989.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-10T00:26:50.478989.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-10T00:26:50.478989.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-10T00:26:50.478989.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-10T00:26:50.478989.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-10T00:26:50.478989.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-10T00:26:50.478989.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-10T00:26:50.478989.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-10T00:26:50.478989.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-10T00:26:50.478989.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-10T00:26:50.478989.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-10T00:26:50.478989.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-10T00:26:50.478989.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-10T00:26:50.478989.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-10T00:26:50.478989.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-10T00:26:50.478989.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-10T00:26:50.478989.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-10T00:26:50.478989.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-10T00:26:50.478989.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-10T00:26:50.478989.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-10T00:26:50.478989.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-10T00:26:50.478989.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-10T00:26:50.478989.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-10T00:26:50.478989.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-10T00:26:50.478989.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-10T00:26:50.478989.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-10T00:26:50.478989.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-10T00:26:50.478989.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-10T00:26:50.478989.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-10T00:26:50.478989.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_08_10T00_26_50.478989
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-10T00:26:50.478989.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-10T00:26:50.478989.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_08_10T00_26_50.478989
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-10T00:26:50.478989.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-10T00:26:50.478989.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_08_10T00_26_50.478989
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-10T00:26:50.478989.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-10T00:26:50.478989.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_08_10T00_26_50.478989
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-10T00:26:50.478989.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-10T00:26:50.478989.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_08_10T00_26_50.478989
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-10T00:26:50.478989.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-10T00:26:50.478989.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_08_10T00_26_50.478989
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-10T00:26:50.478989.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-10T00:26:50.478989.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_08_10T00_26_50.478989
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-10T00:26:50.478989.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-10T00:26:50.478989.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_08_10T00_26_50.478989
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-10T00:26:50.478989.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-10T00:26:50.478989.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_08_10T00_26_50.478989
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-10T00:26:50.478989.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-10T00:26:50.478989.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_08_10T00_26_50.478989
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-10T00:26:50.478989.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-10T00:26:50.478989.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_08_10T00_26_50.478989
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-10T00:26:50.478989.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-10T00:26:50.478989.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_08_10T00_26_50.478989
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-10T00:26:50.478989.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-10T00:26:50.478989.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_08_10T00_26_50.478989
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-10T00:26:50.478989.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-10T00:26:50.478989.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_08_10T00_26_50.478989
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-10T00:26:50.478989.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-10T00:26:50.478989.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_08_10T00_26_50.478989
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-10T00:26:50.478989.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-10T00:26:50.478989.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_08_10T00_26_50.478989
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-10T00:26:50.478989.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-10T00:26:50.478989.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_08_10T00_26_50.478989
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-10T00:26:50.478989.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-10T00:26:50.478989.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_08_10T00_26_50.478989
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-10T00:26:50.478989.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-10T00:26:50.478989.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_08_10T00_26_50.478989
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-10T00:26:50.478989.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-10T00:26:50.478989.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_08_10T00_26_50.478989
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-10T00:26:50.478989.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-10T00:26:50.478989.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_08_10T00_26_50.478989
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-10T00:26:50.478989.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-10T00:26:50.478989.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_08_10T00_26_50.478989
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-10T00:26:50.478989.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-10T00:26:50.478989.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_08_10T00_26_50.478989
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-10T00:26:50.478989.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-10T00:26:50.478989.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_08_10T00_26_50.478989
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-10T00:26:50.478989.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-10T00:26:50.478989.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_08_10T00_26_50.478989
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-10T00:26:50.478989.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-10T00:26:50.478989.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_08_10T00_26_50.478989
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-10T00:26:50.478989.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-10T00:26:50.478989.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_08_10T00_26_50.478989
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-10T00:26:50.478989.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-10T00:26:50.478989.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_08_10T00_26_50.478989
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-10T00:26:50.478989.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-10T00:26:50.478989.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_08_10T00_26_50.478989
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-10T00:26:50.478989.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-10T00:26:50.478989.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_08_10T00_26_50.478989
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-10T00:26:50.478989.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-10T00:26:50.478989.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_08_10T00_26_50.478989
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-10T00:26:50.478989.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-10T00:26:50.478989.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_08_10T00_26_50.478989
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-10T00:26:50.478989.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-10T00:26:50.478989.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_08_10T00_26_50.478989
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-10T00:26:50.478989.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-10T00:26:50.478989.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_08_10T00_26_50.478989
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-10T00:26:50.478989.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-10T00:26:50.478989.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_08_10T00_26_50.478989
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-10T00:26:50.478989.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-10T00:26:50.478989.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_08_10T00_26_50.478989
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-10T00:26:50.478989.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-10T00:26:50.478989.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_08_10T00_26_50.478989
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-10T00:26:50.478989.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-10T00:26:50.478989.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_08_10T00_26_50.478989
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-10T00:26:50.478989.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-10T00:26:50.478989.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_08_10T00_26_50.478989
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-10T00:26:50.478989.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-10T00:26:50.478989.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_08_10T00_26_50.478989
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-10T00:26:50.478989.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-10T00:26:50.478989.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_08_10T00_26_50.478989
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-10T00:26:50.478989.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-10T00:26:50.478989.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_08_10T00_26_50.478989
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-10T00:26:50.478989.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-10T00:26:50.478989.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_08_10T00_26_50.478989
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-10T00:26:50.478989.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-10T00:26:50.478989.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_08_10T00_26_50.478989
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-10T00:26:50.478989.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-10T00:26:50.478989.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_08_10T00_26_50.478989
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-10T00:26:50.478989.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-10T00:26:50.478989.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_08_10T00_26_50.478989
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-10T00:26:50.478989.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-10T00:26:50.478989.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_08_10T00_26_50.478989
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-10T00:26:50.478989.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-10T00:26:50.478989.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_08_10T00_26_50.478989
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-10T00:26:50.478989.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-10T00:26:50.478989.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_08_10T00_26_50.478989
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-10T00:26:50.478989.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-10T00:26:50.478989.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_08_10T00_26_50.478989
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-10T00:26:50.478989.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-10T00:26:50.478989.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_08_10T00_26_50.478989
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-10T00:26:50.478989.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-10T00:26:50.478989.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_08_10T00_26_50.478989
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-10T00:26:50.478989.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-10T00:26:50.478989.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_08_10T00_26_50.478989
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-10T00:26:50.478989.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-10T00:26:50.478989.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_08_10T00_26_50.478989
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-10T00:26:50.478989.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-10T00:26:50.478989.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_08_10T00_26_50.478989
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-10T00:26:50.478989.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-10T00:26:50.478989.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_08_10T00_26_50.478989
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-10T00:26:50.478989.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-10T00:26:50.478989.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_08_10T00_26_50.478989
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-10T00:26:50.478989.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-10T00:26:50.478989.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_08_10T00_26_50.478989
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-10T00:26:50.478989.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-10T00:26:50.478989.parquet'
- config_name: results
data_files:
- split: 2023_08_10T00_26_50.478989
path:
- results_2023-08-10T00:26:50.478989.parquet
- split: latest
path:
- results_2023-08-10T00:26:50.478989.parquet
---
# Dataset Card for Evaluation run of deepnight-research/llama-2-70B-inst
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/deepnight-research/llama-2-70B-inst
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [deepnight-research/llama-2-70B-inst](https://huggingface.co/deepnight-research/llama-2-70B-inst) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run. Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can, for instance, do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_deepnight-research__llama-2-70B-inst",
"harness_truthfulqa_mc_0",
split="train")
```
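Any of the per-task configurations listed in the metadata above can be loaded the same way. As a minimal sketch (the config name comes from the `configs` list, and the `latest` split always resolves to the most recent run):
```python
from datasets import load_dataset

# Load one MMLU subtask at its most recent evaluation run.
data = load_dataset(
    "open-llm-leaderboard/details_deepnight-research__llama-2-70B-inst",
    "harness_hendrycksTest_abstract_algebra_5",
    split="latest",
)
```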
## Latest results
These are the [latest results from run 2023-08-10T00:26:50.478989](https://huggingface.co/datasets/open-llm-leaderboard/details_deepnight-research__llama-2-70B-inst/blob/main/results_2023-08-10T00%3A26%3A50.478989.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.7050740464217434,
"acc_stderr": 0.03085018588043536,
"acc_norm": 0.7087855823993987,
"acc_norm_stderr": 0.03081992944181276,
"mc1": 0.44430844553243576,
"mc1_stderr": 0.017394586250743173,
"mc2": 0.6224972679005382,
"mc2_stderr": 0.014880875055625352
},
"harness|arc:challenge|25": {
"acc": 0.6732081911262798,
"acc_stderr": 0.013706665975587333,
"acc_norm": 0.7107508532423208,
"acc_norm_stderr": 0.013250012579393441
},
"harness|hellaswag|10": {
"acc": 0.6974706233817964,
"acc_stderr": 0.00458414401465495,
"acc_norm": 0.8789085839474209,
"acc_norm_stderr": 0.0032556675321152857
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.33,
"acc_stderr": 0.04725815626252605,
"acc_norm": 0.33,
"acc_norm_stderr": 0.04725815626252605
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6518518518518519,
"acc_stderr": 0.041153246103369526,
"acc_norm": 0.6518518518518519,
"acc_norm_stderr": 0.041153246103369526
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.8421052631578947,
"acc_stderr": 0.029674167520101453,
"acc_norm": 0.8421052631578947,
"acc_norm_stderr": 0.029674167520101453
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.74,
"acc_stderr": 0.044084400227680794,
"acc_norm": 0.74,
"acc_norm_stderr": 0.044084400227680794
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.7320754716981132,
"acc_stderr": 0.027257260322494845,
"acc_norm": 0.7320754716981132,
"acc_norm_stderr": 0.027257260322494845
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.8402777777777778,
"acc_stderr": 0.030635578972093274,
"acc_norm": 0.8402777777777778,
"acc_norm_stderr": 0.030635578972093274
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.49,
"acc_stderr": 0.05024183937956912,
"acc_norm": 0.49,
"acc_norm_stderr": 0.05024183937956912
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.6,
"acc_stderr": 0.04923659639173309,
"acc_norm": 0.6,
"acc_norm_stderr": 0.04923659639173309
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.42,
"acc_stderr": 0.049604496374885836,
"acc_norm": 0.42,
"acc_norm_stderr": 0.049604496374885836
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6589595375722543,
"acc_stderr": 0.036146654241808254,
"acc_norm": 0.6589595375722543,
"acc_norm_stderr": 0.036146654241808254
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.4215686274509804,
"acc_stderr": 0.04913595201274498,
"acc_norm": 0.4215686274509804,
"acc_norm_stderr": 0.04913595201274498
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.77,
"acc_stderr": 0.04229525846816507,
"acc_norm": 0.77,
"acc_norm_stderr": 0.04229525846816507
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.7063829787234043,
"acc_stderr": 0.029771642712491227,
"acc_norm": 0.7063829787234043,
"acc_norm_stderr": 0.029771642712491227
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.4649122807017544,
"acc_stderr": 0.04692008381368909,
"acc_norm": 0.4649122807017544,
"acc_norm_stderr": 0.04692008381368909
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.6482758620689655,
"acc_stderr": 0.0397923663749741,
"acc_norm": 0.6482758620689655,
"acc_norm_stderr": 0.0397923663749741
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.46825396825396826,
"acc_stderr": 0.025699352832131792,
"acc_norm": 0.46825396825396826,
"acc_norm_stderr": 0.025699352832131792
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.46825396825396826,
"acc_stderr": 0.04463112720677173,
"acc_norm": 0.46825396825396826,
"acc_norm_stderr": 0.04463112720677173
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.47,
"acc_stderr": 0.05016135580465919,
"acc_norm": 0.47,
"acc_norm_stderr": 0.05016135580465919
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.8096774193548387,
"acc_stderr": 0.02233170761182307,
"acc_norm": 0.8096774193548387,
"acc_norm_stderr": 0.02233170761182307
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5615763546798029,
"acc_stderr": 0.03491207857486519,
"acc_norm": 0.5615763546798029,
"acc_norm_stderr": 0.03491207857486519
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.79,
"acc_stderr": 0.040936018074033256,
"acc_norm": 0.79,
"acc_norm_stderr": 0.040936018074033256
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.8424242424242424,
"acc_stderr": 0.02845038880528436,
"acc_norm": 0.8424242424242424,
"acc_norm_stderr": 0.02845038880528436
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.8737373737373737,
"acc_stderr": 0.023664359402880242,
"acc_norm": 0.8737373737373737,
"acc_norm_stderr": 0.023664359402880242
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.9378238341968912,
"acc_stderr": 0.017426974154240528,
"acc_norm": 0.9378238341968912,
"acc_norm_stderr": 0.017426974154240528
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.7102564102564103,
"acc_stderr": 0.023000628243687968,
"acc_norm": 0.7102564102564103,
"acc_norm_stderr": 0.023000628243687968
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.31851851851851853,
"acc_stderr": 0.028406533090608463,
"acc_norm": 0.31851851851851853,
"acc_norm_stderr": 0.028406533090608463
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.7647058823529411,
"acc_stderr": 0.02755361446786381,
"acc_norm": 0.7647058823529411,
"acc_norm_stderr": 0.02755361446786381
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.47019867549668876,
"acc_stderr": 0.04075224992216979,
"acc_norm": 0.47019867549668876,
"acc_norm_stderr": 0.04075224992216979
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.9027522935779817,
"acc_stderr": 0.012703533408540366,
"acc_norm": 0.9027522935779817,
"acc_norm_stderr": 0.012703533408540366
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.6018518518518519,
"acc_stderr": 0.033384734032074016,
"acc_norm": 0.6018518518518519,
"acc_norm_stderr": 0.033384734032074016
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.9264705882352942,
"acc_stderr": 0.01831885585008968,
"acc_norm": 0.9264705882352942,
"acc_norm_stderr": 0.01831885585008968
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.8945147679324894,
"acc_stderr": 0.01999556072375854,
"acc_norm": 0.8945147679324894,
"acc_norm_stderr": 0.01999556072375854
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.7937219730941704,
"acc_stderr": 0.02715715047956382,
"acc_norm": 0.7937219730941704,
"acc_norm_stderr": 0.02715715047956382
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.8625954198473282,
"acc_stderr": 0.030194823996804475,
"acc_norm": 0.8625954198473282,
"acc_norm_stderr": 0.030194823996804475
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.859504132231405,
"acc_stderr": 0.03172233426002157,
"acc_norm": 0.859504132231405,
"acc_norm_stderr": 0.03172233426002157
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.8240740740740741,
"acc_stderr": 0.036809181416738807,
"acc_norm": 0.8240740740740741,
"acc_norm_stderr": 0.036809181416738807
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.803680981595092,
"acc_stderr": 0.031207970394709218,
"acc_norm": 0.803680981595092,
"acc_norm_stderr": 0.031207970394709218
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.5089285714285714,
"acc_stderr": 0.04745033255489122,
"acc_norm": 0.5089285714285714,
"acc_norm_stderr": 0.04745033255489122
},
"harness|hendrycksTest-management|5": {
"acc": 0.8252427184466019,
"acc_stderr": 0.0376017800602662,
"acc_norm": 0.8252427184466019,
"acc_norm_stderr": 0.0376017800602662
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.9017094017094017,
"acc_stderr": 0.019503444900757567,
"acc_norm": 0.9017094017094017,
"acc_norm_stderr": 0.019503444900757567
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.71,
"acc_stderr": 0.04560480215720684,
"acc_norm": 0.71,
"acc_norm_stderr": 0.04560480215720684
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8684546615581098,
"acc_stderr": 0.01208670521425043,
"acc_norm": 0.8684546615581098,
"acc_norm_stderr": 0.01208670521425043
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7803468208092486,
"acc_stderr": 0.022289638852617893,
"acc_norm": 0.7803468208092486,
"acc_norm_stderr": 0.022289638852617893
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.6044692737430167,
"acc_stderr": 0.01635341541007577,
"acc_norm": 0.6044692737430167,
"acc_norm_stderr": 0.01635341541007577
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7679738562091504,
"acc_stderr": 0.024170840879340873,
"acc_norm": 0.7679738562091504,
"acc_norm_stderr": 0.024170840879340873
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7781350482315113,
"acc_stderr": 0.02359885829286305,
"acc_norm": 0.7781350482315113,
"acc_norm_stderr": 0.02359885829286305
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.8333333333333334,
"acc_stderr": 0.020736358408060006,
"acc_norm": 0.8333333333333334,
"acc_norm_stderr": 0.020736358408060006
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.574468085106383,
"acc_stderr": 0.029494827600144366,
"acc_norm": 0.574468085106383,
"acc_norm_stderr": 0.029494827600144366
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.5521512385919165,
"acc_stderr": 0.012700582404768235,
"acc_norm": 0.5521512385919165,
"acc_norm_stderr": 0.012700582404768235
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.7389705882352942,
"acc_stderr": 0.02667925227010314,
"acc_norm": 0.7389705882352942,
"acc_norm_stderr": 0.02667925227010314
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.7647058823529411,
"acc_stderr": 0.01716058723504635,
"acc_norm": 0.7647058823529411,
"acc_norm_stderr": 0.01716058723504635
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.7454545454545455,
"acc_stderr": 0.041723430387053825,
"acc_norm": 0.7454545454545455,
"acc_norm_stderr": 0.041723430387053825
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.8204081632653061,
"acc_stderr": 0.024573293589585637,
"acc_norm": 0.8204081632653061,
"acc_norm_stderr": 0.024573293589585637
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8756218905472637,
"acc_stderr": 0.023335401790166327,
"acc_norm": 0.8756218905472637,
"acc_norm_stderr": 0.023335401790166327
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.9,
"acc_stderr": 0.030151134457776334,
"acc_norm": 0.9,
"acc_norm_stderr": 0.030151134457776334
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5301204819277109,
"acc_stderr": 0.03885425420866767,
"acc_norm": 0.5301204819277109,
"acc_norm_stderr": 0.03885425420866767
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8771929824561403,
"acc_stderr": 0.02517298435015575,
"acc_norm": 0.8771929824561403,
"acc_norm_stderr": 0.02517298435015575
},
"harness|truthfulqa:mc|0": {
"mc1": 0.44430844553243576,
"mc1_stderr": 0.017394586250743173,
"mc2": 0.6224972679005382,
"mc2_stderr": 0.014880875055625352
}
}
```
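To work with these aggregated numbers programmatically instead of copying them from the JSON above, the `results` configuration declared in the metadata can be loaded directly (a minimal sketch; the config and split names are taken from the `configs` section of this card):
```python
from datasets import load_dataset

# The "results" config stores the aggregated metrics for each run.
results = load_dataset(
    "open-llm-leaderboard/details_deepnight-research__llama-2-70B-inst",
    "results",
    split="latest",
)
print(results[0])  # inspect the aggregated scores for the latest run
```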
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
MMVP/MMVP_VLM | ---
license: mit
task_categories:
- zero-shot-classification
size_categories:
- n<1K
---
# MMVP-VLM Benchmark Datacard
## Basic Information
**Title:** MMVP-VLM Benchmark
**Description:** The MMVP-VLM (Multimodal Visual Patterns - Visual Language Models) Benchmark is designed to systematically evaluate the performance of recent CLIP-based models in understanding and processing visual patterns. It distills a subset of questions from the original MMVP benchmark into simpler language descriptions, categorizing them into distinct visual patterns. Each visual pattern is represented by 15 text-image pairs. The benchmark assesses whether CLIP models can accurately match these image-text combinations, providing insights into the capabilities and limitations of these models.
## Dataset Details
- **Content Types:** Text-Image Pairs
- **Volume:** Balanced number of questions for each visual pattern, with each pattern represented by 15 pairs.
- **Source of Data:** Subset from MMVP benchmark, supplemented with additional questions for balance
- **Data Collection Method:** Distillation and categorization of questions from MMVP benchmark into simpler language
## Usage
### Intended Use
- Evaluation of CLIP models' ability to understand and process various visual patterns.
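As an illustration of the pair-matching protocol described above, the sketch below scores one image against two candidate captions with a CLIP model via the Hugging Face `transformers` API. The blank image and the captions are placeholders standing in for one of the benchmark's text-image pairs, since the dataset's exact column names are not documented here:
```python
from PIL import Image
from transformers import CLIPModel, CLIPProcessor

model = CLIPModel.from_pretrained("openai/clip-vit-base-patch32")
processor = CLIPProcessor.from_pretrained("openai/clip-vit-base-patch32")

# Placeholder image and captions; in the benchmark, each visual pattern
# contributes 15 text-image pairs and the model should rank the correct
# caption above the distractor for its paired image.
image = Image.new("RGB", (224, 224), color="white")
captions = ["a photo of a dog facing left", "a photo of a dog facing right"]

inputs = processor(text=captions, images=image, return_tensors="pt", padding=True)
logits = model(**inputs).logits_per_image  # shape: (1, num_captions)
best = logits.argmax(dim=-1).item()
print(f"CLIP prefers caption {best}: {captions[best]!r}")
```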
|
mxronga/radioibadan | ---
license: apache-2.0
language:
- yo
tags:
- pretrain
--- |
FudanSELab/SO_KGXQR_DUPLICATE | ---
license: mit
dataset_info:
- config_name: duplicate_csharp
features:
- name: query
dtype: string
- name: relevant
sequence: string
splits:
- name: test
num_bytes: 91485
num_examples: 1200
download_size: 61619
dataset_size: 91485
- config_name: duplicate_java
features:
- name: query
dtype: string
- name: relevant
sequence: string
splits:
- name: test
num_bytes: 102838
num_examples: 1200
download_size: 69239
dataset_size: 102838
- config_name: duplicate_javascript
features:
- name: query
dtype: string
- name: relevant
sequence: string
splits:
- name: test
num_bytes: 107321
num_examples: 1200
download_size: 69456
dataset_size: 107321
- config_name: duplicate_python
features:
- name: query
dtype: string
- name: relevant
sequence: string
splits:
- name: test
num_bytes: 109709
num_examples: 1200
download_size: 73833
dataset_size: 109709
configs:
- config_name: duplicate_csharp
data_files:
- split: test
path: duplicate_csharp/test-*
- config_name: duplicate_java
data_files:
- split: test
path: duplicate_java/test-*
- config_name: duplicate_javascript
data_files:
- split: test
path: duplicate_javascript/test-*
- config_name: duplicate_python
data_files:
- split: test
path: duplicate_python/test-*
language:
- en
size_categories:
- 1K<n<10K
---
## Dataset Description
- **Repository:** [GitHub Repository](https://kgxqr.github.io/)
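A quick way to inspect one of the language-specific configurations (a minimal sketch; the config name, split, and field names below are taken from the YAML metadata above):
```python
from datasets import load_dataset

# Each config holds 1,200 test queries paired with their relevant entries.
ds = load_dataset("FudanSELab/SO_KGXQR_DUPLICATE", "duplicate_python", split="test")
example = ds[0]
print(example["query"])     # natural-language query string
print(example["relevant"])  # list of relevant (duplicate) identifiers
```
|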
KatoHF/chatbot_arena_binarized | ---
dataset_info:
features:
- name: text
dtype: string
- name: label
dtype: int64
splits:
- name: train
num_bytes: 54050415.72260606
num_examples: 46588
download_size: 25102598
dataset_size: 54050415.72260606
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
irds/wikiclir_ru | ---
pretty_name: '`wikiclir/ru`'
viewer: false
source_datasets: []
task_categories:
- text-retrieval
---
# Dataset Card for `wikiclir/ru`
The `wikiclir/ru` dataset, provided by the [ir-datasets](https://ir-datasets.com/) package.
For more information about the dataset, see the [documentation](https://ir-datasets.com/wikiclir#wikiclir/ru).
# Data
This dataset provides:
- `docs` (documents, i.e., the corpus); count=1,413,945
- `queries` (i.e., topics); count=664,924
- `qrels`: (relevance assessments); count=2,321,384
## Usage
```python
from datasets import load_dataset
docs = load_dataset('irds/wikiclir_ru', 'docs')
for record in docs:
record # {'doc_id': ..., 'title': ..., 'text': ...}
queries = load_dataset('irds/wikiclir_ru', 'queries')
for record in queries:
record # {'query_id': ..., 'text': ...}
qrels = load_dataset('irds/wikiclir_ru', 'qrels')
for record in qrels:
record # {'query_id': ..., 'doc_id': ..., 'relevance': ..., 'iteration': ...}
```
Note that calling `load_dataset` will download the dataset (or provide access instructions when it's not public) and make a copy of the
data in 🤗 Dataset format.
## Citation Information
```
@inproceedings{sasaki-etal-2018-cross,
title = "Cross-Lingual Learning-to-Rank with Shared Representations",
author = "Sasaki, Shota and
Sun, Shuo and
Schamoni, Shigehiko and
Duh, Kevin and
Inui, Kentaro",
booktitle = "Proceedings of the 2018 Conference of the North {A}merican Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 2 (Short Papers)",
month = jun,
year = "2018",
address = "New Orleans, Louisiana",
publisher = "Association for Computational Linguistics",
url = "https://aclanthology.org/N18-2073",
doi = "10.18653/v1/N18-2073",
pages = "458--463"
}
```
|
autoevaluate/autoeval-eval-imdb-plain_text-87fbde-67095145592 | ---
type: predictions
tags:
- autotrain
- evaluation
datasets:
- imdb
eval_info:
task: summarization
model: t5-small
metrics: []
dataset_name: imdb
dataset_config: plain_text
dataset_split: test
col_mapping:
text: text
target: label
---
# Dataset Card for AutoTrain Evaluator
This repository contains model predictions generated by [AutoTrain](https://huggingface.co/autotrain) for the following task and dataset:
* Task: Summarization
* Model: t5-small
* Dataset: imdb
* Config: plain_text
* Split: test
To run new evaluation jobs, visit Hugging Face's [automatic model evaluator](https://huggingface.co/spaces/autoevaluate/model-evaluator).
## Contributions
Thanks to [@michaeldesmond](https://huggingface.co/michaeldesmond) for evaluating this model. |
open-llm-leaderboard/details_Aspik101__vicuna-7b-v1.3-instruct-pl-lora_unload | ---
pretty_name: Evaluation run of Aspik101/vicuna-7b-v1.3-instruct-pl-lora_unload
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [Aspik101/vicuna-7b-v1.3-instruct-pl-lora_unload](https://huggingface.co/Aspik101/vicuna-7b-v1.3-instruct-pl-lora_unload)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 64 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 2 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Aspik101__vicuna-7b-v1.3-instruct-pl-lora_unload\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-09-23T04:12:47.025545](https://huggingface.co/datasets/open-llm-leaderboard/details_Aspik101__vicuna-7b-v1.3-instruct-pl-lora_unload/blob/main/results_2023-09-23T04-12-47.025545.json) (note
\ that there might be results for other tasks in the repo if successive evals didn't\
\ cover the same tasks; you can find each one in the \"results\" config and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.002307046979865772,\n\
\ \"em_stderr\": 0.0004913221265094532,\n \"f1\": 0.05567638422818793,\n\
\ \"f1_stderr\": 0.001338509283292818,\n \"acc\": 0.38151825095307307,\n\
\ \"acc_stderr\": 0.009759837355311614\n },\n \"harness|drop|3\": {\n\
\ \"em\": 0.002307046979865772,\n \"em_stderr\": 0.0004913221265094532,\n\
\ \"f1\": 0.05567638422818793,\n \"f1_stderr\": 0.001338509283292818\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.0621683093252464,\n \
\ \"acc_stderr\": 0.00665103564453169\n },\n \"harness|winogrande|5\":\
\ {\n \"acc\": 0.7008681925808997,\n \"acc_stderr\": 0.012868639066091536\n\
\ }\n}\n```"
repo_url: https://huggingface.co/Aspik101/vicuna-7b-v1.3-instruct-pl-lora_unload
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_07_25T09_51_14.882748
path:
- '**/details_harness|arc:challenge|25_2023-07-25T09:51:14.882748.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-07-25T09:51:14.882748.parquet'
- config_name: harness_drop_3
data_files:
- split: 2023_09_23T04_12_47.025545
path:
- '**/details_harness|drop|3_2023-09-23T04-12-47.025545.parquet'
- split: latest
path:
- '**/details_harness|drop|3_2023-09-23T04-12-47.025545.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_09_23T04_12_47.025545
path:
- '**/details_harness|gsm8k|5_2023-09-23T04-12-47.025545.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-09-23T04-12-47.025545.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_07_25T09_51_14.882748
path:
- '**/details_harness|hellaswag|10_2023-07-25T09:51:14.882748.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-07-25T09:51:14.882748.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_07_25T09_51_14.882748
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-25T09:51:14.882748.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-25T09:51:14.882748.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-25T09:51:14.882748.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-25T09:51:14.882748.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-25T09:51:14.882748.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-25T09:51:14.882748.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-25T09:51:14.882748.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-25T09:51:14.882748.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-25T09:51:14.882748.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-25T09:51:14.882748.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-25T09:51:14.882748.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-25T09:51:14.882748.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-25T09:51:14.882748.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-25T09:51:14.882748.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-25T09:51:14.882748.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-25T09:51:14.882748.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-25T09:51:14.882748.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-25T09:51:14.882748.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-25T09:51:14.882748.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-25T09:51:14.882748.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-25T09:51:14.882748.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-25T09:51:14.882748.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-25T09:51:14.882748.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-25T09:51:14.882748.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-25T09:51:14.882748.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-25T09:51:14.882748.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-25T09:51:14.882748.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-25T09:51:14.882748.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-25T09:51:14.882748.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-25T09:51:14.882748.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-25T09:51:14.882748.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-25T09:51:14.882748.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-25T09:51:14.882748.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-25T09:51:14.882748.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-07-25T09:51:14.882748.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-25T09:51:14.882748.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-25T09:51:14.882748.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-25T09:51:14.882748.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-07-25T09:51:14.882748.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-07-25T09:51:14.882748.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-25T09:51:14.882748.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-25T09:51:14.882748.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-25T09:51:14.882748.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-25T09:51:14.882748.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-25T09:51:14.882748.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-25T09:51:14.882748.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-25T09:51:14.882748.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-25T09:51:14.882748.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-25T09:51:14.882748.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-25T09:51:14.882748.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-25T09:51:14.882748.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-25T09:51:14.882748.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-25T09:51:14.882748.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-07-25T09:51:14.882748.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-25T09:51:14.882748.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-07-25T09:51:14.882748.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-25T09:51:14.882748.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-25T09:51:14.882748.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-25T09:51:14.882748.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-25T09:51:14.882748.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-25T09:51:14.882748.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-25T09:51:14.882748.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-25T09:51:14.882748.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-25T09:51:14.882748.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-25T09:51:14.882748.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-25T09:51:14.882748.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-25T09:51:14.882748.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-25T09:51:14.882748.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-25T09:51:14.882748.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-25T09:51:14.882748.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-25T09:51:14.882748.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-25T09:51:14.882748.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-25T09:51:14.882748.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-25T09:51:14.882748.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-25T09:51:14.882748.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-25T09:51:14.882748.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-25T09:51:14.882748.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-25T09:51:14.882748.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-25T09:51:14.882748.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-25T09:51:14.882748.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-25T09:51:14.882748.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-25T09:51:14.882748.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-25T09:51:14.882748.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-25T09:51:14.882748.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-25T09:51:14.882748.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-25T09:51:14.882748.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-25T09:51:14.882748.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-25T09:51:14.882748.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-25T09:51:14.882748.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-25T09:51:14.882748.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-25T09:51:14.882748.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-07-25T09:51:14.882748.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-25T09:51:14.882748.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-25T09:51:14.882748.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-25T09:51:14.882748.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-07-25T09:51:14.882748.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-07-25T09:51:14.882748.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-25T09:51:14.882748.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-25T09:51:14.882748.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-25T09:51:14.882748.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-25T09:51:14.882748.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-25T09:51:14.882748.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-25T09:51:14.882748.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-25T09:51:14.882748.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-25T09:51:14.882748.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-25T09:51:14.882748.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-25T09:51:14.882748.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-25T09:51:14.882748.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-25T09:51:14.882748.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-25T09:51:14.882748.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-07-25T09:51:14.882748.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-25T09:51:14.882748.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-07-25T09:51:14.882748.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-25T09:51:14.882748.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_07_25T09_51_14.882748
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-25T09:51:14.882748.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-25T09:51:14.882748.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_07_25T09_51_14.882748
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-25T09:51:14.882748.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-25T09:51:14.882748.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_07_25T09_51_14.882748
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-25T09:51:14.882748.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-25T09:51:14.882748.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_07_25T09_51_14.882748
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-25T09:51:14.882748.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-25T09:51:14.882748.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_07_25T09_51_14.882748
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-25T09:51:14.882748.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-25T09:51:14.882748.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_07_25T09_51_14.882748
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-25T09:51:14.882748.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-25T09:51:14.882748.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_07_25T09_51_14.882748
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-25T09:51:14.882748.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-25T09:51:14.882748.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_07_25T09_51_14.882748
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-25T09:51:14.882748.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-25T09:51:14.882748.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_07_25T09_51_14.882748
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-25T09:51:14.882748.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-25T09:51:14.882748.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_07_25T09_51_14.882748
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-25T09:51:14.882748.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-25T09:51:14.882748.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_07_25T09_51_14.882748
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-25T09:51:14.882748.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-25T09:51:14.882748.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_07_25T09_51_14.882748
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-25T09:51:14.882748.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-25T09:51:14.882748.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_07_25T09_51_14.882748
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-25T09:51:14.882748.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-25T09:51:14.882748.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_07_25T09_51_14.882748
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-25T09:51:14.882748.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-25T09:51:14.882748.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_07_25T09_51_14.882748
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-25T09:51:14.882748.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-25T09:51:14.882748.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_07_25T09_51_14.882748
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-25T09:51:14.882748.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-25T09:51:14.882748.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_07_25T09_51_14.882748
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-25T09:51:14.882748.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-25T09:51:14.882748.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_07_25T09_51_14.882748
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-25T09:51:14.882748.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-25T09:51:14.882748.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_07_25T09_51_14.882748
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-25T09:51:14.882748.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-25T09:51:14.882748.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_07_25T09_51_14.882748
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-25T09:51:14.882748.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-25T09:51:14.882748.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_07_25T09_51_14.882748
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-25T09:51:14.882748.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-25T09:51:14.882748.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_07_25T09_51_14.882748
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-25T09:51:14.882748.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-25T09:51:14.882748.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_07_25T09_51_14.882748
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-25T09:51:14.882748.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-25T09:51:14.882748.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_07_25T09_51_14.882748
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-25T09:51:14.882748.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-25T09:51:14.882748.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_07_25T09_51_14.882748
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-25T09:51:14.882748.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-25T09:51:14.882748.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_07_25T09_51_14.882748
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-25T09:51:14.882748.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-25T09:51:14.882748.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_07_25T09_51_14.882748
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-25T09:51:14.882748.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-25T09:51:14.882748.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_07_25T09_51_14.882748
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-25T09:51:14.882748.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-25T09:51:14.882748.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_07_25T09_51_14.882748
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-25T09:51:14.882748.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-25T09:51:14.882748.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_07_25T09_51_14.882748
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-25T09:51:14.882748.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-25T09:51:14.882748.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_07_25T09_51_14.882748
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-25T09:51:14.882748.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-25T09:51:14.882748.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_07_25T09_51_14.882748
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-25T09:51:14.882748.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-25T09:51:14.882748.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_07_25T09_51_14.882748
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-25T09:51:14.882748.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-25T09:51:14.882748.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_07_25T09_51_14.882748
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-25T09:51:14.882748.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-25T09:51:14.882748.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_07_25T09_51_14.882748
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-07-25T09:51:14.882748.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-07-25T09:51:14.882748.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_07_25T09_51_14.882748
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-25T09:51:14.882748.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-25T09:51:14.882748.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_07_25T09_51_14.882748
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-25T09:51:14.882748.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-25T09:51:14.882748.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_07_25T09_51_14.882748
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-25T09:51:14.882748.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-25T09:51:14.882748.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_07_25T09_51_14.882748
path:
- '**/details_harness|hendrycksTest-management|5_2023-07-25T09:51:14.882748.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-07-25T09:51:14.882748.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_07_25T09_51_14.882748
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-07-25T09:51:14.882748.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-07-25T09:51:14.882748.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_07_25T09_51_14.882748
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-25T09:51:14.882748.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-25T09:51:14.882748.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_07_25T09_51_14.882748
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-25T09:51:14.882748.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-25T09:51:14.882748.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_07_25T09_51_14.882748
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-25T09:51:14.882748.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-25T09:51:14.882748.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_07_25T09_51_14.882748
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-25T09:51:14.882748.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-25T09:51:14.882748.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_07_25T09_51_14.882748
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-25T09:51:14.882748.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-25T09:51:14.882748.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_07_25T09_51_14.882748
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-25T09:51:14.882748.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-25T09:51:14.882748.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_07_25T09_51_14.882748
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-25T09:51:14.882748.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-25T09:51:14.882748.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_07_25T09_51_14.882748
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-25T09:51:14.882748.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-25T09:51:14.882748.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_07_25T09_51_14.882748
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-25T09:51:14.882748.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-25T09:51:14.882748.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_07_25T09_51_14.882748
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-25T09:51:14.882748.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-25T09:51:14.882748.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_07_25T09_51_14.882748
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-25T09:51:14.882748.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-25T09:51:14.882748.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_07_25T09_51_14.882748
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-25T09:51:14.882748.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-25T09:51:14.882748.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_07_25T09_51_14.882748
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-25T09:51:14.882748.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-25T09:51:14.882748.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_07_25T09_51_14.882748
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-07-25T09:51:14.882748.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-07-25T09:51:14.882748.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_07_25T09_51_14.882748
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-25T09:51:14.882748.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-25T09:51:14.882748.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_07_25T09_51_14.882748
path:
- '**/details_harness|hendrycksTest-virology|5_2023-07-25T09:51:14.882748.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-07-25T09:51:14.882748.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_07_25T09_51_14.882748
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-25T09:51:14.882748.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-25T09:51:14.882748.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_07_25T09_51_14.882748
path:
- '**/details_harness|truthfulqa:mc|0_2023-07-25T09:51:14.882748.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-07-25T09:51:14.882748.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_09_23T04_12_47.025545
path:
- '**/details_harness|winogrande|5_2023-09-23T04-12-47.025545.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-09-23T04-12-47.025545.parquet'
- config_name: results
data_files:
- split: 2023_07_25T09_51_14.882748
path:
- results_2023-07-25T09:51:14.882748.parquet
- split: 2023_09_23T04_12_47.025545
path:
- results_2023-09-23T04-12-47.025545.parquet
- split: latest
path:
- results_2023-09-23T04-12-47.025545.parquet
---
# Dataset Card for Evaluation run of Aspik101/vicuna-7b-v1.3-instruct-pl-lora_unload
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/Aspik101/vicuna-7b-v1.3-instruct-pl-lora_unload
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [Aspik101/vicuna-7b-v1.3-instruct-pl-lora_unload](https://huggingface.co/Aspik101/vicuna-7b-v1.3-instruct-pl-lora_unload) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_Aspik101__vicuna-7b-v1.3-instruct-pl-lora_unload",
"harness_winogrande_5",
split="train")
```
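The `configs` section above also declares a `results` configuration with a `latest` split alias, so the aggregated metrics shown below can be loaded the same way (a sketch based on that config list):
```python
from datasets import load_dataset

# "results" (config) and "latest" (split) are declared in this card's metadata.
results = load_dataset(
    "open-llm-leaderboard/details_Aspik101__vicuna-7b-v1.3-instruct-pl-lora_unload",
    "results",
    split="latest",
)
print(results[0])  # aggregated metrics for the most recent run
```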
## Latest results
These are the [latest results from run 2023-09-23T04:12:47.025545](https://huggingface.co/datasets/open-llm-leaderboard/details_Aspik101__vicuna-7b-v1.3-instruct-pl-lora_unload/blob/main/results_2023-09-23T04-12-47.025545.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each one in the "results" config and the "latest" split for each eval):
```python
{
"all": {
"em": 0.002307046979865772,
"em_stderr": 0.0004913221265094532,
"f1": 0.05567638422818793,
"f1_stderr": 0.001338509283292818,
"acc": 0.38151825095307307,
"acc_stderr": 0.009759837355311614
},
"harness|drop|3": {
"em": 0.002307046979865772,
"em_stderr": 0.0004913221265094532,
"f1": 0.05567638422818793,
"f1_stderr": 0.001338509283292818
},
"harness|gsm8k|5": {
"acc": 0.0621683093252464,
"acc_stderr": 0.00665103564453169
},
"harness|winogrande|5": {
"acc": 0.7008681925808997,
"acc_stderr": 0.012868639066091536
}
}
```
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
shields/catalan_commonvoice_first15hr_processed | ---
dataset_info:
features:
- name: input_features
sequence:
sequence: float32
- name: labels
sequence: int64
splits:
- name: train
num_bytes: 6723710888
num_examples: 7000
- name: val
num_bytes: 2881592776
num_examples: 3000
download_size: 1776942256
dataset_size: 9605303664
---
# Dataset Card for "catalan_commonvoice_first15hr_processed"
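This auto-generated card has no usage notes; below is a minimal loading sketch based only on the `dataset_info` above (the Whisper-style reading of `input_features`/`labels` is an assumption drawn from the feature names):
```python
from datasets import load_dataset

ds = load_dataset("shields/catalan_commonvoice_first15hr_processed")
train, val = ds["train"], ds["val"]  # the two splits listed in dataset_info
example = train[0]
print(len(example["input_features"]))  # float32 feature frames (likely log-mel)
print(example["labels"][:10])          # int64 label ids (likely a tokenized transcript)
```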
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
SFVII/test-data | ---
task_categories:
- text-generation
size_categories:
- n<1K
--- |
open-llm-leaderboard/details_Sao10K__Test-Instruct-Solar-v1 | ---
pretty_name: Evaluation run of Sao10K/Test-Instruct-Solar-v1
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [Sao10K/Test-Instruct-Solar-v1](https://huggingface.co/Sao10K/Test-Instruct-Solar-v1)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Sao10K__Test-Instruct-Solar-v1\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-02-10T15:38:51.423124](https://huggingface.co/datasets/open-llm-leaderboard/details_Sao10K__Test-Instruct-Solar-v1/blob/main/results_2024-02-10T15-38-51.423124.json) (note
\ that there might be results for other tasks in the repo if successive evals didn't\
\ cover the same tasks; you can find each one in the \"results\" config and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6669839904344624,\n\
\ \"acc_stderr\": 0.031337299441269166,\n \"acc_norm\": 0.6676028712450298,\n\
\ \"acc_norm_stderr\": 0.031976268016343144,\n \"mc1\": 0.4883720930232558,\n\
\ \"mc1_stderr\": 0.017498767175740088,\n \"mc2\": 0.6263828040191523,\n\
\ \"mc2_stderr\": 0.015723023734478345\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.6732081911262798,\n \"acc_stderr\": 0.013706665975587331,\n\
\ \"acc_norm\": 0.7039249146757679,\n \"acc_norm_stderr\": 0.01334091608524625\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.698864767974507,\n\
\ \"acc_stderr\": 0.004578137949298176,\n \"acc_norm\": 0.8776140211113324,\n\
\ \"acc_norm_stderr\": 0.003270612753613392\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.43,\n \"acc_stderr\": 0.049756985195624284,\n \
\ \"acc_norm\": 0.43,\n \"acc_norm_stderr\": 0.049756985195624284\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.5925925925925926,\n\
\ \"acc_stderr\": 0.04244633238353227,\n \"acc_norm\": 0.5925925925925926,\n\
\ \"acc_norm_stderr\": 0.04244633238353227\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.756578947368421,\n \"acc_stderr\": 0.034923496688842384,\n\
\ \"acc_norm\": 0.756578947368421,\n \"acc_norm_stderr\": 0.034923496688842384\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.75,\n\
\ \"acc_stderr\": 0.04351941398892446,\n \"acc_norm\": 0.75,\n \
\ \"acc_norm_stderr\": 0.04351941398892446\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.7056603773584905,\n \"acc_stderr\": 0.028049186315695255,\n\
\ \"acc_norm\": 0.7056603773584905,\n \"acc_norm_stderr\": 0.028049186315695255\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7638888888888888,\n\
\ \"acc_stderr\": 0.03551446610810826,\n \"acc_norm\": 0.7638888888888888,\n\
\ \"acc_norm_stderr\": 0.03551446610810826\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.42,\n \"acc_stderr\": 0.049604496374885836,\n \
\ \"acc_norm\": 0.42,\n \"acc_norm_stderr\": 0.049604496374885836\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"\
acc\": 0.5,\n \"acc_stderr\": 0.050251890762960605,\n \"acc_norm\"\
: 0.5,\n \"acc_norm_stderr\": 0.050251890762960605\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.27,\n \"acc_stderr\": 0.044619604333847394,\n \
\ \"acc_norm\": 0.27,\n \"acc_norm_stderr\": 0.044619604333847394\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6705202312138728,\n\
\ \"acc_stderr\": 0.03583901754736413,\n \"acc_norm\": 0.6705202312138728,\n\
\ \"acc_norm_stderr\": 0.03583901754736413\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.35294117647058826,\n \"acc_stderr\": 0.04755129616062946,\n\
\ \"acc_norm\": 0.35294117647058826,\n \"acc_norm_stderr\": 0.04755129616062946\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.77,\n \"acc_stderr\": 0.042295258468165065,\n \"acc_norm\": 0.77,\n\
\ \"acc_norm_stderr\": 0.042295258468165065\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.6127659574468085,\n \"acc_stderr\": 0.03184389265339526,\n\
\ \"acc_norm\": 0.6127659574468085,\n \"acc_norm_stderr\": 0.03184389265339526\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.5087719298245614,\n\
\ \"acc_stderr\": 0.04702880432049615,\n \"acc_norm\": 0.5087719298245614,\n\
\ \"acc_norm_stderr\": 0.04702880432049615\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.6275862068965518,\n \"acc_stderr\": 0.04028731532947558,\n\
\ \"acc_norm\": 0.6275862068965518,\n \"acc_norm_stderr\": 0.04028731532947558\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.4894179894179894,\n \"acc_stderr\": 0.025745542276045478,\n \"\
acc_norm\": 0.4894179894179894,\n \"acc_norm_stderr\": 0.025745542276045478\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.4365079365079365,\n\
\ \"acc_stderr\": 0.04435932892851466,\n \"acc_norm\": 0.4365079365079365,\n\
\ \"acc_norm_stderr\": 0.04435932892851466\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.35,\n \"acc_stderr\": 0.047937248544110196,\n \
\ \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.047937248544110196\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\"\
: 0.8258064516129032,\n \"acc_stderr\": 0.021576248184514573,\n \"\
acc_norm\": 0.8258064516129032,\n \"acc_norm_stderr\": 0.021576248184514573\n\
\ },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\"\
: 0.4827586206896552,\n \"acc_stderr\": 0.035158955511656986,\n \"\
acc_norm\": 0.4827586206896552,\n \"acc_norm_stderr\": 0.035158955511656986\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.71,\n \"acc_stderr\": 0.045604802157206845,\n \"acc_norm\"\
: 0.71,\n \"acc_norm_stderr\": 0.045604802157206845\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.806060606060606,\n \"acc_stderr\": 0.03087414513656209,\n\
\ \"acc_norm\": 0.806060606060606,\n \"acc_norm_stderr\": 0.03087414513656209\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.8838383838383839,\n \"acc_stderr\": 0.022828881775249377,\n \"\
acc_norm\": 0.8838383838383839,\n \"acc_norm_stderr\": 0.022828881775249377\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.9119170984455959,\n \"acc_stderr\": 0.02045374660160103,\n\
\ \"acc_norm\": 0.9119170984455959,\n \"acc_norm_stderr\": 0.02045374660160103\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.6794871794871795,\n \"acc_stderr\": 0.02366129639396428,\n \
\ \"acc_norm\": 0.6794871794871795,\n \"acc_norm_stderr\": 0.02366129639396428\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.36666666666666664,\n \"acc_stderr\": 0.02938162072646507,\n \
\ \"acc_norm\": 0.36666666666666664,\n \"acc_norm_stderr\": 0.02938162072646507\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.7184873949579832,\n \"acc_stderr\": 0.029213549414372174,\n\
\ \"acc_norm\": 0.7184873949579832,\n \"acc_norm_stderr\": 0.029213549414372174\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.3708609271523179,\n \"acc_stderr\": 0.03943966699183629,\n \"\
acc_norm\": 0.3708609271523179,\n \"acc_norm_stderr\": 0.03943966699183629\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8495412844036697,\n \"acc_stderr\": 0.015328563932669237,\n \"\
acc_norm\": 0.8495412844036697,\n \"acc_norm_stderr\": 0.015328563932669237\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.5694444444444444,\n \"acc_stderr\": 0.03376922151252335,\n \"\
acc_norm\": 0.5694444444444444,\n \"acc_norm_stderr\": 0.03376922151252335\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.8480392156862745,\n \"acc_stderr\": 0.0251956584289318,\n \"acc_norm\"\
: 0.8480392156862745,\n \"acc_norm_stderr\": 0.0251956584289318\n },\n\
\ \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\":\
\ 0.8607594936708861,\n \"acc_stderr\": 0.022535526352692705,\n \"\
acc_norm\": 0.8607594936708861,\n \"acc_norm_stderr\": 0.022535526352692705\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6860986547085202,\n\
\ \"acc_stderr\": 0.03114679648297246,\n \"acc_norm\": 0.6860986547085202,\n\
\ \"acc_norm_stderr\": 0.03114679648297246\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.7251908396946565,\n \"acc_stderr\": 0.039153454088478354,\n\
\ \"acc_norm\": 0.7251908396946565,\n \"acc_norm_stderr\": 0.039153454088478354\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.8181818181818182,\n \"acc_stderr\": 0.03520893951097653,\n \"\
acc_norm\": 0.8181818181818182,\n \"acc_norm_stderr\": 0.03520893951097653\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.8333333333333334,\n\
\ \"acc_stderr\": 0.03602814176392645,\n \"acc_norm\": 0.8333333333333334,\n\
\ \"acc_norm_stderr\": 0.03602814176392645\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7423312883435583,\n \"acc_stderr\": 0.03436150827846917,\n\
\ \"acc_norm\": 0.7423312883435583,\n \"acc_norm_stderr\": 0.03436150827846917\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.45535714285714285,\n\
\ \"acc_stderr\": 0.04726835553719099,\n \"acc_norm\": 0.45535714285714285,\n\
\ \"acc_norm_stderr\": 0.04726835553719099\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.8349514563106796,\n \"acc_stderr\": 0.036756688322331886,\n\
\ \"acc_norm\": 0.8349514563106796,\n \"acc_norm_stderr\": 0.036756688322331886\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8803418803418803,\n\
\ \"acc_stderr\": 0.021262719400406957,\n \"acc_norm\": 0.8803418803418803,\n\
\ \"acc_norm_stderr\": 0.021262719400406957\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.74,\n \"acc_stderr\": 0.04408440022768078,\n \
\ \"acc_norm\": 0.74,\n \"acc_norm_stderr\": 0.04408440022768078\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8020434227330779,\n\
\ \"acc_stderr\": 0.014248873549217576,\n \"acc_norm\": 0.8020434227330779,\n\
\ \"acc_norm_stderr\": 0.014248873549217576\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.7398843930635838,\n \"acc_stderr\": 0.023618678310069367,\n\
\ \"acc_norm\": 0.7398843930635838,\n \"acc_norm_stderr\": 0.023618678310069367\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.39217877094972065,\n\
\ \"acc_stderr\": 0.01632906107320745,\n \"acc_norm\": 0.39217877094972065,\n\
\ \"acc_norm_stderr\": 0.01632906107320745\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.7581699346405228,\n \"acc_stderr\": 0.024518195641879334,\n\
\ \"acc_norm\": 0.7581699346405228,\n \"acc_norm_stderr\": 0.024518195641879334\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7331189710610932,\n\
\ \"acc_stderr\": 0.025122637608816646,\n \"acc_norm\": 0.7331189710610932,\n\
\ \"acc_norm_stderr\": 0.025122637608816646\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.7808641975308642,\n \"acc_stderr\": 0.023016705640262196,\n\
\ \"acc_norm\": 0.7808641975308642,\n \"acc_norm_stderr\": 0.023016705640262196\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.5070921985815603,\n \"acc_stderr\": 0.02982449855912901,\n \
\ \"acc_norm\": 0.5070921985815603,\n \"acc_norm_stderr\": 0.02982449855912901\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.48891786179921776,\n\
\ \"acc_stderr\": 0.01276709899852584,\n \"acc_norm\": 0.48891786179921776,\n\
\ \"acc_norm_stderr\": 0.01276709899852584\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.7389705882352942,\n \"acc_stderr\": 0.026679252270103128,\n\
\ \"acc_norm\": 0.7389705882352942,\n \"acc_norm_stderr\": 0.026679252270103128\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6977124183006536,\n \"acc_stderr\": 0.01857923271111388,\n \
\ \"acc_norm\": 0.6977124183006536,\n \"acc_norm_stderr\": 0.01857923271111388\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6909090909090909,\n\
\ \"acc_stderr\": 0.044262946482000985,\n \"acc_norm\": 0.6909090909090909,\n\
\ \"acc_norm_stderr\": 0.044262946482000985\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.7346938775510204,\n \"acc_stderr\": 0.028263889943784596,\n\
\ \"acc_norm\": 0.7346938775510204,\n \"acc_norm_stderr\": 0.028263889943784596\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8557213930348259,\n\
\ \"acc_stderr\": 0.024845753212306053,\n \"acc_norm\": 0.8557213930348259,\n\
\ \"acc_norm_stderr\": 0.024845753212306053\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.93,\n \"acc_stderr\": 0.025643239997624294,\n \
\ \"acc_norm\": 0.93,\n \"acc_norm_stderr\": 0.025643239997624294\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.572289156626506,\n\
\ \"acc_stderr\": 0.03851597683718533,\n \"acc_norm\": 0.572289156626506,\n\
\ \"acc_norm_stderr\": 0.03851597683718533\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.7777777777777778,\n \"acc_stderr\": 0.03188578017686398,\n\
\ \"acc_norm\": 0.7777777777777778,\n \"acc_norm_stderr\": 0.03188578017686398\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.4883720930232558,\n\
\ \"mc1_stderr\": 0.017498767175740088,\n \"mc2\": 0.6263828040191523,\n\
\ \"mc2_stderr\": 0.015723023734478345\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.8389897395422258,\n \"acc_stderr\": 0.010329712832785722\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.6679302501895376,\n \
\ \"acc_stderr\": 0.012972465034361856\n }\n}\n```"
repo_url: https://huggingface.co/Sao10K/Test-Instruct-Solar-v1
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_02_10T15_38_51.423124
path:
- '**/details_harness|arc:challenge|25_2024-02-10T15-38-51.423124.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-02-10T15-38-51.423124.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_02_10T15_38_51.423124
path:
- '**/details_harness|gsm8k|5_2024-02-10T15-38-51.423124.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-02-10T15-38-51.423124.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_02_10T15_38_51.423124
path:
- '**/details_harness|hellaswag|10_2024-02-10T15-38-51.423124.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-02-10T15-38-51.423124.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_02_10T15_38_51.423124
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-10T15-38-51.423124.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-10T15-38-51.423124.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-10T15-38-51.423124.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-10T15-38-51.423124.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-10T15-38-51.423124.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-10T15-38-51.423124.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-10T15-38-51.423124.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-10T15-38-51.423124.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-10T15-38-51.423124.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-10T15-38-51.423124.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-10T15-38-51.423124.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-10T15-38-51.423124.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-10T15-38-51.423124.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-10T15-38-51.423124.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-10T15-38-51.423124.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-10T15-38-51.423124.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-10T15-38-51.423124.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-10T15-38-51.423124.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-10T15-38-51.423124.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-10T15-38-51.423124.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-10T15-38-51.423124.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-10T15-38-51.423124.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-10T15-38-51.423124.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-10T15-38-51.423124.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-10T15-38-51.423124.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-10T15-38-51.423124.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-10T15-38-51.423124.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-10T15-38-51.423124.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-10T15-38-51.423124.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-10T15-38-51.423124.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-10T15-38-51.423124.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-10T15-38-51.423124.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-10T15-38-51.423124.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-10T15-38-51.423124.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-02-10T15-38-51.423124.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-10T15-38-51.423124.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-10T15-38-51.423124.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-10T15-38-51.423124.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-02-10T15-38-51.423124.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-02-10T15-38-51.423124.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-10T15-38-51.423124.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-10T15-38-51.423124.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-10T15-38-51.423124.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-10T15-38-51.423124.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-10T15-38-51.423124.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-10T15-38-51.423124.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-10T15-38-51.423124.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-10T15-38-51.423124.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-10T15-38-51.423124.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-10T15-38-51.423124.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-10T15-38-51.423124.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-10T15-38-51.423124.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-10T15-38-51.423124.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-02-10T15-38-51.423124.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-10T15-38-51.423124.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-02-10T15-38-51.423124.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-10T15-38-51.423124.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-10T15-38-51.423124.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-10T15-38-51.423124.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-10T15-38-51.423124.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-10T15-38-51.423124.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-10T15-38-51.423124.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-10T15-38-51.423124.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-10T15-38-51.423124.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-10T15-38-51.423124.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-10T15-38-51.423124.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-10T15-38-51.423124.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-10T15-38-51.423124.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-10T15-38-51.423124.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-10T15-38-51.423124.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-10T15-38-51.423124.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-10T15-38-51.423124.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-10T15-38-51.423124.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-10T15-38-51.423124.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-10T15-38-51.423124.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-10T15-38-51.423124.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-10T15-38-51.423124.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-10T15-38-51.423124.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-10T15-38-51.423124.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-10T15-38-51.423124.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-10T15-38-51.423124.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-10T15-38-51.423124.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-10T15-38-51.423124.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-10T15-38-51.423124.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-10T15-38-51.423124.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-10T15-38-51.423124.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-10T15-38-51.423124.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-10T15-38-51.423124.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-10T15-38-51.423124.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-10T15-38-51.423124.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-10T15-38-51.423124.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-02-10T15-38-51.423124.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-10T15-38-51.423124.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-10T15-38-51.423124.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-10T15-38-51.423124.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-02-10T15-38-51.423124.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-02-10T15-38-51.423124.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-10T15-38-51.423124.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-10T15-38-51.423124.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-10T15-38-51.423124.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-10T15-38-51.423124.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-10T15-38-51.423124.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-10T15-38-51.423124.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-10T15-38-51.423124.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-10T15-38-51.423124.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-10T15-38-51.423124.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-10T15-38-51.423124.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-10T15-38-51.423124.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-10T15-38-51.423124.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-10T15-38-51.423124.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-02-10T15-38-51.423124.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-10T15-38-51.423124.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-02-10T15-38-51.423124.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-10T15-38-51.423124.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_02_10T15_38_51.423124
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-10T15-38-51.423124.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-10T15-38-51.423124.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_02_10T15_38_51.423124
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-10T15-38-51.423124.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-10T15-38-51.423124.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_02_10T15_38_51.423124
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-10T15-38-51.423124.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-10T15-38-51.423124.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_02_10T15_38_51.423124
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-10T15-38-51.423124.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-10T15-38-51.423124.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_02_10T15_38_51.423124
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-10T15-38-51.423124.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-10T15-38-51.423124.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_02_10T15_38_51.423124
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-10T15-38-51.423124.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-10T15-38-51.423124.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_02_10T15_38_51.423124
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-10T15-38-51.423124.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-10T15-38-51.423124.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_02_10T15_38_51.423124
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-10T15-38-51.423124.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-10T15-38-51.423124.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_02_10T15_38_51.423124
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-10T15-38-51.423124.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-10T15-38-51.423124.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_02_10T15_38_51.423124
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-10T15-38-51.423124.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-10T15-38-51.423124.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_02_10T15_38_51.423124
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-10T15-38-51.423124.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-10T15-38-51.423124.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_02_10T15_38_51.423124
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-10T15-38-51.423124.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-10T15-38-51.423124.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_02_10T15_38_51.423124
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-10T15-38-51.423124.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-10T15-38-51.423124.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_02_10T15_38_51.423124
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-10T15-38-51.423124.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-10T15-38-51.423124.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_02_10T15_38_51.423124
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-10T15-38-51.423124.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-10T15-38-51.423124.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_02_10T15_38_51.423124
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-10T15-38-51.423124.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-10T15-38-51.423124.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_02_10T15_38_51.423124
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-10T15-38-51.423124.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-10T15-38-51.423124.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_02_10T15_38_51.423124
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-10T15-38-51.423124.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-10T15-38-51.423124.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_02_10T15_38_51.423124
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-10T15-38-51.423124.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-10T15-38-51.423124.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_02_10T15_38_51.423124
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-10T15-38-51.423124.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-10T15-38-51.423124.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_02_10T15_38_51.423124
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-10T15-38-51.423124.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-10T15-38-51.423124.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_02_10T15_38_51.423124
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-10T15-38-51.423124.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-10T15-38-51.423124.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_02_10T15_38_51.423124
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-10T15-38-51.423124.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-10T15-38-51.423124.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_02_10T15_38_51.423124
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-10T15-38-51.423124.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-10T15-38-51.423124.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_02_10T15_38_51.423124
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-10T15-38-51.423124.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-10T15-38-51.423124.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_02_10T15_38_51.423124
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-10T15-38-51.423124.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-10T15-38-51.423124.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_02_10T15_38_51.423124
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-10T15-38-51.423124.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-10T15-38-51.423124.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_02_10T15_38_51.423124
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-10T15-38-51.423124.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-10T15-38-51.423124.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_02_10T15_38_51.423124
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-10T15-38-51.423124.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-10T15-38-51.423124.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_02_10T15_38_51.423124
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-10T15-38-51.423124.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-10T15-38-51.423124.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_02_10T15_38_51.423124
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-10T15-38-51.423124.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-10T15-38-51.423124.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_02_10T15_38_51.423124
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-10T15-38-51.423124.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-10T15-38-51.423124.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_02_10T15_38_51.423124
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-10T15-38-51.423124.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-10T15-38-51.423124.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_02_10T15_38_51.423124
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-10T15-38-51.423124.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-10T15-38-51.423124.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_02_10T15_38_51.423124
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-02-10T15-38-51.423124.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-02-10T15-38-51.423124.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_02_10T15_38_51.423124
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-10T15-38-51.423124.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-10T15-38-51.423124.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_02_10T15_38_51.423124
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-10T15-38-51.423124.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-10T15-38-51.423124.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_02_10T15_38_51.423124
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-10T15-38-51.423124.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-10T15-38-51.423124.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_02_10T15_38_51.423124
path:
- '**/details_harness|hendrycksTest-management|5_2024-02-10T15-38-51.423124.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-02-10T15-38-51.423124.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_02_10T15_38_51.423124
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-02-10T15-38-51.423124.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-02-10T15-38-51.423124.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_02_10T15_38_51.423124
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-10T15-38-51.423124.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-10T15-38-51.423124.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_02_10T15_38_51.423124
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-10T15-38-51.423124.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-10T15-38-51.423124.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_02_10T15_38_51.423124
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-10T15-38-51.423124.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-10T15-38-51.423124.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_02_10T15_38_51.423124
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-10T15-38-51.423124.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-10T15-38-51.423124.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_02_10T15_38_51.423124
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-10T15-38-51.423124.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-10T15-38-51.423124.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_02_10T15_38_51.423124
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-10T15-38-51.423124.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-10T15-38-51.423124.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_02_10T15_38_51.423124
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-10T15-38-51.423124.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-10T15-38-51.423124.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_02_10T15_38_51.423124
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-10T15-38-51.423124.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-10T15-38-51.423124.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_02_10T15_38_51.423124
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-10T15-38-51.423124.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-10T15-38-51.423124.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_02_10T15_38_51.423124
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-10T15-38-51.423124.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-10T15-38-51.423124.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_02_10T15_38_51.423124
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-10T15-38-51.423124.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-10T15-38-51.423124.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_02_10T15_38_51.423124
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-10T15-38-51.423124.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-10T15-38-51.423124.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_02_10T15_38_51.423124
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-10T15-38-51.423124.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-10T15-38-51.423124.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_02_10T15_38_51.423124
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-02-10T15-38-51.423124.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-02-10T15-38-51.423124.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_02_10T15_38_51.423124
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-10T15-38-51.423124.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-10T15-38-51.423124.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_02_10T15_38_51.423124
path:
- '**/details_harness|hendrycksTest-virology|5_2024-02-10T15-38-51.423124.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-02-10T15-38-51.423124.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_02_10T15_38_51.423124
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-10T15-38-51.423124.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-10T15-38-51.423124.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_02_10T15_38_51.423124
path:
- '**/details_harness|truthfulqa:mc|0_2024-02-10T15-38-51.423124.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-02-10T15-38-51.423124.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_02_10T15_38_51.423124
path:
- '**/details_harness|winogrande|5_2024-02-10T15-38-51.423124.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-02-10T15-38-51.423124.parquet'
- config_name: results
data_files:
- split: 2024_02_10T15_38_51.423124
path:
- results_2024-02-10T15-38-51.423124.parquet
- split: latest
path:
- results_2024-02-10T15-38-51.423124.parquet
---
# Dataset Card for Evaluation run of Sao10K/Test-Instruct-Solar-v1
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [Sao10K/Test-Instruct-Solar-v1](https://huggingface.co/Sao10K/Test-Instruct-Solar-v1) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_Sao10K__Test-Instruct-Solar-v1",
"harness_winogrande_5",
split="train")
```
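If you only need the most recent results for a given task, each configuration also defines a "latest" split alias (a minimal sketch; the timestamped split names follow the run timestamps listed in the YAML header above):
```python
from datasets import load_dataset

# Each task configuration exposes timestamped splits plus a "latest" alias
# that always points to the most recent evaluation run.
latest = load_dataset(
    "open-llm-leaderboard/details_Sao10K__Test-Instruct-Solar-v1",
    "harness_winogrande_5",
    split="latest",
)
print(latest)
```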
## Latest results
These are the [latest results from run 2024-02-10T15:38:51.423124](https://huggingface.co/datasets/open-llm-leaderboard/details_Sao10K__Test-Instruct-Solar-v1/blob/main/results_2024-02-10T15-38-51.423124.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6669839904344624,
"acc_stderr": 0.031337299441269166,
"acc_norm": 0.6676028712450298,
"acc_norm_stderr": 0.031976268016343144,
"mc1": 0.4883720930232558,
"mc1_stderr": 0.017498767175740088,
"mc2": 0.6263828040191523,
"mc2_stderr": 0.015723023734478345
},
"harness|arc:challenge|25": {
"acc": 0.6732081911262798,
"acc_stderr": 0.013706665975587331,
"acc_norm": 0.7039249146757679,
"acc_norm_stderr": 0.01334091608524625
},
"harness|hellaswag|10": {
"acc": 0.698864767974507,
"acc_stderr": 0.004578137949298176,
"acc_norm": 0.8776140211113324,
"acc_norm_stderr": 0.003270612753613392
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.43,
"acc_stderr": 0.049756985195624284,
"acc_norm": 0.43,
"acc_norm_stderr": 0.049756985195624284
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.5925925925925926,
"acc_stderr": 0.04244633238353227,
"acc_norm": 0.5925925925925926,
"acc_norm_stderr": 0.04244633238353227
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.756578947368421,
"acc_stderr": 0.034923496688842384,
"acc_norm": 0.756578947368421,
"acc_norm_stderr": 0.034923496688842384
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.75,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.75,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.7056603773584905,
"acc_stderr": 0.028049186315695255,
"acc_norm": 0.7056603773584905,
"acc_norm_stderr": 0.028049186315695255
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7638888888888888,
"acc_stderr": 0.03551446610810826,
"acc_norm": 0.7638888888888888,
"acc_norm_stderr": 0.03551446610810826
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.42,
"acc_stderr": 0.049604496374885836,
"acc_norm": 0.42,
"acc_norm_stderr": 0.049604496374885836
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.5,
"acc_stderr": 0.050251890762960605,
"acc_norm": 0.5,
"acc_norm_stderr": 0.050251890762960605
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.27,
"acc_stderr": 0.044619604333847394,
"acc_norm": 0.27,
"acc_norm_stderr": 0.044619604333847394
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6705202312138728,
"acc_stderr": 0.03583901754736413,
"acc_norm": 0.6705202312138728,
"acc_norm_stderr": 0.03583901754736413
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.35294117647058826,
"acc_stderr": 0.04755129616062946,
"acc_norm": 0.35294117647058826,
"acc_norm_stderr": 0.04755129616062946
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.77,
"acc_stderr": 0.042295258468165065,
"acc_norm": 0.77,
"acc_norm_stderr": 0.042295258468165065
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.6127659574468085,
"acc_stderr": 0.03184389265339526,
"acc_norm": 0.6127659574468085,
"acc_norm_stderr": 0.03184389265339526
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.5087719298245614,
"acc_stderr": 0.04702880432049615,
"acc_norm": 0.5087719298245614,
"acc_norm_stderr": 0.04702880432049615
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.6275862068965518,
"acc_stderr": 0.04028731532947558,
"acc_norm": 0.6275862068965518,
"acc_norm_stderr": 0.04028731532947558
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.4894179894179894,
"acc_stderr": 0.025745542276045478,
"acc_norm": 0.4894179894179894,
"acc_norm_stderr": 0.025745542276045478
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.4365079365079365,
"acc_stderr": 0.04435932892851466,
"acc_norm": 0.4365079365079365,
"acc_norm_stderr": 0.04435932892851466
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.35,
"acc_stderr": 0.047937248544110196,
"acc_norm": 0.35,
"acc_norm_stderr": 0.047937248544110196
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.8258064516129032,
"acc_stderr": 0.021576248184514573,
"acc_norm": 0.8258064516129032,
"acc_norm_stderr": 0.021576248184514573
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.4827586206896552,
"acc_stderr": 0.035158955511656986,
"acc_norm": 0.4827586206896552,
"acc_norm_stderr": 0.035158955511656986
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.71,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.71,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.806060606060606,
"acc_stderr": 0.03087414513656209,
"acc_norm": 0.806060606060606,
"acc_norm_stderr": 0.03087414513656209
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.8838383838383839,
"acc_stderr": 0.022828881775249377,
"acc_norm": 0.8838383838383839,
"acc_norm_stderr": 0.022828881775249377
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.9119170984455959,
"acc_stderr": 0.02045374660160103,
"acc_norm": 0.9119170984455959,
"acc_norm_stderr": 0.02045374660160103
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6794871794871795,
"acc_stderr": 0.02366129639396428,
"acc_norm": 0.6794871794871795,
"acc_norm_stderr": 0.02366129639396428
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.36666666666666664,
"acc_stderr": 0.02938162072646507,
"acc_norm": 0.36666666666666664,
"acc_norm_stderr": 0.02938162072646507
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.7184873949579832,
"acc_stderr": 0.029213549414372174,
"acc_norm": 0.7184873949579832,
"acc_norm_stderr": 0.029213549414372174
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.3708609271523179,
"acc_stderr": 0.03943966699183629,
"acc_norm": 0.3708609271523179,
"acc_norm_stderr": 0.03943966699183629
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8495412844036697,
"acc_stderr": 0.015328563932669237,
"acc_norm": 0.8495412844036697,
"acc_norm_stderr": 0.015328563932669237
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5694444444444444,
"acc_stderr": 0.03376922151252335,
"acc_norm": 0.5694444444444444,
"acc_norm_stderr": 0.03376922151252335
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8480392156862745,
"acc_stderr": 0.0251956584289318,
"acc_norm": 0.8480392156862745,
"acc_norm_stderr": 0.0251956584289318
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.8607594936708861,
"acc_stderr": 0.022535526352692705,
"acc_norm": 0.8607594936708861,
"acc_norm_stderr": 0.022535526352692705
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6860986547085202,
"acc_stderr": 0.03114679648297246,
"acc_norm": 0.6860986547085202,
"acc_norm_stderr": 0.03114679648297246
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7251908396946565,
"acc_stderr": 0.039153454088478354,
"acc_norm": 0.7251908396946565,
"acc_norm_stderr": 0.039153454088478354
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.8181818181818182,
"acc_stderr": 0.03520893951097653,
"acc_norm": 0.8181818181818182,
"acc_norm_stderr": 0.03520893951097653
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.8333333333333334,
"acc_stderr": 0.03602814176392645,
"acc_norm": 0.8333333333333334,
"acc_norm_stderr": 0.03602814176392645
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7423312883435583,
"acc_stderr": 0.03436150827846917,
"acc_norm": 0.7423312883435583,
"acc_norm_stderr": 0.03436150827846917
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.45535714285714285,
"acc_stderr": 0.04726835553719099,
"acc_norm": 0.45535714285714285,
"acc_norm_stderr": 0.04726835553719099
},
"harness|hendrycksTest-management|5": {
"acc": 0.8349514563106796,
"acc_stderr": 0.036756688322331886,
"acc_norm": 0.8349514563106796,
"acc_norm_stderr": 0.036756688322331886
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8803418803418803,
"acc_stderr": 0.021262719400406957,
"acc_norm": 0.8803418803418803,
"acc_norm_stderr": 0.021262719400406957
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.74,
"acc_stderr": 0.04408440022768078,
"acc_norm": 0.74,
"acc_norm_stderr": 0.04408440022768078
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8020434227330779,
"acc_stderr": 0.014248873549217576,
"acc_norm": 0.8020434227330779,
"acc_norm_stderr": 0.014248873549217576
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7398843930635838,
"acc_stderr": 0.023618678310069367,
"acc_norm": 0.7398843930635838,
"acc_norm_stderr": 0.023618678310069367
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.39217877094972065,
"acc_stderr": 0.01632906107320745,
"acc_norm": 0.39217877094972065,
"acc_norm_stderr": 0.01632906107320745
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7581699346405228,
"acc_stderr": 0.024518195641879334,
"acc_norm": 0.7581699346405228,
"acc_norm_stderr": 0.024518195641879334
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7331189710610932,
"acc_stderr": 0.025122637608816646,
"acc_norm": 0.7331189710610932,
"acc_norm_stderr": 0.025122637608816646
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7808641975308642,
"acc_stderr": 0.023016705640262196,
"acc_norm": 0.7808641975308642,
"acc_norm_stderr": 0.023016705640262196
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.5070921985815603,
"acc_stderr": 0.02982449855912901,
"acc_norm": 0.5070921985815603,
"acc_norm_stderr": 0.02982449855912901
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.48891786179921776,
"acc_stderr": 0.01276709899852584,
"acc_norm": 0.48891786179921776,
"acc_norm_stderr": 0.01276709899852584
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.7389705882352942,
"acc_stderr": 0.026679252270103128,
"acc_norm": 0.7389705882352942,
"acc_norm_stderr": 0.026679252270103128
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6977124183006536,
"acc_stderr": 0.01857923271111388,
"acc_norm": 0.6977124183006536,
"acc_norm_stderr": 0.01857923271111388
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6909090909090909,
"acc_stderr": 0.044262946482000985,
"acc_norm": 0.6909090909090909,
"acc_norm_stderr": 0.044262946482000985
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7346938775510204,
"acc_stderr": 0.028263889943784596,
"acc_norm": 0.7346938775510204,
"acc_norm_stderr": 0.028263889943784596
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8557213930348259,
"acc_stderr": 0.024845753212306053,
"acc_norm": 0.8557213930348259,
"acc_norm_stderr": 0.024845753212306053
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.93,
"acc_stderr": 0.025643239997624294,
"acc_norm": 0.93,
"acc_norm_stderr": 0.025643239997624294
},
"harness|hendrycksTest-virology|5": {
"acc": 0.572289156626506,
"acc_stderr": 0.03851597683718533,
"acc_norm": 0.572289156626506,
"acc_norm_stderr": 0.03851597683718533
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.7777777777777778,
"acc_stderr": 0.03188578017686398,
"acc_norm": 0.7777777777777778,
"acc_norm_stderr": 0.03188578017686398
},
"harness|truthfulqa:mc|0": {
"mc1": 0.4883720930232558,
"mc1_stderr": 0.017498767175740088,
"mc2": 0.6263828040191523,
"mc2_stderr": 0.015723023734478345
},
"harness|winogrande|5": {
"acc": 0.8389897395422258,
"acc_stderr": 0.010329712832785722
},
"harness|gsm8k|5": {
"acc": 0.6679302501895376,
"acc_stderr": 0.012972465034361856
}
}
```
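The same aggregated numbers can also be pulled programmatically from the "results" configuration (a minimal sketch; the configuration and its "latest" split are defined in the YAML header above):
```python
from datasets import load_dataset

# The "results" configuration stores the aggregated metrics of the run.
results = load_dataset(
    "open-llm-leaderboard/details_Sao10K__Test-Instruct-Solar-v1",
    "results",
    split="latest",
)
print(results)
```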
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
zhangshuoming/math_23k_double_kernel | ---
dataset_info:
features:
- name: text
struct:
- name: asm
dtype: string
- name: c
dtype: string
- name: driver
dtype: string
splits:
- name: train
num_bytes: 21695095
num_examples: 21104
download_size: 1822925
dataset_size: 21695095
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "math_23k_double_kernel"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
aviroes/above_70yo_elderly_people_datasetV2 | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
- split: validation
path: data/validation-*
dataset_info:
features:
- name: audio
dtype:
audio:
sampling_rate: 48000
- name: sentence
dtype: string
splits:
- name: train
num_bytes: 196941356.0
num_examples: 4215
- name: test
num_bytes: 8586642.0
num_examples: 166
- name: validation
num_bytes: 4592657.0
num_examples: 100
download_size: 192899099
dataset_size: 210120655.0
---
# Dataset Card for "above_70yo_elderly_people_datasetV2"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Minglii/r_pv4_alp | ---
dataset_info:
features:
- name: data
struct:
- name: conversations
list:
- name: from
dtype: string
- name: value
dtype: string
- name: id
dtype: string
splits:
- name: train
num_bytes: 65092911
num_examples: 37291
download_size: 33664037
dataset_size: 65092911
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "r_pv4_alp"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Vitorbr2009/ds-voz-marola | ---
license: openrail
---
|
maidalun1020/CrosslingualRetrievalLawEn2Zh-qrels | ---
license: apache-2.0
configs:
- config_name: default
data_files:
- split: dev
path: data/dev-*
dataset_info:
features:
- name: qid
dtype: string
- name: pid
dtype: string
- name: score
dtype: int64
splits:
- name: dev
num_bytes: 668730
num_examples: 27458
download_size: 357897
dataset_size: 668730
---
|
mstz/house16 | ---
language:
- en
tags:
- house16
- tabular_classification
- binary_classification
pretty_name: House16
size_categories:
- 10K<n<100K
task_categories:
- tabular-classification
configs:
- house16
license: cc
---
# House16
The [House16 dataset](https://www.openml.org/search?type=data&sort=runs&id=821&status=active) from the [OpenML repository](https://www.openml.org/).
# Configurations and tasks
| **Configuration** | **Task** |
|-------------------|---------------------------|
| house16 | Binary classification |
# Usage
```python
from datasets import load_dataset
dataset = load_dataset("mstz/house16", "house16")["train"]
``` |
hp03670/hugo | ---
license: openrail
---
|
camilaslz/abarata | ---
license: openrail
---
|
Loquats/tiny_dataset | ---
dataset_info:
features:
- name: A
dtype: int64
- name: B
dtype: int64
splits:
- name: train
num_bytes: 80
num_examples: 5
download_size: 1321
dataset_size: 80
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
iamshnoo/alpaca-cleaned-swahili | ---
dataset_info:
features:
- name: input
dtype: string
- name: instruction
dtype: string
- name: output
dtype: string
splits:
- name: train
num_bytes: 32934835
num_examples: 51760
download_size: 18254346
dataset_size: 32934835
---
Translated from yahma/alpaca-cleaned using NLLB-1.3B
# Dataset Card for "alpaca-cleaned-swahili"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
cong1230/Mental_illness_chatbot_training_dataset | ---
license: mit
---
|
rescer/twitter_dataset_1713229929 | ---
dataset_info:
features:
- name: id
dtype: string
- name: tweet_content
dtype: string
- name: user_name
dtype: string
- name: user_id
dtype: string
- name: created_at
dtype: string
- name: url
dtype: string
- name: favourite_count
dtype: int64
- name: scraped_at
dtype: string
- name: image_urls
dtype: string
splits:
- name: train
num_bytes: 1654221
num_examples: 5204
download_size: 937854
dataset_size: 1654221
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
kgr123/quality_counter_2000_4_simple | ---
dataset_info:
features:
- name: context
dtype: string
- name: word
dtype: string
- name: claim
dtype: string
- name: label
dtype: int64
splits:
- name: test
num_bytes: 11253555
num_examples: 1926
- name: train
num_bytes: 11157700
num_examples: 1935
- name: validation
num_bytes: 11369448
num_examples: 1941
download_size: 7655452
dataset_size: 33780703
configs:
- config_name: default
data_files:
- split: test
path: data/test-*
- split: train
path: data/train-*
- split: validation
path: data/validation-*
---
|
vadis/sv-ident | ---
annotations_creators:
- expert-generated
language_creators:
- expert-generated
language:
- en
- de
license:
- mit
multilinguality:
- multilingual
size_categories:
- 1K<n<10K
source_datasets:
- original
task_categories:
- text-classification
task_ids:
- multi-label-classification
- semantic-similarity-classification
pretty_name: SV-Ident
paperswithcode_id: sv-ident
---
# Dataset Card for SV-Ident
## Table of Contents
- [Dataset Description](#dataset-description)
- [Dataset Summary](#dataset-summary)
- [Supported Tasks](#supported-tasks-and-leaderboards)
- [Languages](#languages)
- [Dataset Structure](#dataset-structure)
- [Data Instances](#data-instances)
- [Data Fields](#data-instances)
- [Data Splits](#data-instances)
- [Dataset Creation](#dataset-creation)
- [Curation Rationale](#curation-rationale)
- [Source Data](#source-data)
- [Annotations](#annotations)
- [Personal and Sensitive Information](#personal-and-sensitive-information)
- [Considerations for Using the Data](#considerations-for-using-the-data)
- [Social Impact of Dataset](#social-impact-of-dataset)
- [Discussion of Biases](#discussion-of-biases)
- [Other Known Limitations](#other-known-limitations)
- [Additional Information](#additional-information)
- [Dataset Curators](#dataset-curators)
- [Licensing Information](#licensing-information)
- [Citation Information](#citation-information)
## Dataset Description
- **Homepage:** https://vadis-project.github.io/sv-ident-sdp2022/
- **Repository:** https://github.com/vadis-project/sv-ident
- **Paper:** [Needs More Information]
- **Leaderboard:** [Needs More Information]
- **Point of Contact:** svident2022@googlegroups.com
### Dataset Summary
SV-Ident comprises 4,248 sentences from social science publications in English and German. It is the official data for the Shared Task: “Survey Variable Identification in Social Science Publications” (SV-Ident) 2022. Visit the homepage to find out more details about the shared task.
### Supported Tasks and Leaderboards
The dataset supports:
- **Variable Detection**: identifying whether a sentence contains a variable mention or not.
- **Variable Disambiguation**: identifying which variable from a given vocabulary is mentioned in a sentence. **NOTE**: for this task, you will also need to download the variable metadata from [here](https://bit.ly/3Nuvqdu); a loading sketch follows below.
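Loading the sentences for either task should work with the standard `datasets` API (a minimal sketch, assuming the default configuration; the variable metadata for Task 2 still has to be downloaded separately as noted above):
```python
from datasets import load_dataset

# Load the SV-Ident sentences (see the Data Splits section for split sizes).
dataset = load_dataset("vadis/sv-ident", split="train")

# Task 1 (Variable Detection): binary label in the "is_variable" column.
example = dataset[0]
print(example["sentence"], example["is_variable"])
```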
### Languages
The text in the dataset is in English and German, as written by researchers. The domain of the texts is scientific publications in the social sciences.
## Dataset Structure
### Data Instances
```
{
"sentence": "Our point, however, is that so long as downward (favorable comparisons overwhelm the potential for unfavorable comparisons, system justification should be a likely outcome amongst the disadvantaged.",
"is_variable": 1,
"variable": ["exploredata-ZA5400_VarV66", "exploredata-ZA5400_VarV53"],
"research_data": ["ZA5400"],
"doc_id": "73106",
"uuid": "b9fbb80f-3492-4b42-b9d5-0254cc33ac10",
"lang": "en",
}
```
### Data Fields
The following data fields are provided for documents:
```
`sentence`: Textual instance, which may contain a variable mention.<br />
`is_variable`: Label, whether the textual instance contains a variable mention (1) or not (0). This column can be used for Task 1 (Variable Detection).<br />
`variable`: Variables (separated by a semicolon ";") that are mentioned in the textual instance. This column can be used for Task 2 (Variable Disambiguation). Variables with the "unk" tag could not be mapped to a unique variable.<br />
`research_data`: Research data IDs (separated by a ";") that are relevant for each instance (and in general for each "doc_id").<br />
`doc_id`: ID of the source document. Each document is written in one language (either English or German).<br />
`uuid`: Unique ID of the instance in uuid4 format.<br />
`lang`: Language of the sentence.
```
The language for each document can be found in the document-language mapping file [here](https://github.com/vadis-project/sv-ident/blob/main/data/train/document_languages.json), which maps `doc_id` to a language code (`en`, `de`). The variables metadata (i.e., the vocabulary) can be downloaded from this [link](https://bit.ly/3Nuvqdu). Note that each `research_data` entry contains hundreds of variables (these can be understood as the corpus from which the most relevant variables are chosen). If a variable has an "unk" tag, it means that the sentence contains a variable that has not been disambiguated. Such sentences could be used for Task 1 and filtered out for Task 2. The metadata file has the following format:
```
{
"research_data_id_1": {
"variable_id_1": VARIABLE_METADATA,
...
"variable_id_n": VARIABLE_METADATA,
},
...
"research_data_id_n": {...},
}
```
Each variable may contain all (or some) of the following values:
```
study_title: The title of the research data study.
variable_label: The label of the variable.
variable_name: The name of the variable.
question_text: The question of the variable in the original language.
question_text_en: The question of the variable in English.
sub_question: The sub-question of the variable.
item_categories: The item categories of the variable.
answer_categories: The answers of the variable.
topic: The topics of the variable in the original language.
topic_en: The topics of the variable in English.
```
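For Task 2, one way to combine the two resources is to drop sentences whose variables could not be disambiguated and look up the remaining variable IDs in the metadata file (a minimal sketch; `variables_metadata.json` is a hypothetical local filename for the downloaded vocabulary, and the exact key format should be checked against the file):
```python
import json

from datasets import load_dataset

# Hypothetical local path to the downloaded variable metadata (vocabulary).
with open("variables_metadata.json") as f:
    metadata = json.load(f)

dataset = load_dataset("vadis/sv-ident", split="train")

# Keep only sentences with fully disambiguated variable mentions;
# "unk"-tagged sentences are usable for Task 1 but filtered out here.
disambiguated = dataset.filter(
    lambda ex: ex["is_variable"] == 1 and "unk" not in ex["variable"]
)

# Look up the metadata of the first variable mentioned in the first example.
# NOTE (assumption): the dataset's research_data and variable IDs are used
# directly as keys in the metadata file.
ex = disambiguated[0]
print(metadata[ex["research_data"][0]][ex["variable"][0]])
```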
### Data Splits
| Split | Number of sentences |
| ------------------- | ------------------------------------ |
| Train | 3,823 |
| Validation | 425 |
## Dataset Creation
### Curation Rationale
The dataset was curated by the VADIS project (https://vadis-project.github.io/).
The documents were annotated by two expert annotators.
### Source Data
#### Initial Data Collection and Normalization
The original data are available at GESIS (https://www.gesis.org/home) in an unprocessed format.
#### Who are the source language producers?
[Needs More Information]
### Annotations
#### Annotation process
[Needs More Information]
#### Who are the annotators?
The documents were annotated by two expert annotators.
### Personal and Sensitive Information
The dataset does not include personal or sensitive information.
## Considerations for Using the Data
### Social Impact of Dataset
[Needs More Information]
### Discussion of Biases
[Needs More Information]
### Other Known Limitations
[Needs More Information]
## Additional Information
### Dataset Curators
VADIS project (https://vadis-project.github.io/)
### Licensing Information
All documents originate from the Social Science Open Access Repository (SSOAR) and are licensed accordingly. The original document URLs are provided in [document_urls.json](https://github.com/vadis-project/sv-ident/blob/main/data/train/document_urls.json). For more information on licensing, please refer to the terms and conditions on the [SSOAR Grant of Licences page](https://www.gesis.org/en/ssoar/home/information/grant-of-licences).
### Citation Information
```
@inproceedings{tsereteli-etal-2022-overview,
title = "Overview of the {SV}-Ident 2022 Shared Task on Survey Variable Identification in Social Science Publications",
author = "Tsereteli, Tornike and
Kartal, Yavuz Selim and
Ponzetto, Simone Paolo and
Zielinski, Andrea and
Eckert, Kai and
Mayr, Philipp",
booktitle = "Proceedings of the Third Workshop on Scholarly Document Processing",
month = oct,
year = "2022",
address = "Gyeongju, Republic of Korea",
publisher = "Association for Computational Linguistics",
url = "https://aclanthology.org/2022.sdp-1.29",
pages = "229--246",
abstract = "In this paper, we provide an overview of the SV-Ident shared task as part of the 3rd Workshop on Scholarly Document Processing (SDP) at COLING 2022. In the shared task, participants were provided with a sentence and a vocabulary of variables, and asked to identify which variables, if any, are mentioned in individual sentences from scholarly documents in full text. Two teams made a total of 9 submissions to the shared task leaderboard. While none of the teams improve on the baseline systems, we still draw insights from their submissions. Furthermore, we provide a detailed evaluation. Data and baselines for our shared task are freely available at \url{https://github.com/vadis-project/sv-ident}.",
}
```
### Contributions
[Needs More Information] |
mask-distilled-one-sec-cv12/chunk_25 | ---
dataset_info:
features:
- name: logits
sequence: float32
- name: mfcc
sequence:
sequence: float64
splits:
- name: train
num_bytes: 839574052
num_examples: 164881
download_size: 856326270
dataset_size: 839574052
---
# Dataset Card for "chunk_25"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
DanteAl97/hindi-english-translation | ---
license: mit
---
|
aslessor/passports | ---
language:
- en
license: mit
size_categories:
- n<1K
task_categories:
- visual-question-answering
pretty_name: Passports
tags:
- kyc
- passports
dataset_info:
features:
- name: image
dtype: image
- name: label_string
sequence: string
- name: words
sequence: string
- name: labels
sequence: int64
- name: boxes
sequence:
sequence: int64
splits:
- name: train
num_bytes: 34324486.0
num_examples: 100
- name: valid
num_bytes: 2769718.0
num_examples: 9
download_size: 36565385
dataset_size: 37094204.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: valid
path: data/valid-*
---
Source - https://www.dropbox.com/s/omintwb3k2h46kk/passport_dataset.zip |
scvg/zeppelin-new | ---
license: apache-2.0
---
|
julien-c/autotrain-data-dog-classifiers | ---
task_categories:
- image-classification
---
# AutoTrain Dataset for project: dog-classifiers
## Dataset Description
This dataset has been automatically processed by AutoTrain for project dog-classifiers.
### Languages
The BCP-47 code for the dataset's language is unk.
## Dataset Structure
### Data Instances
A sample from this dataset looks as follows:
```json
[
{
"image": "<474x592 RGB PIL image>",
"target": 1
},
{
"image": "<474x296 RGB PIL image>",
"target": 1
}
]
```
### Dataset Fields
The dataset has the following fields (also called "features"):
```json
{
"image": "Image(decode=True, id=None)",
"target": "ClassLabel(num_classes=5, names=['akita inu', 'corgi', 'leonberger', 'samoyed', 'shiba inu'], id=None)"
}
```
### Dataset Splits
This dataset is split into train and validation sets. The split sizes are as follows:
| Split name | Num samples |
| ------------ | ------------------- |
| train | 598 |
| valid | 150 |
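As a quick sanity check, the dataset can be inspected with the `datasets` library (a minimal sketch, assuming the repo loads directly via `load_dataset`):
```python
from datasets import load_dataset

# Load both splits listed in the table above
dataset = load_dataset("julien-c/autotrain-data-dog-classifiers")

sample = dataset["train"][0]
print(sample["image"].size)  # PIL image dimensions, e.g. (474, 592)
print(sample["target"])      # integer class index
print(dataset["train"].features["target"].names)  # the 5 breed names
```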
|
sayakpaul/dummy-controlnet-100000-samples | ---
dataset_info:
features:
- name: image
dtype: image
- name: condtioning_image
dtype: image
- name: caption
dtype: string
splits:
- name: train
num_bytes: 157617435976.0
num_examples: 100000
download_size: 157623508466
dataset_size: 157617435976.0
---
# Dataset Card for "dummy-controlnet-100000-samples"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
liuyanchen1015/MULTI_VALUE_qqp_drop_inf_to | ---
dataset_info:
features:
- name: question1
dtype: string
- name: question2
dtype: string
- name: label
dtype: int64
- name: idx
dtype: int64
- name: value_score
dtype: int64
splits:
- name: dev
num_bytes: 1811611
num_examples: 10789
- name: test
num_bytes: 18145678
num_examples: 107361
- name: train
num_bytes: 16263832
num_examples: 96674
download_size: 22460178
dataset_size: 36221121
---
# Dataset Card for "MULTI_VALUE_qqp_drop_inf_to"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
HAERAE-HUB/KMMLU-HARD | ---
configs:
- config_name: maritime_engineering
data_files:
- split: dev
path: data/maritime_engineering-dev.csv
- split: test
path: data/maritime_engineering-hard-test.csv
- config_name: materials_engineering
data_files:
- split: dev
path: data/materials_engineering-dev.csv
- split: test
path: data/materials_engineering-hard-test.csv
- config_name: railway_and_automotive_engineering
data_files:
- split: dev
path: data/railway_and_automotive_engineering-dev.csv
- split: test
path: data/railway_and_automotive_engineering-hard-test.csv
- config_name: biology
data_files:
- split: dev
path: data/biology-dev.csv
- split: test
path: data/biology-hard-test.csv
- config_name: public_safety
data_files:
- split: dev
path: data/public_safety-dev.csv
- split: test
path: data/public_safety-hard-test.csv
- config_name: criminal_law
data_files:
- split: dev
path: data/criminal_law-dev.csv
- split: test
path: data/criminal_law-hard-test.csv
- config_name: information_technology
data_files:
- split: dev
path: data/information_technology-dev.csv
- split: test
path: data/information_technology-hard-test.csv
- config_name: geomatics
data_files:
- split: dev
path: data/geomatics-dev.csv
- split: test
path: data/geomatics-hard-test.csv
- config_name: management
data_files:
- split: dev
path: data/management-dev.csv
- split: test
path: data/management-hard-test.csv
- config_name: math
data_files:
- split: dev
path: data/math-dev.csv
- split: test
path: data/math-hard-test.csv
- config_name: accounting
data_files:
- split: dev
path: data/accounting-dev.csv
- split: test
path: data/accounting-hard-test.csv
- config_name: chemistry
data_files:
- split: dev
path: data/chemistry-dev.csv
- split: test
path: data/chemistry-hard-test.csv
- config_name: nondestructive_testing
data_files:
- split: dev
path: data/nondestructive_testing-dev.csv
- split: test
path: data/nondestructive_testing-hard-test.csv
- config_name: computer_science
data_files:
- split: dev
path: data/computer_science-dev.csv
- split: test
path: data/computer_science-hard-test.csv
- config_name: ecology
data_files:
- split: dev
path: data/ecology-dev.csv
- split: test
path: data/ecology-hard-test.csv
- config_name: health
data_files:
- split: dev
path: data/health-dev.csv
- split: test
path: data/health-hard-test.csv
- config_name: political_science_and_sociology
data_files:
- split: dev
path: data/political_science_and_sociology-dev.csv
- split: test
path: data/political_science_and_sociology-hard-test.csv
- config_name: patent
data_files:
- split: dev
path: data/patent-dev.csv
- split: test
path: data/patent-hard-test.csv
- config_name: electrical_engineering
data_files:
- split: dev
path: data/electrical_engineering-dev.csv
- split: test
path: data/electrical_engineering-hard-test.csv
- config_name: electronics_engineering
data_files:
- split: dev
path: data/electronics_engineering-dev.csv
- split: test
path: data/electronics_engineering-hard-test.csv
- config_name: korean_history
data_files:
- split: dev
path: data/korean_history-dev.csv
- split: test
path: data/korean_history-hard-test.csv
- config_name: gas_technology_and_engineering
data_files:
- split: dev
path: data/gas_technology_and_engineering-dev.csv
- split: test
path: data/gas_technology_and_engineering-hard-test.csv
- config_name: machine_design_and_manufacturing
data_files:
- split: dev
path: data/machine_design_and_manufacturing-dev.csv
- split: test
path: data/machine_design_and_manufacturing-hard-test.csv
- config_name: chemical_engineering
data_files:
- split: dev
path: data/chemical_engineering-dev.csv
- split: test
path: data/chemical_engineering-hard-test.csv
- config_name: telecommunications_and_wireless_technology
data_files:
- split: dev
path: data/telecommunications_and_wireless_technology-dev.csv
- split: test
path: data/telecommunications_and_wireless_technology-hard-test.csv
- config_name: food_processing
data_files:
- split: dev
path: data/food_processing-dev.csv
- split: test
path: data/food_processing-hard-test.csv
- config_name: social_welfare
data_files:
- split: dev
path: data/social_welfare-dev.csv
- split: test
path: data/social_welfare-hard-test.csv
- config_name: real_estate
data_files:
- split: dev
path: data/real_estate-dev.csv
- split: test
path: data/real_estate-hard-test.csv
- config_name: marketing
data_files:
- split: dev
path: data/marketing-dev.csv
- split: test
path: data/marketing-hard-test.csv
- config_name: mechanical_engineering
data_files:
- split: dev
path: data/mechanical_engineering-dev.csv
- split: test
path: data/mechanical_engineering-hard-test.csv
- config_name: fashion
data_files:
- split: dev
path: data/fashion-dev.csv
- split: test
path: data/fashion-hard-test.csv
- config_name: psychology
data_files:
- split: dev
path: data/psychology-dev.csv
- split: test
path: data/psychology-hard-test.csv
- config_name: taxation
data_files:
- split: dev
path: data/taxation-dev.csv
- split: test
path: data/taxation-hard-test.csv
- config_name: environmental_science
data_files:
- split: dev
path: data/environmental_science-dev.csv
- split: test
path: data/environmental_science-hard-test.csv
- config_name: refrigerating_machinery
data_files:
- split: dev
path: data/refrigerating_machinery-dev.csv
- split: test
path: data/refrigerating_machinery-hard-test.csv
- config_name: education
data_files:
- split: dev
path: data/education-dev.csv
- split: test
path: data/education-hard-test.csv
- config_name: industrial_engineer
data_files:
- split: dev
path: data/industrial_engineer-dev.csv
- split: test
path: data/industrial_engineer-hard-test.csv
- config_name: civil_engineering
data_files:
- split: dev
path: data/civil_engineering-dev.csv
- split: test
path: data/civil_engineering-hard-test.csv
- config_name: energy_management
data_files:
- split: dev
path: data/energy_management-dev.csv
- split: test
path: data/energy_management-hard-test.csv
- config_name: law
data_files:
- split: dev
path: data/law-dev.csv
- split: test
path: data/law-hard-test.csv
- config_name: agricultural_sciences
data_files:
- split: dev
path: data/agricultural_sciences-dev.csv
- split: test
path: data/agricultural_sciences-hard-test.csv
- config_name: interior_architecture_and_design
data_files:
- split: dev
path: data/interior_architecture_and_design-dev.csv
- split: test
path: data/interior_architecture_and_design-hard-test.csv
- config_name: aviation_engineering_and_maintenance
data_files:
- split: dev
path: data/aviation_engineering_and_maintenance-dev.csv
- split: test
path: data/aviation_engineering_and_maintenance-hard-test.csv
- config_name: construction
data_files:
- split: dev
path: data/construction-dev.csv
- split: test
path: data/construction-hard-test.csv
- config_name: economics
data_files:
- split: dev
path: data/economics-dev.csv
- split: test
path: data/economics-hard-test.csv
license: cc-by-nd-4.0
task_categories:
- question-answering
language:
- ko
tags:
- haerae
- mmlu
size_categories:
- 100K<n<1M
---
### KMMLU (Korean-MMLU)
We propose KMMLU, a new Korean benchmark with 35,030 expert-level multiple-choice questions across 45 subjects ranging from humanities to STEM.
Unlike previous Korean benchmarks that are translated from existing English benchmarks, KMMLU is collected from original Korean exams, capturing linguistic and cultural aspects of the Korean language.
We test 26 publicly available and proprietary LLMs, identifying significant room for improvement.
The best publicly available model achieves 50.54% on KMMLU, far below the average human performance of 62.6%.
This model was primarily trained for English and Chinese, not Korean.
Current LLMs tailored to Korean, such as Polyglot-Ko, perform far worse. Surprisingly, even the most capable proprietary LLMs, e.g., GPT-4 and HyperCLOVA X, achieve 59.95% and 53.40%, respectively.
This suggests that further work is needed to improve Korean LLMs, and KMMLU offers the right tool to track this progress.
We make our dataset publicly available on the Hugging Face Hub and integrate the benchmark into EleutherAI's Language Model Evaluation Harness.
Link to Paper: [KMMLU: Measuring Massive Multitask Language Understanding in Korean](https://arxiv.org/abs/2402.11548)
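Each subject is exposed as its own configuration with a `dev` and a `test` split (see the configs above). A minimal loading sketch, using the `math` configuration as an example:
```python
from datasets import load_dataset

# Load one subject; each config provides a dev split and a hard test split
dataset = load_dataset("HAERAE-HUB/KMMLU-HARD", "math")

print(dataset["dev"].num_rows, dataset["test"].num_rows)
print(dataset["test"][0])  # inspect one question record
```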
### KMMLU Statistics
| Category | # Questions |
|------------------------------|-------------|
| **Prerequisites** | |
| None | 59,909 |
| 1 Prerequisite Test | 12,316 |
| 2 Prerequisite Tests | 776 |
| 2+ Years of Experience | 65,135 |
| 4+ Years of Experience | 98,678 |
| 9+ Years of Experience | 6,963 |
| **Question Type** | |
| Positive | 207,030 |
| Negation | 36,777 |
| **Split** | |
| Train | 208,522 |
| Validation | 225 |
| Test | 35,030 |
| **Total** | 243,777 |
### Categories
To reimplement the categories in the paper, refer to the following:
```python
supercategories = {
"accounting": "HUMSS",
"agricultural_sciences": "Other",
"aviation_engineering_and_maintenance": "Applied Science",
"biology": "STEM",
"chemical_engineering": "STEM",
"chemistry": "STEM",
"civil_engineering": "STEM",
"computer_science": "STEM",
"construction": "Other",
"criminal_law": "HUMSS",
"ecology": "STEM",
"economics": "HUMSS",
"education": "HUMSS",
"electrical_engineering": "STEM",
"electronics_engineering": "Applied Science",
"energy_management": "Applied Science",
"environmental_science": "Applied Science",
"fashion": "Other",
"food_processing": "Other",
"gas_technology_and_engineering": "Applied Science",
"geomatics": "Applied Science",
"health": "Other",
"industrial_engineer": "Applied Science",
"information_technology": "STEM",
"interior_architecture_and_design": "Other",
"law": "HUMSS",
"machine_design_and_manufacturing": "Applied Science",
"management": "HUMSS",
"maritime_engineering": "Applied Science",
"marketing": "Other",
"materials_engineering": "STEM",
"mechanical_engineering": "STEM",
"nondestructive_testing": "Applied Science",
"patent": "Other",
"political_science_and_sociology": "HUMSS",
"psychology": "HUMSS",
"public_safety": "Other",
"railway_and_automotive_engineering": "Applied Science",
"real_estate": "Other",
"refrigerating_machinery": "Other",
"social_welfare": "HUMSS",
"taxation": "HUMSS",
"telecommunications_and_wireless_technology": "Applied Science",
"korean_history": "HUMSS",
"math": "STEM"
}
```
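For example, the mapping can be inverted to group the subject configs by supercategory (a small sketch; `supercategories` is the dictionary above):
```python
from collections import defaultdict

# Group subject names by their supercategory
by_supercategory = defaultdict(list)
for subject, category in supercategories.items():
    by_supercategory[category].append(subject)

for category, subjects in sorted(by_supercategory.items()):
    print(f"{category}: {len(subjects)} subjects")
```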
### Point of Contact
For any questions, contact us via the following email:
```
spthsrbwls123@yonsei.ac.kr
``` |
ms3c/swahili-common-voices-africas-talking | ---
license: lgpl-3.0
---
|
datahrvoje/twitter_dataset_1712724532 | ---
dataset_info:
features:
- name: id
dtype: string
- name: tweet_content
dtype: string
- name: user_name
dtype: string
- name: user_id
dtype: string
- name: created_at
dtype: string
- name: url
dtype: string
- name: favourite_count
dtype: int64
- name: scraped_at
dtype: string
- name: image_urls
dtype: string
splits:
- name: train
num_bytes: 18932
num_examples: 48
download_size: 13604
dataset_size: 18932
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
heliosprime/twitter_dataset_1713097528 | ---
dataset_info:
features:
- name: id
dtype: string
- name: tweet_content
dtype: string
- name: user_name
dtype: string
- name: user_id
dtype: string
- name: created_at
dtype: string
- name: url
dtype: string
- name: favourite_count
dtype: int64
- name: scraped_at
dtype: string
- name: image_urls
dtype: string
splits:
- name: train
num_bytes: 9849
num_examples: 24
download_size: 12450
dataset_size: 9849
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "twitter_dataset_1713097528"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
jtatman/jigsaw_hatebert | ---
dataset_info:
features:
- name: text
dtype: string
- name: text_masked
dtype: string
- name: text_replaced
list:
- name: score
dtype: float64
- name: sequence
dtype: string
- name: token
dtype: int64
- name: token_str
dtype: string
- name: asian
dtype: string
- name: atheist
dtype: string
- name: bisexual
dtype: string
- name: black
dtype: string
- name: buddhist
dtype: string
- name: christian
dtype: string
- name: female
dtype: string
- name: heterosexual
dtype: string
- name: hindu
dtype: string
- name: homosexual_gay_or_lesbian
dtype: string
- name: intellectual_or_learning_disability
dtype: string
- name: jewish
dtype: string
- name: latino
dtype: string
- name: male
dtype: string
- name: muslim
dtype: string
- name: other_disability
dtype: string
- name: other_gender
dtype: string
- name: other_race_or_ethnicity
dtype: string
- name: other_religion
dtype: string
- name: other_sexual_orientation
dtype: string
- name: physical_disability
dtype: string
- name: psychiatric_or_mental_illness
dtype: string
- name: transgender
dtype: string
- name: white
dtype: string
- name: funny
dtype: string
- name: wow
dtype: string
- name: sad
dtype: string
- name: likes
dtype: string
- name: disagree
dtype: string
- name: target
dtype: int64
- name: __index_level_0__
dtype: int64
splits:
- name: train
num_bytes: 236287827
num_examples: 110000
download_size: 83975623
dataset_size: 236287827
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "jigsaw_hatebert"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
kpriyanshu256/MultiTabQA-multitable_pretraining-Salesforce-codet5-base_train-latex-84000 | ---
dataset_info:
features:
- name: input_ids
sequence:
sequence: int32
- name: attention_mask
sequence:
sequence: int8
- name: labels
sequence:
sequence: int64
splits:
- name: train
num_bytes: 13336000
num_examples: 1000
download_size: 974178
dataset_size: 13336000
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
YosefLab-classes/lung_krasnow | ---
license:
- unknown
converted_from: zenodo
zenodo_id: '7904640'
---
# Dataset Card for A molecular cell atlas of the human lung from single cell RNA sequencing
## Table of Contents
- [Table of Contents](#table-of-contents)
- [Dataset Description](#dataset-description)
- [Dataset Summary](#dataset-summary)
- [Supported Tasks and Leaderboards](#supported-tasks-and-leaderboards)
- [Languages](#languages)
- [Dataset Structure](#dataset-structure)
- [Data Instances](#data-instances)
- [Data Fields](#data-fields)
- [Data Splits](#data-splits)
- [Dataset Creation](#dataset-creation)
- [Curation Rationale](#curation-rationale)
- [Source Data](#source-data)
- [Annotations](#annotations)
- [Personal and Sensitive Information](#personal-and-sensitive-information)
- [Considerations for Using the Data](#considerations-for-using-the-data)
- [Social Impact of Dataset](#social-impact-of-dataset)
- [Discussion of Biases](#discussion-of-biases)
- [Other Known Limitations](#other-known-limitations)
- [Additional Information](#additional-information)
- [Dataset Curators](#dataset-curators)
- [Licensing Information](#licensing-information)
- [Citation Information](#citation-information)
- [Contributions](#contributions)
## Dataset Description
- **Homepage:** https://zenodo.org/record/7904640
- **Repository:**
- **Paper:**
- **Leaderboard:**
- **Point of Contact:**
### Dataset Summary
https://cellxgene.cziscience.com/collections/5d445965-6f1a-4b68-ba3a-b8f765155d3a
https://www.nature.com/articles/s41586-020-2922-4
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
This dataset was shared by Travaglini et al.
### Licensing Information
The license for this dataset is https://creativecommons.org/licenses/by/4.0/legalcode
### Citation Information
```bibtex
@dataset{travaglini_et_al_2020_7904640,
author = {Travaglini et al},
title = {{A molecular cell atlas of the human lung from
single cell RNA sequencing}},
month = nov,
year = 2020,
publisher = {Zenodo},
doi = {10.5281/zenodo.7904640},
url = {https://doi.org/10.5281/zenodo.7904640}
}
```
### Contributions
[More Information Needed] |
result-kand2-sdxl-wuerst-karlo/812b079e | ---
dataset_info:
features:
- name: result
dtype: string
- name: id
dtype: int64
splits:
- name: train
num_bytes: 164
num_examples: 10
download_size: 1319
dataset_size: 164
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "812b079e"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
mii-community/UsenetArchiveIT-conversations | ---
license: apache-2.0
task_categories:
- text-generation
language:
- it
tags:
- conversations
- human
size_categories:
- 1B<n<10B
---
# Conversational Usenet Archive IT Dataset 🇮🇹
## Description
### Dataset Content
This dataset is a filtered version of the [Usenet dataset](https://huggingface.co/datasets/mrinaldi/UsenetArchiveIT) that contains posts from Italian-language newsgroups belonging to the `it` and `italia` hierarchies. The data has been archived and converted to the Parquet format for easy processing. All posts with more than one message have been grouped into conversations.
This dataset contributes to the [mii-community](https://huggingface.co/mii-community) project, aimed at advancing the creation of Italian open-source Language Models (LLMs). 🇮🇹 🤖
### Descriptive Statistics
This dataset contains 9,161,482 conversations from about 539 newsgroups, totaling about 18 GB.
### Languages
The dataset should contain only Italian-language posts, but it is possible that some posts are in other languages. The dataset has not been language filtered, as posts were expected to be in Italian.
## Dataset Structure
### Features
Each record in the dataset has the following fields:
- `title`: The title of the post.
- `id`: The unique identifier of the post.
- `original_url`: The URL of the original post on Google Groups.
- `newsgroup`: The name of the newsgroup the post belongs to.
- `messages`: An array of messages in the form `[ { 'role': 'user', 'content': '.....' }, { 'role': 'assistant', 'content': '.......' } ]` (see the sketch below).
This repo contains the dataset in the Parquet format.
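Given this schema, a single conversation can be printed turn by turn (a minimal sketch, assuming the default configuration; field names follow the feature list above):
```python
from datasets import load_dataset

# Load the train split and render one conversation
dataset = load_dataset("mii-community/UsenetArchiveIT-conversations", split="train")

conversation = dataset[0]
print(conversation["title"], "/", conversation["newsgroup"])
for message in conversation["messages"]:
    print(f"[{message['role']}] {message['content']}")
```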
## Additional Information
### Dataset Curators
This dataset was curated by Hugging Face user [giux78](https://huggingface.co/giux78) and is simply a filtered and grouped version of the [Usenet dataset](https://huggingface.co/datasets/mrinaldi/UsenetArchiveIT) released by
[manalog](https://huggingface.co/manalog) and [ruggsea](https://huggingface.co/ruggsea), as part of the [mii-community](https://huggingface.co/mii-community) dataset creation effort.
### Dataset rationale
The dataset was created as part of a bigger effort to create various high-quality datasets of native Italian text, with the aim of aiding the development of Italian open-source LLMs.
## Usage
You can load the dataset directly with the `datasets` library using the `load_dataset` function. Here's an example:
```python
from datasets import load_dataset
dataset = load_dataset("mii-community/UsenetArchiveIT-conversations")
``` |
hynky/contract-summ | ---
dataset_info:
features:
- name: text
dtype: string
- name: summary
dtype: string
- name: instruction
dtype: string
- name: __index_level_0__
dtype: int64
splits:
- name: train
num_bytes: 1052306
num_examples: 749
download_size: 548616
dataset_size: 1052306
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "contract-summ"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
carnival13/xlmr_int_hard_curr_trn_ep2 | ---
dataset_info:
features:
- name: domain_label
dtype: int64
- name: pass_label
dtype: int64
- name: input
dtype: string
- name: input_ids
sequence: int32
- name: attention_mask
sequence: int8
splits:
- name: train
num_bytes: 284672773
num_examples: 226100
download_size: 80604529
dataset_size: 284672773
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "xlmr_int_hard_curr_trn_ep2"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
beanham/medsum | ---
task_categories:
- summarization
language:
- en
tags:
- medical
size_categories:
- 1K<n<10K
---
This dataset comes from the EACL 2023 paper: An Empirical Study of Clinical Note Generation from Doctor-Patient Encounters
https://github.com/abachaa/MTS-Dialog/tree/main/Main-Dataset |
LazarusNLP/multilingual-NLI-26lang-2mil7-id | ---
dataset_info:
features:
- name: premise_original
dtype: string
- name: hypothesis_original
dtype: string
- name: label
dtype: int64
- name: premise
dtype: string
- name: hypothesis
dtype: string
- name: subset
dtype: string
splits:
- name: train
num_bytes: 56437736
num_examples: 105000
download_size: 35187813
dataset_size: 56437736
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
charchits7/horse2zebra | ---
license: apache-2.0
task_categories:
- image-to-image
language:
- en
--- |
Cloud-Ron/regularization | ---
tags:
- not-for-all-audiences
--- |
dongyoung4091/hh-rlhf_with_features_rx_add | ---
dataset_info:
features:
- name: chosen
dtype: string
- name: rejected
dtype: string
- name: helpfulness_chosen
dtype: int64
- name: helpfulness_rejected
dtype: int64
- name: specificity_chosen
dtype: int64
- name: specificity_rejected
dtype: int64
- name: intent_chosen
dtype: int64
- name: intent_rejected
dtype: int64
- name: factuality_chosen
dtype: int64
- name: factuality_rejected
dtype: int64
- name: easy-to-understand_chosen
dtype: int64
- name: easy-to-understand_rejected
dtype: int64
- name: relevance_chosen
dtype: int64
- name: relevance_rejected
dtype: int64
- name: readability_chosen
dtype: int64
- name: readability_rejected
dtype: int64
- name: enough-detail_chosen
dtype: int64
- name: enough-detail_rejected
dtype: int64
- name: biased:_chosen
dtype: int64
- name: biased:_rejected
dtype: int64
- name: fail-to-consider-individual-preferences_chosen
dtype: int64
- name: fail-to-consider-individual-preferences_rejected
dtype: int64
- name: repetetive_chosen
dtype: int64
- name: repetetive_rejected
dtype: int64
- name: fail-to-consider-context_chosen
dtype: int64
- name: fail-to-consider-context_rejected
dtype: int64
- name: too-long_chosen
dtype: int64
- name: too-long_rejected
dtype: int64
- name: model_A_chosen
dtype: float64
- name: model_A_rejected
dtype: float64
- name: model_B_chosen
dtype: float64
- name: model_B_rejected
dtype: float64
- name: external_rm1_chosen
dtype: float64
- name: external_rm1_rejected
dtype: float64
splits:
- name: train
num_bytes: 9039445
num_examples: 9574
- name: test
num_bytes: 9010732
num_examples: 9574
download_size: 9248458
dataset_size: 18050177
---
# Dataset Card for "hh-rlhf_with_features_rx_add"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
JCAI2000/LargerImagesLabelled | ---
dataset_info:
features:
- name: pixel_values
dtype: image
- name: label
dtype: image
splits:
- name: train
num_bytes: 513933217.0
num_examples: 42
download_size: 182096737
dataset_size: 513933217.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "LargerImagesLabelled"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
vwxyzjn/summarize_from_feedback_tldr_3_filtered_oai_preprocessing_pythia | ---
dataset_info:
features:
- name: id
dtype: string
- name: subreddit
dtype: string
- name: title
dtype: string
- name: post
dtype: string
- name: summary
dtype: string
- name: query_token
sequence: int64
- name: query
dtype: string
- name: reference_response
dtype: string
- name: reference_response_token
sequence: int64
splits:
- name: train
num_bytes: 886229543
num_examples: 116722
- name: validation
num_bytes: 48966797
num_examples: 6447
- name: test
num_bytes: 49800881
num_examples: 6553
download_size: 338995010
dataset_size: 984997221
---
# Dataset Card for "summarize_from_feedback_tldr_3_filtered_oai_preprocessing_pythia"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
open-llm-leaderboard/details_huggyllama__llama-30b | ---
pretty_name: Evaluation run of huggyllama/llama-30b
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [huggyllama/llama-30b](https://huggingface.co/huggyllama/llama-30b) on the [Open\
\ LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 122 configurations, each one corresponding to one of\
\ the evaluated task.\n\nThe dataset has been created from 4 run(s). Each run can\
\ be found as a specific split in each configuration, the split being named using\
\ the timestamp of the run.The \"train\" split is always pointing to the latest\
\ results.\n\nAn additional configuration \"results\" stores all the aggregated results\
\ of the run (and is used to compute and display the aggregated metrics on the [Open\
\ LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_huggyllama__llama-30b\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-09-16T23:44:55.901768](https://huggingface.co/datasets/open-llm-leaderboard/details_huggyllama__llama-30b/blob/main/results_2023-09-16T23-44-55.901768.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.0014681208053691276,\n\
\ \"em_stderr\": 0.0003921042190298701,\n \"f1\": 0.06332634228187943,\n\
\ \"f1_stderr\": 0.0013742294190200051,\n \"acc\": 0.47445656434133393,\n\
\ \"acc_stderr\": 0.010516415781576863\n },\n \"harness|drop|3\": {\n\
\ \"em\": 0.0014681208053691276,\n \"em_stderr\": 0.0003921042190298701,\n\
\ \"f1\": 0.06332634228187943,\n \"f1_stderr\": 0.0013742294190200051\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.14859742228961334,\n \
\ \"acc_stderr\": 0.009797503180527876\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.8003157063930545,\n \"acc_stderr\": 0.011235328382625849\n\
\ }\n}\n```"
repo_url: https://huggingface.co/huggyllama/llama-30b
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_08_19T23_03_51.753289
path:
- '**/details_harness|arc:challenge|25_2023-08-19T23:03:51.753289.parquet'
- split: 2023_08_23T17_40_29.405074
path:
- '**/details_harness|arc:challenge|25_2023-08-23T17:40:29.405074.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-08-23T17:40:29.405074.parquet'
- config_name: harness_drop_3
data_files:
- split: 2023_09_16T23_44_55.901768
path:
- '**/details_harness|drop|3_2023-09-16T23-44-55.901768.parquet'
- split: latest
path:
- '**/details_harness|drop|3_2023-09-16T23-44-55.901768.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_09_16T23_44_55.901768
path:
- '**/details_harness|gsm8k|5_2023-09-16T23-44-55.901768.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-09-16T23-44-55.901768.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_08_19T23_03_51.753289
path:
- '**/details_harness|hellaswag|10_2023-08-19T23:03:51.753289.parquet'
- split: 2023_08_23T17_40_29.405074
path:
- '**/details_harness|hellaswag|10_2023-08-23T17:40:29.405074.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-08-23T17:40:29.405074.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_08_19T23_03_51.753289
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-19T23:03:51.753289.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-19T23:03:51.753289.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-19T23:03:51.753289.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-19T23:03:51.753289.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-19T23:03:51.753289.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-19T23:03:51.753289.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-19T23:03:51.753289.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-19T23:03:51.753289.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-19T23:03:51.753289.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-19T23:03:51.753289.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-19T23:03:51.753289.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-19T23:03:51.753289.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-19T23:03:51.753289.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-19T23:03:51.753289.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-19T23:03:51.753289.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-19T23:03:51.753289.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-19T23:03:51.753289.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-19T23:03:51.753289.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-19T23:03:51.753289.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-19T23:03:51.753289.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-19T23:03:51.753289.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-19T23:03:51.753289.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-19T23:03:51.753289.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-19T23:03:51.753289.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-19T23:03:51.753289.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-19T23:03:51.753289.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-19T23:03:51.753289.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-19T23:03:51.753289.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-19T23:03:51.753289.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-19T23:03:51.753289.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-19T23:03:51.753289.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-19T23:03:51.753289.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-19T23:03:51.753289.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-19T23:03:51.753289.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-19T23:03:51.753289.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-19T23:03:51.753289.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-19T23:03:51.753289.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-19T23:03:51.753289.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-19T23:03:51.753289.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-19T23:03:51.753289.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-19T23:03:51.753289.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-19T23:03:51.753289.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-19T23:03:51.753289.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-19T23:03:51.753289.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-19T23:03:51.753289.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-19T23:03:51.753289.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-19T23:03:51.753289.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-19T23:03:51.753289.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-19T23:03:51.753289.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-19T23:03:51.753289.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-19T23:03:51.753289.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-19T23:03:51.753289.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-19T23:03:51.753289.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-19T23:03:51.753289.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-19T23:03:51.753289.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-19T23:03:51.753289.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-19T23:03:51.753289.parquet'
- split: 2023_08_23T17_40_29.405074
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-23T17:40:29.405074.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-23T17:40:29.405074.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-23T17:40:29.405074.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-23T17:40:29.405074.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-23T17:40:29.405074.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-23T17:40:29.405074.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-23T17:40:29.405074.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-23T17:40:29.405074.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-23T17:40:29.405074.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-23T17:40:29.405074.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-23T17:40:29.405074.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-23T17:40:29.405074.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-23T17:40:29.405074.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-23T17:40:29.405074.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-23T17:40:29.405074.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-23T17:40:29.405074.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-23T17:40:29.405074.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-23T17:40:29.405074.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-23T17:40:29.405074.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-23T17:40:29.405074.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-23T17:40:29.405074.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-23T17:40:29.405074.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-23T17:40:29.405074.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-23T17:40:29.405074.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-23T17:40:29.405074.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-23T17:40:29.405074.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-23T17:40:29.405074.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-23T17:40:29.405074.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-23T17:40:29.405074.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-23T17:40:29.405074.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-23T17:40:29.405074.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-23T17:40:29.405074.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-23T17:40:29.405074.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-23T17:40:29.405074.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-23T17:40:29.405074.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-23T17:40:29.405074.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-23T17:40:29.405074.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-23T17:40:29.405074.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-23T17:40:29.405074.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-23T17:40:29.405074.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-23T17:40:29.405074.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-23T17:40:29.405074.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-23T17:40:29.405074.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-23T17:40:29.405074.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-23T17:40:29.405074.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-23T17:40:29.405074.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-23T17:40:29.405074.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-23T17:40:29.405074.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-23T17:40:29.405074.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-23T17:40:29.405074.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-23T17:40:29.405074.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-23T17:40:29.405074.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-23T17:40:29.405074.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-23T17:40:29.405074.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-23T17:40:29.405074.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-23T17:40:29.405074.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-23T17:40:29.405074.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-23T17:40:29.405074.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-23T17:40:29.405074.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-23T17:40:29.405074.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-23T17:40:29.405074.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-23T17:40:29.405074.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-23T17:40:29.405074.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-23T17:40:29.405074.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-23T17:40:29.405074.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-23T17:40:29.405074.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-23T17:40:29.405074.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-23T17:40:29.405074.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-23T17:40:29.405074.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-23T17:40:29.405074.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-23T17:40:29.405074.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-23T17:40:29.405074.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-23T17:40:29.405074.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-23T17:40:29.405074.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-23T17:40:29.405074.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-23T17:40:29.405074.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-23T17:40:29.405074.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-23T17:40:29.405074.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-23T17:40:29.405074.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-23T17:40:29.405074.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-23T17:40:29.405074.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-23T17:40:29.405074.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-23T17:40:29.405074.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-23T17:40:29.405074.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-23T17:40:29.405074.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-23T17:40:29.405074.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-23T17:40:29.405074.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-23T17:40:29.405074.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-23T17:40:29.405074.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-23T17:40:29.405074.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-23T17:40:29.405074.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-23T17:40:29.405074.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-23T17:40:29.405074.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-23T17:40:29.405074.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-23T17:40:29.405074.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-23T17:40:29.405074.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-23T17:40:29.405074.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-23T17:40:29.405074.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-23T17:40:29.405074.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-23T17:40:29.405074.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-23T17:40:29.405074.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-23T17:40:29.405074.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-23T17:40:29.405074.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-23T17:40:29.405074.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-23T17:40:29.405074.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-23T17:40:29.405074.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-23T17:40:29.405074.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-23T17:40:29.405074.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-23T17:40:29.405074.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-23T17:40:29.405074.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-23T17:40:29.405074.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-23T17:40:29.405074.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-23T17:40:29.405074.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-23T17:40:29.405074.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_08_19T23_03_51.753289
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-19T23:03:51.753289.parquet'
- split: 2023_08_23T17_40_29.405074
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-23T17:40:29.405074.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-23T17:40:29.405074.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_08_19T23_03_51.753289
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-19T23:03:51.753289.parquet'
- split: 2023_08_23T17_40_29.405074
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-23T17:40:29.405074.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-23T17:40:29.405074.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_08_19T23_03_51.753289
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-19T23:03:51.753289.parquet'
- split: 2023_08_23T17_40_29.405074
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-23T17:40:29.405074.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-23T17:40:29.405074.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_08_19T23_03_51.753289
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-19T23:03:51.753289.parquet'
- split: 2023_08_23T17_40_29.405074
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-23T17:40:29.405074.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-23T17:40:29.405074.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_08_19T23_03_51.753289
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-19T23:03:51.753289.parquet'
- split: 2023_08_23T17_40_29.405074
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-23T17:40:29.405074.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-23T17:40:29.405074.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_08_19T23_03_51.753289
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-19T23:03:51.753289.parquet'
- split: 2023_08_23T17_40_29.405074
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-23T17:40:29.405074.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-23T17:40:29.405074.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_08_19T23_03_51.753289
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-19T23:03:51.753289.parquet'
- split: 2023_08_23T17_40_29.405074
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-23T17:40:29.405074.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-23T17:40:29.405074.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_08_19T23_03_51.753289
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-19T23:03:51.753289.parquet'
- split: 2023_08_23T17_40_29.405074
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-23T17:40:29.405074.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-23T17:40:29.405074.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_08_19T23_03_51.753289
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-19T23:03:51.753289.parquet'
- split: 2023_08_23T17_40_29.405074
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-23T17:40:29.405074.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-23T17:40:29.405074.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_08_19T23_03_51.753289
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-19T23:03:51.753289.parquet'
- split: 2023_08_23T17_40_29.405074
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-23T17:40:29.405074.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-23T17:40:29.405074.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_08_19T23_03_51.753289
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-19T23:03:51.753289.parquet'
- split: 2023_08_23T17_40_29.405074
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-23T17:40:29.405074.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-23T17:40:29.405074.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_08_19T23_03_51.753289
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-19T23:03:51.753289.parquet'
- split: 2023_08_23T17_40_29.405074
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-23T17:40:29.405074.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-23T17:40:29.405074.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_08_19T23_03_51.753289
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-19T23:03:51.753289.parquet'
- split: 2023_08_23T17_40_29.405074
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-23T17:40:29.405074.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-23T17:40:29.405074.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_08_19T23_03_51.753289
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-19T23:03:51.753289.parquet'
- split: 2023_08_23T17_40_29.405074
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-23T17:40:29.405074.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-23T17:40:29.405074.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_08_19T23_03_51.753289
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-19T23:03:51.753289.parquet'
- split: 2023_08_23T17_40_29.405074
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-23T17:40:29.405074.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-23T17:40:29.405074.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_08_19T23_03_51.753289
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-19T23:03:51.753289.parquet'
- split: 2023_08_23T17_40_29.405074
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-23T17:40:29.405074.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-23T17:40:29.405074.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_08_19T23_03_51.753289
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-19T23:03:51.753289.parquet'
- split: 2023_08_23T17_40_29.405074
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-23T17:40:29.405074.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-23T17:40:29.405074.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_08_19T23_03_51.753289
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-19T23:03:51.753289.parquet'
- split: 2023_08_23T17_40_29.405074
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-23T17:40:29.405074.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-23T17:40:29.405074.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_08_19T23_03_51.753289
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-19T23:03:51.753289.parquet'
- split: 2023_08_23T17_40_29.405074
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-23T17:40:29.405074.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-23T17:40:29.405074.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_08_19T23_03_51.753289
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-19T23:03:51.753289.parquet'
- split: 2023_08_23T17_40_29.405074
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-23T17:40:29.405074.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-23T17:40:29.405074.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_08_19T23_03_51.753289
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-19T23:03:51.753289.parquet'
- split: 2023_08_23T17_40_29.405074
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-23T17:40:29.405074.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-23T17:40:29.405074.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_08_19T23_03_51.753289
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-19T23:03:51.753289.parquet'
- split: 2023_08_23T17_40_29.405074
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-23T17:40:29.405074.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-23T17:40:29.405074.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_08_19T23_03_51.753289
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-19T23:03:51.753289.parquet'
- split: 2023_08_23T17_40_29.405074
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-23T17:40:29.405074.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-23T17:40:29.405074.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_08_19T23_03_51.753289
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-19T23:03:51.753289.parquet'
- split: 2023_08_23T17_40_29.405074
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-23T17:40:29.405074.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-23T17:40:29.405074.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_08_19T23_03_51.753289
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-19T23:03:51.753289.parquet'
- split: 2023_08_23T17_40_29.405074
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-23T17:40:29.405074.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-23T17:40:29.405074.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_08_19T23_03_51.753289
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-19T23:03:51.753289.parquet'
- split: 2023_08_23T17_40_29.405074
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-23T17:40:29.405074.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-23T17:40:29.405074.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_08_19T23_03_51.753289
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-19T23:03:51.753289.parquet'
- split: 2023_08_23T17_40_29.405074
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-23T17:40:29.405074.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-23T17:40:29.405074.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_08_19T23_03_51.753289
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-19T23:03:51.753289.parquet'
- split: 2023_08_23T17_40_29.405074
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-23T17:40:29.405074.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-23T17:40:29.405074.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_08_19T23_03_51.753289
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-19T23:03:51.753289.parquet'
- split: 2023_08_23T17_40_29.405074
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-23T17:40:29.405074.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-23T17:40:29.405074.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_08_19T23_03_51.753289
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-19T23:03:51.753289.parquet'
- split: 2023_08_23T17_40_29.405074
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-23T17:40:29.405074.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-23T17:40:29.405074.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_08_19T23_03_51.753289
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-19T23:03:51.753289.parquet'
- split: 2023_08_23T17_40_29.405074
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-23T17:40:29.405074.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-23T17:40:29.405074.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_08_19T23_03_51.753289
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-19T23:03:51.753289.parquet'
- split: 2023_08_23T17_40_29.405074
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-23T17:40:29.405074.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-23T17:40:29.405074.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_08_19T23_03_51.753289
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-19T23:03:51.753289.parquet'
- split: 2023_08_23T17_40_29.405074
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-23T17:40:29.405074.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-23T17:40:29.405074.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_08_19T23_03_51.753289
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-19T23:03:51.753289.parquet'
- split: 2023_08_23T17_40_29.405074
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-23T17:40:29.405074.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-23T17:40:29.405074.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_08_19T23_03_51.753289
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-19T23:03:51.753289.parquet'
- split: 2023_08_23T17_40_29.405074
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-23T17:40:29.405074.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-23T17:40:29.405074.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_08_19T23_03_51.753289
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-19T23:03:51.753289.parquet'
- split: 2023_08_23T17_40_29.405074
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-23T17:40:29.405074.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-23T17:40:29.405074.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_08_19T23_03_51.753289
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-19T23:03:51.753289.parquet'
- split: 2023_08_23T17_40_29.405074
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-23T17:40:29.405074.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-23T17:40:29.405074.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_08_19T23_03_51.753289
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-19T23:03:51.753289.parquet'
- split: 2023_08_23T17_40_29.405074
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-23T17:40:29.405074.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-23T17:40:29.405074.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_08_19T23_03_51.753289
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-19T23:03:51.753289.parquet'
- split: 2023_08_23T17_40_29.405074
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-23T17:40:29.405074.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-23T17:40:29.405074.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_08_19T23_03_51.753289
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-19T23:03:51.753289.parquet'
- split: 2023_08_23T17_40_29.405074
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-23T17:40:29.405074.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-23T17:40:29.405074.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_08_19T23_03_51.753289
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-19T23:03:51.753289.parquet'
- split: 2023_08_23T17_40_29.405074
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-23T17:40:29.405074.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-23T17:40:29.405074.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_08_19T23_03_51.753289
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-19T23:03:51.753289.parquet'
- split: 2023_08_23T17_40_29.405074
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-23T17:40:29.405074.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-23T17:40:29.405074.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_08_19T23_03_51.753289
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-19T23:03:51.753289.parquet'
- split: 2023_08_23T17_40_29.405074
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-23T17:40:29.405074.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-23T17:40:29.405074.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_08_19T23_03_51.753289
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-19T23:03:51.753289.parquet'
- split: 2023_08_23T17_40_29.405074
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-23T17:40:29.405074.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-23T17:40:29.405074.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_08_19T23_03_51.753289
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-19T23:03:51.753289.parquet'
- split: 2023_08_23T17_40_29.405074
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-23T17:40:29.405074.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-23T17:40:29.405074.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_08_19T23_03_51.753289
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-19T23:03:51.753289.parquet'
- split: 2023_08_23T17_40_29.405074
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-23T17:40:29.405074.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-23T17:40:29.405074.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_08_19T23_03_51.753289
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-19T23:03:51.753289.parquet'
- split: 2023_08_23T17_40_29.405074
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-23T17:40:29.405074.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-23T17:40:29.405074.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_08_19T23_03_51.753289
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-19T23:03:51.753289.parquet'
- split: 2023_08_23T17_40_29.405074
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-23T17:40:29.405074.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-23T17:40:29.405074.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_08_19T23_03_51.753289
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-19T23:03:51.753289.parquet'
- split: 2023_08_23T17_40_29.405074
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-23T17:40:29.405074.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-23T17:40:29.405074.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_08_19T23_03_51.753289
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-19T23:03:51.753289.parquet'
- split: 2023_08_23T17_40_29.405074
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-23T17:40:29.405074.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-23T17:40:29.405074.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_08_19T23_03_51.753289
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-19T23:03:51.753289.parquet'
- split: 2023_08_23T17_40_29.405074
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-23T17:40:29.405074.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-23T17:40:29.405074.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_08_19T23_03_51.753289
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-19T23:03:51.753289.parquet'
- split: 2023_08_23T17_40_29.405074
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-23T17:40:29.405074.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-23T17:40:29.405074.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_08_19T23_03_51.753289
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-19T23:03:51.753289.parquet'
- split: 2023_08_23T17_40_29.405074
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-23T17:40:29.405074.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-23T17:40:29.405074.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_08_19T23_03_51.753289
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-19T23:03:51.753289.parquet'
- split: 2023_08_23T17_40_29.405074
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-23T17:40:29.405074.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-23T17:40:29.405074.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_08_19T23_03_51.753289
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-19T23:03:51.753289.parquet'
- split: 2023_08_23T17_40_29.405074
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-23T17:40:29.405074.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-23T17:40:29.405074.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_08_19T23_03_51.753289
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-19T23:03:51.753289.parquet'
- split: 2023_08_23T17_40_29.405074
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-23T17:40:29.405074.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-23T17:40:29.405074.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_08_19T23_03_51.753289
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-19T23:03:51.753289.parquet'
- split: 2023_08_23T17_40_29.405074
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-23T17:40:29.405074.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-23T17:40:29.405074.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_08_19T23_03_51.753289
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-19T23:03:51.753289.parquet'
- split: 2023_08_23T17_40_29.405074
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-23T17:40:29.405074.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-23T17:40:29.405074.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_09_16T23_44_55.901768
path:
- '**/details_harness|winogrande|5_2023-09-16T23-44-55.901768.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-09-16T23-44-55.901768.parquet'
- config_name: original_mmlu_5
data_files:
- split: 2023_08_28T20_06_09.731721
path:
- '**/details_original|mmlu:abstract_algebra|5_2023-08-28T20:06:09.731721.parquet'
- '**/details_original|mmlu:anatomy|5_2023-08-28T20:06:09.731721.parquet'
- '**/details_original|mmlu:astronomy|5_2023-08-28T20:06:09.731721.parquet'
- '**/details_original|mmlu:business_ethics|5_2023-08-28T20:06:09.731721.parquet'
- '**/details_original|mmlu:clinical_knowledge|5_2023-08-28T20:06:09.731721.parquet'
- '**/details_original|mmlu:college_biology|5_2023-08-28T20:06:09.731721.parquet'
- '**/details_original|mmlu:college_chemistry|5_2023-08-28T20:06:09.731721.parquet'
- '**/details_original|mmlu:college_computer_science|5_2023-08-28T20:06:09.731721.parquet'
- '**/details_original|mmlu:college_mathematics|5_2023-08-28T20:06:09.731721.parquet'
- '**/details_original|mmlu:college_medicine|5_2023-08-28T20:06:09.731721.parquet'
- '**/details_original|mmlu:college_physics|5_2023-08-28T20:06:09.731721.parquet'
- '**/details_original|mmlu:computer_security|5_2023-08-28T20:06:09.731721.parquet'
- '**/details_original|mmlu:conceptual_physics|5_2023-08-28T20:06:09.731721.parquet'
- '**/details_original|mmlu:econometrics|5_2023-08-28T20:06:09.731721.parquet'
- '**/details_original|mmlu:electrical_engineering|5_2023-08-28T20:06:09.731721.parquet'
- '**/details_original|mmlu:elementary_mathematics|5_2023-08-28T20:06:09.731721.parquet'
- '**/details_original|mmlu:formal_logic|5_2023-08-28T20:06:09.731721.parquet'
- '**/details_original|mmlu:global_facts|5_2023-08-28T20:06:09.731721.parquet'
- '**/details_original|mmlu:high_school_biology|5_2023-08-28T20:06:09.731721.parquet'
- '**/details_original|mmlu:high_school_chemistry|5_2023-08-28T20:06:09.731721.parquet'
- '**/details_original|mmlu:high_school_computer_science|5_2023-08-28T20:06:09.731721.parquet'
- '**/details_original|mmlu:high_school_european_history|5_2023-08-28T20:06:09.731721.parquet'
- '**/details_original|mmlu:high_school_geography|5_2023-08-28T20:06:09.731721.parquet'
- '**/details_original|mmlu:high_school_government_and_politics|5_2023-08-28T20:06:09.731721.parquet'
- '**/details_original|mmlu:high_school_macroeconomics|5_2023-08-28T20:06:09.731721.parquet'
- '**/details_original|mmlu:high_school_mathematics|5_2023-08-28T20:06:09.731721.parquet'
- '**/details_original|mmlu:high_school_microeconomics|5_2023-08-28T20:06:09.731721.parquet'
- '**/details_original|mmlu:high_school_physics|5_2023-08-28T20:06:09.731721.parquet'
- '**/details_original|mmlu:high_school_psychology|5_2023-08-28T20:06:09.731721.parquet'
- '**/details_original|mmlu:high_school_statistics|5_2023-08-28T20:06:09.731721.parquet'
- '**/details_original|mmlu:high_school_us_history|5_2023-08-28T20:06:09.731721.parquet'
- '**/details_original|mmlu:high_school_world_history|5_2023-08-28T20:06:09.731721.parquet'
- '**/details_original|mmlu:human_aging|5_2023-08-28T20:06:09.731721.parquet'
- '**/details_original|mmlu:human_sexuality|5_2023-08-28T20:06:09.731721.parquet'
- '**/details_original|mmlu:international_law|5_2023-08-28T20:06:09.731721.parquet'
- '**/details_original|mmlu:jurisprudence|5_2023-08-28T20:06:09.731721.parquet'
- '**/details_original|mmlu:logical_fallacies|5_2023-08-28T20:06:09.731721.parquet'
- '**/details_original|mmlu:machine_learning|5_2023-08-28T20:06:09.731721.parquet'
- '**/details_original|mmlu:management|5_2023-08-28T20:06:09.731721.parquet'
- '**/details_original|mmlu:marketing|5_2023-08-28T20:06:09.731721.parquet'
- '**/details_original|mmlu:medical_genetics|5_2023-08-28T20:06:09.731721.parquet'
- '**/details_original|mmlu:miscellaneous|5_2023-08-28T20:06:09.731721.parquet'
- '**/details_original|mmlu:moral_disputes|5_2023-08-28T20:06:09.731721.parquet'
- '**/details_original|mmlu:moral_scenarios|5_2023-08-28T20:06:09.731721.parquet'
- '**/details_original|mmlu:nutrition|5_2023-08-28T20:06:09.731721.parquet'
- '**/details_original|mmlu:philosophy|5_2023-08-28T20:06:09.731721.parquet'
- '**/details_original|mmlu:prehistory|5_2023-08-28T20:06:09.731721.parquet'
- '**/details_original|mmlu:professional_accounting|5_2023-08-28T20:06:09.731721.parquet'
- '**/details_original|mmlu:professional_law|5_2023-08-28T20:06:09.731721.parquet'
- '**/details_original|mmlu:professional_medicine|5_2023-08-28T20:06:09.731721.parquet'
- '**/details_original|mmlu:professional_psychology|5_2023-08-28T20:06:09.731721.parquet'
- '**/details_original|mmlu:public_relations|5_2023-08-28T20:06:09.731721.parquet'
- '**/details_original|mmlu:security_studies|5_2023-08-28T20:06:09.731721.parquet'
- '**/details_original|mmlu:sociology|5_2023-08-28T20:06:09.731721.parquet'
- '**/details_original|mmlu:us_foreign_policy|5_2023-08-28T20:06:09.731721.parquet'
- '**/details_original|mmlu:virology|5_2023-08-28T20:06:09.731721.parquet'
- '**/details_original|mmlu:world_religions|5_2023-08-28T20:06:09.731721.parquet'
- split: latest
path:
- '**/details_original|mmlu:abstract_algebra|5_2023-08-28T20:06:09.731721.parquet'
- '**/details_original|mmlu:anatomy|5_2023-08-28T20:06:09.731721.parquet'
- '**/details_original|mmlu:astronomy|5_2023-08-28T20:06:09.731721.parquet'
- '**/details_original|mmlu:business_ethics|5_2023-08-28T20:06:09.731721.parquet'
- '**/details_original|mmlu:clinical_knowledge|5_2023-08-28T20:06:09.731721.parquet'
- '**/details_original|mmlu:college_biology|5_2023-08-28T20:06:09.731721.parquet'
- '**/details_original|mmlu:college_chemistry|5_2023-08-28T20:06:09.731721.parquet'
- '**/details_original|mmlu:college_computer_science|5_2023-08-28T20:06:09.731721.parquet'
- '**/details_original|mmlu:college_mathematics|5_2023-08-28T20:06:09.731721.parquet'
- '**/details_original|mmlu:college_medicine|5_2023-08-28T20:06:09.731721.parquet'
- '**/details_original|mmlu:college_physics|5_2023-08-28T20:06:09.731721.parquet'
- '**/details_original|mmlu:computer_security|5_2023-08-28T20:06:09.731721.parquet'
- '**/details_original|mmlu:conceptual_physics|5_2023-08-28T20:06:09.731721.parquet'
- '**/details_original|mmlu:econometrics|5_2023-08-28T20:06:09.731721.parquet'
- '**/details_original|mmlu:electrical_engineering|5_2023-08-28T20:06:09.731721.parquet'
- '**/details_original|mmlu:elementary_mathematics|5_2023-08-28T20:06:09.731721.parquet'
- '**/details_original|mmlu:formal_logic|5_2023-08-28T20:06:09.731721.parquet'
- '**/details_original|mmlu:global_facts|5_2023-08-28T20:06:09.731721.parquet'
- '**/details_original|mmlu:high_school_biology|5_2023-08-28T20:06:09.731721.parquet'
- '**/details_original|mmlu:high_school_chemistry|5_2023-08-28T20:06:09.731721.parquet'
- '**/details_original|mmlu:high_school_computer_science|5_2023-08-28T20:06:09.731721.parquet'
- '**/details_original|mmlu:high_school_european_history|5_2023-08-28T20:06:09.731721.parquet'
- '**/details_original|mmlu:high_school_geography|5_2023-08-28T20:06:09.731721.parquet'
- '**/details_original|mmlu:high_school_government_and_politics|5_2023-08-28T20:06:09.731721.parquet'
- '**/details_original|mmlu:high_school_macroeconomics|5_2023-08-28T20:06:09.731721.parquet'
- '**/details_original|mmlu:high_school_mathematics|5_2023-08-28T20:06:09.731721.parquet'
- '**/details_original|mmlu:high_school_microeconomics|5_2023-08-28T20:06:09.731721.parquet'
- '**/details_original|mmlu:high_school_physics|5_2023-08-28T20:06:09.731721.parquet'
- '**/details_original|mmlu:high_school_psychology|5_2023-08-28T20:06:09.731721.parquet'
- '**/details_original|mmlu:high_school_statistics|5_2023-08-28T20:06:09.731721.parquet'
- '**/details_original|mmlu:high_school_us_history|5_2023-08-28T20:06:09.731721.parquet'
- '**/details_original|mmlu:high_school_world_history|5_2023-08-28T20:06:09.731721.parquet'
- '**/details_original|mmlu:human_aging|5_2023-08-28T20:06:09.731721.parquet'
- '**/details_original|mmlu:human_sexuality|5_2023-08-28T20:06:09.731721.parquet'
- '**/details_original|mmlu:international_law|5_2023-08-28T20:06:09.731721.parquet'
- '**/details_original|mmlu:jurisprudence|5_2023-08-28T20:06:09.731721.parquet'
- '**/details_original|mmlu:logical_fallacies|5_2023-08-28T20:06:09.731721.parquet'
- '**/details_original|mmlu:machine_learning|5_2023-08-28T20:06:09.731721.parquet'
- '**/details_original|mmlu:management|5_2023-08-28T20:06:09.731721.parquet'
- '**/details_original|mmlu:marketing|5_2023-08-28T20:06:09.731721.parquet'
- '**/details_original|mmlu:medical_genetics|5_2023-08-28T20:06:09.731721.parquet'
- '**/details_original|mmlu:miscellaneous|5_2023-08-28T20:06:09.731721.parquet'
- '**/details_original|mmlu:moral_disputes|5_2023-08-28T20:06:09.731721.parquet'
- '**/details_original|mmlu:moral_scenarios|5_2023-08-28T20:06:09.731721.parquet'
- '**/details_original|mmlu:nutrition|5_2023-08-28T20:06:09.731721.parquet'
- '**/details_original|mmlu:philosophy|5_2023-08-28T20:06:09.731721.parquet'
- '**/details_original|mmlu:prehistory|5_2023-08-28T20:06:09.731721.parquet'
- '**/details_original|mmlu:professional_accounting|5_2023-08-28T20:06:09.731721.parquet'
- '**/details_original|mmlu:professional_law|5_2023-08-28T20:06:09.731721.parquet'
- '**/details_original|mmlu:professional_medicine|5_2023-08-28T20:06:09.731721.parquet'
- '**/details_original|mmlu:professional_psychology|5_2023-08-28T20:06:09.731721.parquet'
- '**/details_original|mmlu:public_relations|5_2023-08-28T20:06:09.731721.parquet'
- '**/details_original|mmlu:security_studies|5_2023-08-28T20:06:09.731721.parquet'
- '**/details_original|mmlu:sociology|5_2023-08-28T20:06:09.731721.parquet'
- '**/details_original|mmlu:us_foreign_policy|5_2023-08-28T20:06:09.731721.parquet'
- '**/details_original|mmlu:virology|5_2023-08-28T20:06:09.731721.parquet'
- '**/details_original|mmlu:world_religions|5_2023-08-28T20:06:09.731721.parquet'
- config_name: original_mmlu_abstract_algebra_5
data_files:
- split: 2023_08_28T20_06_09.731721
path:
- '**/details_original|mmlu:abstract_algebra|5_2023-08-28T20:06:09.731721.parquet'
- split: latest
path:
- '**/details_original|mmlu:abstract_algebra|5_2023-08-28T20:06:09.731721.parquet'
- config_name: original_mmlu_anatomy_5
data_files:
- split: 2023_08_28T20_06_09.731721
path:
- '**/details_original|mmlu:anatomy|5_2023-08-28T20:06:09.731721.parquet'
- split: latest
path:
- '**/details_original|mmlu:anatomy|5_2023-08-28T20:06:09.731721.parquet'
- config_name: original_mmlu_astronomy_5
data_files:
- split: 2023_08_28T20_06_09.731721
path:
- '**/details_original|mmlu:astronomy|5_2023-08-28T20:06:09.731721.parquet'
- split: latest
path:
- '**/details_original|mmlu:astronomy|5_2023-08-28T20:06:09.731721.parquet'
- config_name: original_mmlu_business_ethics_5
data_files:
- split: 2023_08_28T20_06_09.731721
path:
- '**/details_original|mmlu:business_ethics|5_2023-08-28T20:06:09.731721.parquet'
- split: latest
path:
- '**/details_original|mmlu:business_ethics|5_2023-08-28T20:06:09.731721.parquet'
- config_name: original_mmlu_clinical_knowledge_5
data_files:
- split: 2023_08_28T20_06_09.731721
path:
- '**/details_original|mmlu:clinical_knowledge|5_2023-08-28T20:06:09.731721.parquet'
- split: latest
path:
- '**/details_original|mmlu:clinical_knowledge|5_2023-08-28T20:06:09.731721.parquet'
- config_name: original_mmlu_college_biology_5
data_files:
- split: 2023_08_28T20_06_09.731721
path:
- '**/details_original|mmlu:college_biology|5_2023-08-28T20:06:09.731721.parquet'
- split: latest
path:
- '**/details_original|mmlu:college_biology|5_2023-08-28T20:06:09.731721.parquet'
- config_name: original_mmlu_college_chemistry_5
data_files:
- split: 2023_08_28T20_06_09.731721
path:
- '**/details_original|mmlu:college_chemistry|5_2023-08-28T20:06:09.731721.parquet'
- split: latest
path:
- '**/details_original|mmlu:college_chemistry|5_2023-08-28T20:06:09.731721.parquet'
- config_name: original_mmlu_college_computer_science_5
data_files:
- split: 2023_08_28T20_06_09.731721
path:
- '**/details_original|mmlu:college_computer_science|5_2023-08-28T20:06:09.731721.parquet'
- split: latest
path:
- '**/details_original|mmlu:college_computer_science|5_2023-08-28T20:06:09.731721.parquet'
- config_name: original_mmlu_college_mathematics_5
data_files:
- split: 2023_08_28T20_06_09.731721
path:
- '**/details_original|mmlu:college_mathematics|5_2023-08-28T20:06:09.731721.parquet'
- split: latest
path:
- '**/details_original|mmlu:college_mathematics|5_2023-08-28T20:06:09.731721.parquet'
- config_name: original_mmlu_college_medicine_5
data_files:
- split: 2023_08_28T20_06_09.731721
path:
- '**/details_original|mmlu:college_medicine|5_2023-08-28T20:06:09.731721.parquet'
- split: latest
path:
- '**/details_original|mmlu:college_medicine|5_2023-08-28T20:06:09.731721.parquet'
- config_name: original_mmlu_college_physics_5
data_files:
- split: 2023_08_28T20_06_09.731721
path:
- '**/details_original|mmlu:college_physics|5_2023-08-28T20:06:09.731721.parquet'
- split: latest
path:
- '**/details_original|mmlu:college_physics|5_2023-08-28T20:06:09.731721.parquet'
- config_name: original_mmlu_computer_security_5
data_files:
- split: 2023_08_28T20_06_09.731721
path:
- '**/details_original|mmlu:computer_security|5_2023-08-28T20:06:09.731721.parquet'
- split: latest
path:
- '**/details_original|mmlu:computer_security|5_2023-08-28T20:06:09.731721.parquet'
- config_name: original_mmlu_conceptual_physics_5
data_files:
- split: 2023_08_28T20_06_09.731721
path:
- '**/details_original|mmlu:conceptual_physics|5_2023-08-28T20:06:09.731721.parquet'
- split: latest
path:
- '**/details_original|mmlu:conceptual_physics|5_2023-08-28T20:06:09.731721.parquet'
- config_name: original_mmlu_econometrics_5
data_files:
- split: 2023_08_28T20_06_09.731721
path:
- '**/details_original|mmlu:econometrics|5_2023-08-28T20:06:09.731721.parquet'
- split: latest
path:
- '**/details_original|mmlu:econometrics|5_2023-08-28T20:06:09.731721.parquet'
- config_name: original_mmlu_electrical_engineering_5
data_files:
- split: 2023_08_28T20_06_09.731721
path:
- '**/details_original|mmlu:electrical_engineering|5_2023-08-28T20:06:09.731721.parquet'
- split: latest
path:
- '**/details_original|mmlu:electrical_engineering|5_2023-08-28T20:06:09.731721.parquet'
- config_name: original_mmlu_elementary_mathematics_5
data_files:
- split: 2023_08_28T20_06_09.731721
path:
- '**/details_original|mmlu:elementary_mathematics|5_2023-08-28T20:06:09.731721.parquet'
- split: latest
path:
- '**/details_original|mmlu:elementary_mathematics|5_2023-08-28T20:06:09.731721.parquet'
- config_name: original_mmlu_formal_logic_5
data_files:
- split: 2023_08_28T20_06_09.731721
path:
- '**/details_original|mmlu:formal_logic|5_2023-08-28T20:06:09.731721.parquet'
- split: latest
path:
- '**/details_original|mmlu:formal_logic|5_2023-08-28T20:06:09.731721.parquet'
- config_name: original_mmlu_global_facts_5
data_files:
- split: 2023_08_28T20_06_09.731721
path:
- '**/details_original|mmlu:global_facts|5_2023-08-28T20:06:09.731721.parquet'
- split: latest
path:
- '**/details_original|mmlu:global_facts|5_2023-08-28T20:06:09.731721.parquet'
- config_name: original_mmlu_high_school_biology_5
data_files:
- split: 2023_08_28T20_06_09.731721
path:
- '**/details_original|mmlu:high_school_biology|5_2023-08-28T20:06:09.731721.parquet'
- split: latest
path:
- '**/details_original|mmlu:high_school_biology|5_2023-08-28T20:06:09.731721.parquet'
- config_name: original_mmlu_high_school_chemistry_5
data_files:
- split: 2023_08_28T20_06_09.731721
path:
- '**/details_original|mmlu:high_school_chemistry|5_2023-08-28T20:06:09.731721.parquet'
- split: latest
path:
- '**/details_original|mmlu:high_school_chemistry|5_2023-08-28T20:06:09.731721.parquet'
- config_name: original_mmlu_high_school_computer_science_5
data_files:
- split: 2023_08_28T20_06_09.731721
path:
- '**/details_original|mmlu:high_school_computer_science|5_2023-08-28T20:06:09.731721.parquet'
- split: latest
path:
- '**/details_original|mmlu:high_school_computer_science|5_2023-08-28T20:06:09.731721.parquet'
- config_name: original_mmlu_high_school_european_history_5
data_files:
- split: 2023_08_28T20_06_09.731721
path:
- '**/details_original|mmlu:high_school_european_history|5_2023-08-28T20:06:09.731721.parquet'
- split: latest
path:
- '**/details_original|mmlu:high_school_european_history|5_2023-08-28T20:06:09.731721.parquet'
- config_name: original_mmlu_high_school_geography_5
data_files:
- split: 2023_08_28T20_06_09.731721
path:
- '**/details_original|mmlu:high_school_geography|5_2023-08-28T20:06:09.731721.parquet'
- split: latest
path:
- '**/details_original|mmlu:high_school_geography|5_2023-08-28T20:06:09.731721.parquet'
- config_name: original_mmlu_high_school_government_and_politics_5
data_files:
- split: 2023_08_28T20_06_09.731721
path:
- '**/details_original|mmlu:high_school_government_and_politics|5_2023-08-28T20:06:09.731721.parquet'
- split: latest
path:
- '**/details_original|mmlu:high_school_government_and_politics|5_2023-08-28T20:06:09.731721.parquet'
- config_name: original_mmlu_high_school_macroeconomics_5
data_files:
- split: 2023_08_28T20_06_09.731721
path:
- '**/details_original|mmlu:high_school_macroeconomics|5_2023-08-28T20:06:09.731721.parquet'
- split: latest
path:
- '**/details_original|mmlu:high_school_macroeconomics|5_2023-08-28T20:06:09.731721.parquet'
- config_name: original_mmlu_high_school_mathematics_5
data_files:
- split: 2023_08_28T20_06_09.731721
path:
- '**/details_original|mmlu:high_school_mathematics|5_2023-08-28T20:06:09.731721.parquet'
- split: latest
path:
- '**/details_original|mmlu:high_school_mathematics|5_2023-08-28T20:06:09.731721.parquet'
- config_name: original_mmlu_high_school_microeconomics_5
data_files:
- split: 2023_08_28T20_06_09.731721
path:
- '**/details_original|mmlu:high_school_microeconomics|5_2023-08-28T20:06:09.731721.parquet'
- split: latest
path:
- '**/details_original|mmlu:high_school_microeconomics|5_2023-08-28T20:06:09.731721.parquet'
- config_name: original_mmlu_high_school_physics_5
data_files:
- split: 2023_08_28T20_06_09.731721
path:
- '**/details_original|mmlu:high_school_physics|5_2023-08-28T20:06:09.731721.parquet'
- split: latest
path:
- '**/details_original|mmlu:high_school_physics|5_2023-08-28T20:06:09.731721.parquet'
- config_name: original_mmlu_high_school_psychology_5
data_files:
- split: 2023_08_28T20_06_09.731721
path:
- '**/details_original|mmlu:high_school_psychology|5_2023-08-28T20:06:09.731721.parquet'
- split: latest
path:
- '**/details_original|mmlu:high_school_psychology|5_2023-08-28T20:06:09.731721.parquet'
- config_name: original_mmlu_high_school_statistics_5
data_files:
- split: 2023_08_28T20_06_09.731721
path:
- '**/details_original|mmlu:high_school_statistics|5_2023-08-28T20:06:09.731721.parquet'
- split: latest
path:
- '**/details_original|mmlu:high_school_statistics|5_2023-08-28T20:06:09.731721.parquet'
- config_name: original_mmlu_high_school_us_history_5
data_files:
- split: 2023_08_28T20_06_09.731721
path:
- '**/details_original|mmlu:high_school_us_history|5_2023-08-28T20:06:09.731721.parquet'
- split: latest
path:
- '**/details_original|mmlu:high_school_us_history|5_2023-08-28T20:06:09.731721.parquet'
- config_name: original_mmlu_high_school_world_history_5
data_files:
- split: 2023_08_28T20_06_09.731721
path:
- '**/details_original|mmlu:high_school_world_history|5_2023-08-28T20:06:09.731721.parquet'
- split: latest
path:
- '**/details_original|mmlu:high_school_world_history|5_2023-08-28T20:06:09.731721.parquet'
- config_name: original_mmlu_human_aging_5
data_files:
- split: 2023_08_28T20_06_09.731721
path:
- '**/details_original|mmlu:human_aging|5_2023-08-28T20:06:09.731721.parquet'
- split: latest
path:
- '**/details_original|mmlu:human_aging|5_2023-08-28T20:06:09.731721.parquet'
- config_name: original_mmlu_human_sexuality_5
data_files:
- split: 2023_08_28T20_06_09.731721
path:
- '**/details_original|mmlu:human_sexuality|5_2023-08-28T20:06:09.731721.parquet'
- split: latest
path:
- '**/details_original|mmlu:human_sexuality|5_2023-08-28T20:06:09.731721.parquet'
- config_name: original_mmlu_international_law_5
data_files:
- split: 2023_08_28T20_06_09.731721
path:
- '**/details_original|mmlu:international_law|5_2023-08-28T20:06:09.731721.parquet'
- split: latest
path:
- '**/details_original|mmlu:international_law|5_2023-08-28T20:06:09.731721.parquet'
- config_name: original_mmlu_jurisprudence_5
data_files:
- split: 2023_08_28T20_06_09.731721
path:
- '**/details_original|mmlu:jurisprudence|5_2023-08-28T20:06:09.731721.parquet'
- split: latest
path:
- '**/details_original|mmlu:jurisprudence|5_2023-08-28T20:06:09.731721.parquet'
- config_name: original_mmlu_logical_fallacies_5
data_files:
- split: 2023_08_28T20_06_09.731721
path:
- '**/details_original|mmlu:logical_fallacies|5_2023-08-28T20:06:09.731721.parquet'
- split: latest
path:
- '**/details_original|mmlu:logical_fallacies|5_2023-08-28T20:06:09.731721.parquet'
- config_name: original_mmlu_machine_learning_5
data_files:
- split: 2023_08_28T20_06_09.731721
path:
- '**/details_original|mmlu:machine_learning|5_2023-08-28T20:06:09.731721.parquet'
- split: latest
path:
- '**/details_original|mmlu:machine_learning|5_2023-08-28T20:06:09.731721.parquet'
- config_name: original_mmlu_management_5
data_files:
- split: 2023_08_28T20_06_09.731721
path:
- '**/details_original|mmlu:management|5_2023-08-28T20:06:09.731721.parquet'
- split: latest
path:
- '**/details_original|mmlu:management|5_2023-08-28T20:06:09.731721.parquet'
- config_name: original_mmlu_marketing_5
data_files:
- split: 2023_08_28T20_06_09.731721
path:
- '**/details_original|mmlu:marketing|5_2023-08-28T20:06:09.731721.parquet'
- split: latest
path:
- '**/details_original|mmlu:marketing|5_2023-08-28T20:06:09.731721.parquet'
- config_name: original_mmlu_medical_genetics_5
data_files:
- split: 2023_08_28T20_06_09.731721
path:
- '**/details_original|mmlu:medical_genetics|5_2023-08-28T20:06:09.731721.parquet'
- split: latest
path:
- '**/details_original|mmlu:medical_genetics|5_2023-08-28T20:06:09.731721.parquet'
- config_name: original_mmlu_miscellaneous_5
data_files:
- split: 2023_08_28T20_06_09.731721
path:
- '**/details_original|mmlu:miscellaneous|5_2023-08-28T20:06:09.731721.parquet'
- split: latest
path:
- '**/details_original|mmlu:miscellaneous|5_2023-08-28T20:06:09.731721.parquet'
- config_name: original_mmlu_moral_disputes_5
data_files:
- split: 2023_08_28T20_06_09.731721
path:
- '**/details_original|mmlu:moral_disputes|5_2023-08-28T20:06:09.731721.parquet'
- split: latest
path:
- '**/details_original|mmlu:moral_disputes|5_2023-08-28T20:06:09.731721.parquet'
- config_name: original_mmlu_moral_scenarios_5
data_files:
- split: 2023_08_28T20_06_09.731721
path:
- '**/details_original|mmlu:moral_scenarios|5_2023-08-28T20:06:09.731721.parquet'
- split: latest
path:
- '**/details_original|mmlu:moral_scenarios|5_2023-08-28T20:06:09.731721.parquet'
- config_name: original_mmlu_nutrition_5
data_files:
- split: 2023_08_28T20_06_09.731721
path:
- '**/details_original|mmlu:nutrition|5_2023-08-28T20:06:09.731721.parquet'
- split: latest
path:
- '**/details_original|mmlu:nutrition|5_2023-08-28T20:06:09.731721.parquet'
- config_name: original_mmlu_philosophy_5
data_files:
- split: 2023_08_28T20_06_09.731721
path:
- '**/details_original|mmlu:philosophy|5_2023-08-28T20:06:09.731721.parquet'
- split: latest
path:
- '**/details_original|mmlu:philosophy|5_2023-08-28T20:06:09.731721.parquet'
- config_name: original_mmlu_prehistory_5
data_files:
- split: 2023_08_28T20_06_09.731721
path:
- '**/details_original|mmlu:prehistory|5_2023-08-28T20:06:09.731721.parquet'
- split: latest
path:
- '**/details_original|mmlu:prehistory|5_2023-08-28T20:06:09.731721.parquet'
- config_name: original_mmlu_professional_accounting_5
data_files:
- split: 2023_08_28T20_06_09.731721
path:
- '**/details_original|mmlu:professional_accounting|5_2023-08-28T20:06:09.731721.parquet'
- split: latest
path:
- '**/details_original|mmlu:professional_accounting|5_2023-08-28T20:06:09.731721.parquet'
- config_name: original_mmlu_professional_law_5
data_files:
- split: 2023_08_28T20_06_09.731721
path:
- '**/details_original|mmlu:professional_law|5_2023-08-28T20:06:09.731721.parquet'
- split: latest
path:
- '**/details_original|mmlu:professional_law|5_2023-08-28T20:06:09.731721.parquet'
- config_name: original_mmlu_professional_medicine_5
data_files:
- split: 2023_08_28T20_06_09.731721
path:
- '**/details_original|mmlu:professional_medicine|5_2023-08-28T20:06:09.731721.parquet'
- split: latest
path:
- '**/details_original|mmlu:professional_medicine|5_2023-08-28T20:06:09.731721.parquet'
- config_name: original_mmlu_professional_psychology_5
data_files:
- split: 2023_08_28T20_06_09.731721
path:
- '**/details_original|mmlu:professional_psychology|5_2023-08-28T20:06:09.731721.parquet'
- split: latest
path:
- '**/details_original|mmlu:professional_psychology|5_2023-08-28T20:06:09.731721.parquet'
- config_name: original_mmlu_public_relations_5
data_files:
- split: 2023_08_28T20_06_09.731721
path:
- '**/details_original|mmlu:public_relations|5_2023-08-28T20:06:09.731721.parquet'
- split: latest
path:
- '**/details_original|mmlu:public_relations|5_2023-08-28T20:06:09.731721.parquet'
- config_name: original_mmlu_security_studies_5
data_files:
- split: 2023_08_28T20_06_09.731721
path:
- '**/details_original|mmlu:security_studies|5_2023-08-28T20:06:09.731721.parquet'
- split: latest
path:
- '**/details_original|mmlu:security_studies|5_2023-08-28T20:06:09.731721.parquet'
- config_name: original_mmlu_sociology_5
data_files:
- split: 2023_08_28T20_06_09.731721
path:
- '**/details_original|mmlu:sociology|5_2023-08-28T20:06:09.731721.parquet'
- split: latest
path:
- '**/details_original|mmlu:sociology|5_2023-08-28T20:06:09.731721.parquet'
- config_name: original_mmlu_us_foreign_policy_5
data_files:
- split: 2023_08_28T20_06_09.731721
path:
- '**/details_original|mmlu:us_foreign_policy|5_2023-08-28T20:06:09.731721.parquet'
- split: latest
path:
- '**/details_original|mmlu:us_foreign_policy|5_2023-08-28T20:06:09.731721.parquet'
- config_name: original_mmlu_virology_5
data_files:
- split: 2023_08_28T20_06_09.731721
path:
- '**/details_original|mmlu:virology|5_2023-08-28T20:06:09.731721.parquet'
- split: latest
path:
- '**/details_original|mmlu:virology|5_2023-08-28T20:06:09.731721.parquet'
- config_name: original_mmlu_world_religions_5
data_files:
- split: 2023_08_28T20_06_09.731721
path:
- '**/details_original|mmlu:world_religions|5_2023-08-28T20:06:09.731721.parquet'
- split: latest
path:
- '**/details_original|mmlu:world_religions|5_2023-08-28T20:06:09.731721.parquet'
- config_name: results
data_files:
- split: 2023_08_19T23_03_51.753289
path:
- results_2023-08-19T23:03:51.753289.parquet
- split: 2023_08_23T17_40_29.405074
path:
- results_2023-08-23T17:40:29.405074.parquet
- split: 2023_08_28T20_06_09.731721
path:
- results_2023-08-28T20:06:09.731721.parquet
- split: 2023_09_16T23_44_55.901768
path:
- results_2023-09-16T23-44-55.901768.parquet
- split: latest
path:
- results_2023-09-16T23-44-55.901768.parquet
---
# Dataset Card for Evaluation run of huggyllama/llama-30b
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/huggyllama/llama-30b
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [huggyllama/llama-30b](https://huggingface.co/huggyllama/llama-30b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 122 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 4 runs. Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_huggyllama__llama-30b",
"harness_winogrande_5",
split="train")
```
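Each timestamped run is also exposed as its own split, so a specific run can be loaded instead of the latest one. A minimal sketch, using a run timestamp and configuration listed in this card's YAML header:
```python
from datasets import load_dataset

# Split names are run timestamps with ":" and "-" replaced by "_"
# (see the configuration list above); "latest" always aliases the newest run.
data = load_dataset("open-llm-leaderboard/details_huggyllama__llama-30b",
	"harness_hendrycksTest_world_religions_5",
	split="2023_08_19T23_03_51.753289")
```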
## Latest results
These are the [latest results from run 2023-09-16T23:44:55.901768](https://huggingface.co/datasets/open-llm-leaderboard/details_huggyllama__llama-30b/blob/main/results_2023-09-16T23-44-55.901768.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"em": 0.0014681208053691276,
"em_stderr": 0.0003921042190298701,
"f1": 0.06332634228187943,
"f1_stderr": 0.0013742294190200051,
"acc": 0.47445656434133393,
"acc_stderr": 0.010516415781576863
},
"harness|drop|3": {
"em": 0.0014681208053691276,
"em_stderr": 0.0003921042190298701,
"f1": 0.06332634228187943,
"f1_stderr": 0.0013742294190200051
},
"harness|gsm8k|5": {
"acc": 0.14859742228961334,
"acc_stderr": 0.009797503180527876
},
"harness|winogrande|5": {
"acc": 0.8003157063930545,
"acc_stderr": 0.011235328382625849
}
}
```
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
BPHoops/futurama_fry | ---
license: apache-2.0
---
|
CyberHarem/pozemka_arknights | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of pozemka/パゼオンカ/鸿雪 (Arknights)
This is the dataset of pozemka/パゼオンカ/鸿雪 (Arknights), containing 376 images and their tags.
The core tags of this character are `animal_ears, pink_hair, long_hair, wolf_ears, wolf_girl, breasts, pink_eyes, very_long_hair, animal_ear_fluff, hair_ornament, braid, hair_intakes, large_breasts, extra_ears, medium_breasts, red_eyes`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan, ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:-------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 376 | 811.78 MiB | [Download](https://huggingface.co/datasets/CyberHarem/pozemka_arknights/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 1200 | 376 | 649.63 MiB | [Download](https://huggingface.co/datasets/CyberHarem/pozemka_arknights/resolve/main/dataset-1200.zip) | IMG+TXT | Dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 967 | 1.24 GiB | [Download](https://huggingface.co/datasets/CyberHarem/pozemka_arknights/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need it, just run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/pozemka_arknights',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering results; some outfits may be mined from here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 9 |  |  |  |  |  | 1girl, black_bra, black_gloves, cleavage, high-waist_skirt, looking_at_viewer, red_skirt, simple_background, solo, white_background, elbow_gloves, bare_shoulders, hair_between_eyes, cowboy_shot, hand_up, upper_body |
| 1 | 5 |  |  |  |  |  | 1girl, black_bra, black_gloves, cleavage, cowboy_shot, elbow_gloves, fingerless_gloves, hand_up, high-waist_skirt, holding_envelope, looking_at_viewer, red_skirt, solo, simple_background, thigh_strap, white_background, bare_shoulders, hair_between_eyes, closed_mouth, parted_lips |
| 2 | 9 |  |  |  |  |  | 1girl, black_bra, black_gloves, cleavage, high-waist_skirt, holding_envelope, looking_at_viewer, outdoors, red_skirt, solo, blue_sky, day, bare_shoulders, cowboy_shot, elbow_gloves, thigh_strap, cloud, fingerless_gloves, hand_up, flower |
| 3 | 7 |  |  |  |  |  | 1girl, cleavage, holding_cup, long_sleeves, mug, official_alternate_costume, solo, black_dress, looking_at_viewer, smile, white_hairband, upper_body |
| 4 | 5 |  |  |  |  |  | 1girl, black_dress, cleavage, long_sleeves, looking_at_viewer, simple_background, sitting, solo, white_pantyhose, boots, thigh_strap, white_background, white_footwear, official_alternate_costume, wolf_tail, arm_support, jacket, mug, smile |
| 5 | 6 |  |  |  |  |  | 1boy, 1girl, blush, hetero, sex, solo_focus, navel, nipples, penis, pussy, vaginal, wolf_tail, bar_censor, black_gloves, completely_nude, elbow_gloves, thigh_strap, clenched_teeth, girl_on_top, looking_at_viewer, pov, spread_legs, straddling |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | black_bra | black_gloves | cleavage | high-waist_skirt | looking_at_viewer | red_skirt | simple_background | solo | white_background | elbow_gloves | bare_shoulders | hair_between_eyes | cowboy_shot | hand_up | upper_body | fingerless_gloves | holding_envelope | thigh_strap | closed_mouth | parted_lips | outdoors | blue_sky | day | cloud | flower | holding_cup | long_sleeves | mug | official_alternate_costume | black_dress | smile | white_hairband | sitting | white_pantyhose | boots | white_footwear | wolf_tail | arm_support | jacket | 1boy | blush | hetero | sex | solo_focus | navel | nipples | penis | pussy | vaginal | bar_censor | completely_nude | clenched_teeth | girl_on_top | pov | spread_legs | straddling |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:------------|:---------------|:-----------|:-------------------|:--------------------|:------------|:--------------------|:-------|:-------------------|:---------------|:-----------------|:--------------------|:--------------|:----------|:-------------|:--------------------|:-------------------|:--------------|:---------------|:--------------|:-----------|:-----------|:------|:--------|:---------|:--------------|:---------------|:------|:-----------------------------|:--------------|:--------|:-----------------|:----------|:------------------|:--------|:-----------------|:------------|:--------------|:---------|:-------|:--------|:---------|:------|:-------------|:--------|:----------|:--------|:--------|:----------|:-------------|:------------------|:-----------------|:--------------|:------|:--------------|:-------------|
| 0 | 9 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 1 | 5 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 2 | 9 |  |  |  |  |  | X | X | X | X | X | X | X | | X | | X | X | | X | X | | X | X | X | | | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 3 | 7 |  |  |  |  |  | X | | | X | | X | | | X | | | | | | | X | | | | | | | | | | | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | |
| 4 | 5 |  |  |  |  |  | X | | | X | | X | | X | X | X | | | | | | | | | X | | | | | | | | | X | X | X | X | X | | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | |
| 5 | 6 |  |  |  |  |  | X | | X | | | X | | | | | X | | | | | | | | X | | | | | | | | | | | | | | | | | | | X | | | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X |
|
stoddur/referral_commands | ---
dataset_info:
features:
- name: input_ids
sequence: int32
- name: labels
sequence: int64
splits:
- name: train
num_bytes: 1544000
num_examples: 1000
- name: eval
num_bytes: 1544000
num_examples: 1000
download_size: 188692
dataset_size: 3088000
---
# Dataset Card for "referral_commands"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
OnAnOrange/claude_explanation_augmented_pun_detection | ---
dataset_info:
features:
- name: ID
dtype: string
- name: EXPL
dtype: string
- name: TEXT
dtype: string
- name: CHOICE
dtype: int64
splits:
- name: train
num_bytes: 579003.1761412575
num_examples: 1625
- name: eval
num_bytes: 82663.83807062877
num_examples: 232
- name: test
num_bytes: 165683.9857881137
num_examples: 465
download_size: 430354
dataset_size: 827351.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: eval
path: data/eval-*
- split: test
path: data/test-*
---
|
prit1205/call_home_sample | ---
dataset_info:
features:
- name: audio
dtype: audio
splits:
- name: train
num_bytes: 235351737.0
num_examples: 8
download_size: 186021031
dataset_size: 235351737.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "call_home_sample"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
hbfreed/Picklebot-50K | ---
license: mit
task_categories:
- video-classification
tags:
- baseball
- sports
- video-classification
- computer-vision
size_categories:
- 10K<n<100K
---
# Dataset Card for Picklebot50k
<!-- Provide a quick summary of the dataset. -->
50 thousand video clips of balls and strikes from MLB games, spanning the 2016 season through the 2022 season.

## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
The dataset consists of roughly 50 thousand video clips of balls and strikes in .mp4 format, resized to 224x224 resolution.
The calculated standard deviation and mean for the dataset are:
- std: (0.2104, 0.1986, 0.1829)
- mean: (0.3939, 0.3817, 0.3314)
- **Curated by:** Henry Freed
- **License:** MIT
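The std/mean values above are in a convenient form for input normalization. A minimal sketch of applying them with torchvision, assuming RGB channel order and CHW float tensors in [0, 1] (this snippet is illustrative, not part of the dataset):
```python
import torch
from torchvision.transforms import Normalize

# Per-channel statistics from this card (RGB order assumed).
normalize = Normalize(mean=(0.3939, 0.3817, 0.3314), std=(0.2104, 0.1986, 0.1829))

frame = torch.rand(3, 224, 224)  # placeholder frame tensor in [0, 1]
frame = normalize(frame)
```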
### Dataset Sources
<!-- Provide the basic links for the dataset. -->
- **Repository:** The original project that this dataset was compiled for can be found here on [github](https://github.com/hbfreed/Picklebot).
- **Demo:** The demo for a neural net trained on this dataset can be found here on [huggingface spaces](https://huggingface.co/spaces/hbfreed/picklebot_demo).
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
The dataset was originally collected to call balls and strikes using neural networks. There are many other potential use cases, but they would almost certainly require relabeling. For more videos and more complete information about each pitch, see [Picklebot-2M](https://huggingface.co/datasets/hbfreed/Picklebot-2M).
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
The dataset is structured as .tar files of the train, val, and test splits. The labels are contained in .csv files, with one row per clip in the form `"filename.mp4",label`, where the label is 0 for balls and 1 for strikes.
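A labels file can then be read with pandas, for instance (a sketch only; the CSV filename here is a placeholder, and the files are assumed to have no header row):
```python
import pandas as pd

# "train_labels.csv" is a placeholder -- use the actual labels CSV from the repo.
labels = pd.read_csv("train_labels.csv", header=None, names=["filename", "label"])

# label: 0 = ball, 1 = strike
print(labels["label"].value_counts())
```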
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
The source data were scraped from Baseball Savant's [Statcast Search](https://baseballsavant.mlb.com/statcast_search). It's a pretty powerful search page, and a lot of fun to play around with.
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
After download, the videos were cropped from 1280x720 at 60 fps to the middle 600x600 pixels, then downsampled to 224x224 resolution at 15 fps; this can all be done with a single ffmpeg command. Some of the longer clips that contained a lot of noise (shots of the crowd, instant replays, etc.) were trimmed, mostly by hand, to a more manageable length.
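The exact command the author used is not documented, but a plausible reconstruction of that preprocessing step (center crop, resize, frame-rate drop), driven from Python, looks like this:

```python
import subprocess

def preprocess(src: str, dst: str) -> None:
    """Center-crop to 600x600, resize to 224x224, and drop to 15 fps."""
    subprocess.run(
        [
            "ffmpeg", "-i", src,
            # crop defaults to the frame center when x/y are omitted
            "-vf", "crop=600:600,scale=224:224,fps=15",
            dst,
        ],
        check=True,
    )

preprocess("raw_clip.mp4", "clip_224.mp4")  # placeholder filenames
```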
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[Baseball Savant](https://baseballsavant.mlb.com/) and MLB/the broadcasters originally created the videos.
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
It's important to note that only balls and called strikes were collected; no swinging strikes, foul balls, hit-by-pitches, or other outcomes are included in the dataset. Additionally, most pitchers and batters are right-handed, and nothing was done to balance handedness in this dataset. |
fia24/sentence_lemma_10000 | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
dataset_info:
features:
- name: id
dtype: string
- name: translation
struct:
- name: en
dtype: string
- name: fr
dtype: string
splits:
- name: train
num_bytes: 3274843
num_examples: 8000
- name: test
num_bytes: 1121104
num_examples: 3074
download_size: 1884093
dataset_size: 4395947
---
# Dataset Card for "sentence_lemma_10000"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
indicbench/truthfulqa_te | ---
dataset_info:
- config_name: default
features:
- name: _data_files
list:
- name: filename
dtype: string
- name: _fingerprint
dtype: string
- name: _format_columns
dtype: 'null'
- name: _format_kwargs
dtype: string
- name: _format_type
dtype: 'null'
- name: _output_all_columns
dtype: bool
- name: _split
dtype: 'null'
splits:
- name: train
num_bytes: 119
num_examples: 2
download_size: 3715
dataset_size: 119
- config_name: generation
features:
- name: type
dtype: string
- name: category
dtype: string
- name: question
dtype: string
- name: best_answer
dtype: string
- name: correct_answers
sequence: string
- name: incorrect_answers
sequence: string
- name: source
dtype: string
splits:
- name: validation
num_bytes: 1125011
num_examples: 817
download_size: 354836
dataset_size: 1125011
- config_name: multiple_choice
features:
- name: question
dtype: string
- name: mc1_targets
struct:
- name: choices
sequence: string
- name: labels
sequence: int64
- name: mc2_targets
struct:
- name: choices
sequence: string
- name: labels
sequence: int64
splits:
- name: validation
num_bytes: 1577019
num_examples: 817
download_size: 459908
dataset_size: 1577019
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- config_name: generation
data_files:
- split: validation
path: generation/validation-*
- config_name: multiple_choice
data_files:
- split: validation
path: multiple_choice/validation-*
---
|
Yeva/arm-summary | ---
language:
- hy
annotations_creators:
- other
language_creators:
- other
languages:
- hy-AM
licenses:
- unknown
multilinguality:
- monolingual
pretty_name: arm-sum
size_categories:
- unknown
source_datasets:
- original
task_categories:
- conditional-text-generation
task_ids:
- summarization
--- |
MajdTannous/Test2 | ---
license: other
---
|
bastistrauss/DE_Plain | ---
license: apache-2.0
language:
- de
- en
--- |
liuyanchen1015/VALUE_stsb_got | ---
dataset_info:
features:
- name: sentence1
dtype: string
- name: sentence2
dtype: string
- name: label
dtype: float64
- name: idx
dtype: int64
- name: value_score
dtype: int64
splits:
- name: dev
num_bytes: 8486
num_examples: 44
- name: test
num_bytes: 4738
num_examples: 34
- name: train
num_bytes: 10468
num_examples: 68
download_size: 24423
dataset_size: 23692
---
# Dataset Card for "VALUE_stsb_got"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
anan-2024/twitter_dataset_1713208776 | ---
dataset_info:
features:
- name: id
dtype: string
- name: tweet_content
dtype: string
- name: user_name
dtype: string
- name: user_id
dtype: string
- name: created_at
dtype: string
- name: url
dtype: string
- name: favourite_count
dtype: int64
- name: scraped_at
dtype: string
- name: image_urls
dtype: string
splits:
- name: train
num_bytes: 139970
num_examples: 369
download_size: 77880
dataset_size: 139970
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
Hiraishin/ESCO-FULL-SKILL-EMBEDDING-3072 | ---
license: apache-2.0
---
|
Parth1612/pp_distilbert_ft_tweet_irony | ---
dataset_info:
features:
- name: text
dtype: string
- name: label
dtype:
class_label:
names:
'0': non_irony
'1': irony
- name: input_ids
sequence: int32
- name: attention_mask
sequence: int8
splits:
- name: train
num_bytes: 7608803
num_examples: 2862
- name: test
num_bytes: 2089209
num_examples: 784
- name: validation
num_bytes: 2538457
num_examples: 955
download_size: 569421
dataset_size: 12236469
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
- split: validation
path: data/validation-*
---
|
Quay1k/Differentiation-Accommodations | ---
license: apache-2.0
---
|
yorius96/manga_book_classifier | ---
license: apache-2.0
---
|
lmms-lab/RefCOCO | ---
dataset_info:
features:
- name: question_id
dtype: string
- name: image
dtype: image
- name: question
dtype: string
- name: answer
sequence: string
- name: segmentation
sequence: float32
- name: bbox
sequence: float32
- name: iscrowd
dtype: int8
- name: file_name
dtype: string
splits:
- name: val
num_bytes: 1548717880.0
num_examples: 8811
- name: test
num_bytes: 876787122.0
num_examples: 5000
- name: testA
num_bytes: 340830323.0
num_examples: 1975
- name: testB
num_bytes: 317959580.0
num_examples: 1810
download_size: 2278337287
dataset_size: 3084294905.0
configs:
- config_name: default
data_files:
- split: val
path: data/val-*
- split: test
path: data/test-*
- split: testA
path: data/testA-*
- split: testB
path: data/testB-*
---
<p align="center" width="100%">
<img src="https://i.postimg.cc/g0QRgMVv/WX20240228-113337-2x.png" width="100%" height="80%">
</p>
# Large-scale Multi-modality Models Evaluation Suite
> Accelerating the development of large-scale multi-modality models (LMMs) with `lmms-eval`
🏠 [Homepage](https://lmms-lab.github.io/) | 📚 [Documentation](docs/README.md) | 🤗 [Huggingface Datasets](https://huggingface.co/lmms-lab)
# This Dataset
This is a formatted version of [RefCOCO](https://github.com/lichengunc/refer). It is used in our `lmms-eval` pipeline to allow for one-click evaluations of large multi-modality models.
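For instance, a split can be loaded with 🤗 Datasets as follows (the split names are taken from the YAML header above):

```python
from datasets import load_dataset

# "val" is one of the splits declared in this repo (val/test/testA/testB)
data = load_dataset("lmms-lab/RefCOCO", split="val")
```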
```
@inproceedings{kazemzadeh-etal-2014-referitgame,
title = "{R}efer{I}t{G}ame: Referring to Objects in Photographs of Natural Scenes",
author = "Kazemzadeh, Sahar and
Ordonez, Vicente and
Matten, Mark and
Berg, Tamara",
editor = "Moschitti, Alessandro and
Pang, Bo and
Daelemans, Walter",
booktitle = "Proceedings of the 2014 Conference on Empirical Methods in Natural Language Processing ({EMNLP})",
month = oct,
year = "2014",
address = "Doha, Qatar",
publisher = "Association for Computational Linguistics",
url = "https://aclanthology.org/D14-1086",
doi = "10.3115/v1/D14-1086",
pages = "787--798",
}
``` |
CyberHarem/luna_kusami_areyoutheonlyonewholovesme | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of Luna Kusami/草見月 (Are you the only one who loves me?)
This is the dataset of Luna Kusami/草見月 (Are you the only one who loves me?), containing 131 images and their tags.
The core tags of this character are `blue_hair, braid, blue_eyes, short_hair, hair_ornament, hair_flower, ribbon, side_braid, neck_ribbon, hair_between_eyes`, which are pruned in this dataset.
Images are crawled from many sites (e.g. Danbooru, Pixiv, Zerochan, ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([Hugging Face organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:----------------------------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 131 | 81.37 MiB | [Download](https://huggingface.co/datasets/CyberHarem/luna_kusami_areyoutheonlyonewholovesme/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 1200 | 131 | 81.32 MiB | [Download](https://huggingface.co/datasets/CyberHarem/luna_kusami_areyoutheonlyonewholovesme/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 220 | 128.18 MiB | [Download](https://huggingface.co/datasets/CyberHarem/luna_kusami_areyoutheonlyonewholovesme/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for loading with [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html). If you need it, just run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/luna_kusami_areyoutheonlyonewholovesme',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering results; some outfits may be mined from these clusters.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:-------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 13 |  |  |  |  |  | 1girl, cloud, day, flower, solo, outdoors, school_uniform, single_braid, blue_sky, looking_at_viewer, portrait |
| 1 | 13 |  |  |  |  |  | 1girl, flower, looking_at_viewer, solo, single_braid, school_uniform, upper_body, blush, open_mouth |
| 2 | 8 |  |  |  |  |  | 2girls, blush, flower, pink_hair, school_uniform, solo_focus, long_hair, looking_at_viewer, 3girls |
| 3 | 9 |  |  |  |  |  | plaid_skirt, school_uniform, short_sleeves, 1girl, red_skirt, pleated_skirt, bookshelf, medium_breasts, white_shirt, flower, holding, single_braid, arm_up, library, long_hair, solo_focus |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | cloud | day | flower | solo | outdoors | school_uniform | single_braid | blue_sky | looking_at_viewer | portrait | upper_body | blush | open_mouth | 2girls | pink_hair | solo_focus | long_hair | 3girls | plaid_skirt | short_sleeves | red_skirt | pleated_skirt | bookshelf | medium_breasts | white_shirt | holding | arm_up | library |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:--------|:------|:---------|:-------|:-----------|:-----------------|:---------------|:-----------|:--------------------|:-----------|:-------------|:--------|:-------------|:---------|:------------|:-------------|:------------|:---------|:--------------|:----------------|:------------|:----------------|:------------|:-----------------|:--------------|:----------|:---------|:----------|
| 0 | 13 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | |
| 1 | 13 |  |  |  |  |  | X | | | X | X | | X | X | | X | | X | X | X | | | | | | | | | | | | | | | |
| 2 | 8 |  |  |  |  |  | | | | X | | | X | | | X | | | X | | X | X | X | X | X | | | | | | | | | | |
| 3 | 9 |  |  |  |  |  | X | | | X | | | X | X | | | | | | | | | X | X | | X | X | X | X | X | X | X | X | X | X |
|
MarPaama/iconset | ---
license: apache-2.0
---
|
freshpearYoon/v3_train_free_concat_36 | ---
dataset_info:
features:
- name: input_features
sequence:
sequence: float32
- name: labels
sequence: int64
splits:
- name: train
num_bytes: 3842500200
num_examples: 2500
download_size: 1823871729
dataset_size: 3842500200
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
AdapterOcean/data-standardized_cluster_21_alpaca | ---
dataset_info:
features:
- name: input
dtype: string
- name: output
dtype: string
splits:
- name: train
num_bytes: 9415114
num_examples: 4730
download_size: 3969543
dataset_size: 9415114
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "data-standardized_cluster_21_alpaca"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
open-llm-leaderboard/details_diffnamehard__Mistral-CatMacaroni-slerp-uncensored | ---
pretty_name: Evaluation run of diffnamehard/Mistral-CatMacaroni-slerp-uncensored
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [diffnamehard/Mistral-CatMacaroni-slerp-uncensored](https://huggingface.co/diffnamehard/Mistral-CatMacaroni-slerp-uncensored)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of\
\ the evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can\
\ be found as a specific split in each configuration, the split being named using\
\ the timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_diffnamehard__Mistral-CatMacaroni-slerp-uncensored\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-12-29T10:24:03.109443](https://huggingface.co/datasets/open-llm-leaderboard/details_diffnamehard__Mistral-CatMacaroni-slerp-uncensored/blob/main/results_2023-12-29T10-24-03.109443.json)\
\ (note that there might be results for other tasks in the repos if successive evals\
\ didn't cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6281191061942026,\n\
\ \"acc_stderr\": 0.03270826089766551,\n \"acc_norm\": 0.6305168591291603,\n\
\ \"acc_norm_stderr\": 0.03336918504627038,\n \"mc1\": 0.4039167686658507,\n\
\ \"mc1_stderr\": 0.01717727682258428,\n \"mc2\": 0.5687346538981385,\n\
\ \"mc2_stderr\": 0.015485344488808075\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.5998293515358362,\n \"acc_stderr\": 0.014317197787809167,\n\
\ \"acc_norm\": 0.6424914675767918,\n \"acc_norm_stderr\": 0.014005494275916576\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6401115315674168,\n\
\ \"acc_stderr\": 0.004789865379084514,\n \"acc_norm\": 0.8408683529177454,\n\
\ \"acc_norm_stderr\": 0.0036505121583062755\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.33,\n \"acc_stderr\": 0.04725815626252606,\n \
\ \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.04725815626252606\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6148148148148148,\n\
\ \"acc_stderr\": 0.04203921040156279,\n \"acc_norm\": 0.6148148148148148,\n\
\ \"acc_norm_stderr\": 0.04203921040156279\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.6644736842105263,\n \"acc_stderr\": 0.038424985593952694,\n\
\ \"acc_norm\": 0.6644736842105263,\n \"acc_norm_stderr\": 0.038424985593952694\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.61,\n\
\ \"acc_stderr\": 0.04902071300001975,\n \"acc_norm\": 0.61,\n \
\ \"acc_norm_stderr\": 0.04902071300001975\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.720754716981132,\n \"acc_stderr\": 0.027611163402399715,\n\
\ \"acc_norm\": 0.720754716981132,\n \"acc_norm_stderr\": 0.027611163402399715\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7083333333333334,\n\
\ \"acc_stderr\": 0.03800968060554858,\n \"acc_norm\": 0.7083333333333334,\n\
\ \"acc_norm_stderr\": 0.03800968060554858\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.44,\n \"acc_stderr\": 0.04988876515698589,\n \
\ \"acc_norm\": 0.44,\n \"acc_norm_stderr\": 0.04988876515698589\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.52,\n \"acc_stderr\": 0.050211673156867795,\n \"acc_norm\": 0.52,\n\
\ \"acc_norm_stderr\": 0.050211673156867795\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.38,\n \"acc_stderr\": 0.04878317312145632,\n \
\ \"acc_norm\": 0.38,\n \"acc_norm_stderr\": 0.04878317312145632\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6069364161849711,\n\
\ \"acc_stderr\": 0.03724249595817731,\n \"acc_norm\": 0.6069364161849711,\n\
\ \"acc_norm_stderr\": 0.03724249595817731\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.39215686274509803,\n \"acc_stderr\": 0.04858083574266345,\n\
\ \"acc_norm\": 0.39215686274509803,\n \"acc_norm_stderr\": 0.04858083574266345\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.77,\n \"acc_stderr\": 0.042295258468165065,\n \"acc_norm\": 0.77,\n\
\ \"acc_norm_stderr\": 0.042295258468165065\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.5617021276595745,\n \"acc_stderr\": 0.03243618636108101,\n\
\ \"acc_norm\": 0.5617021276595745,\n \"acc_norm_stderr\": 0.03243618636108101\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.4298245614035088,\n\
\ \"acc_stderr\": 0.04657047260594963,\n \"acc_norm\": 0.4298245614035088,\n\
\ \"acc_norm_stderr\": 0.04657047260594963\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.6137931034482759,\n \"acc_stderr\": 0.04057324734419035,\n\
\ \"acc_norm\": 0.6137931034482759,\n \"acc_norm_stderr\": 0.04057324734419035\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.40476190476190477,\n \"acc_stderr\": 0.025279850397404904,\n \"\
acc_norm\": 0.40476190476190477,\n \"acc_norm_stderr\": 0.025279850397404904\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.4126984126984127,\n\
\ \"acc_stderr\": 0.04403438954768176,\n \"acc_norm\": 0.4126984126984127,\n\
\ \"acc_norm_stderr\": 0.04403438954768176\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.38,\n \"acc_stderr\": 0.04878317312145632,\n \
\ \"acc_norm\": 0.38,\n \"acc_norm_stderr\": 0.04878317312145632\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.6709677419354839,\n\
\ \"acc_stderr\": 0.026729499068349958,\n \"acc_norm\": 0.6709677419354839,\n\
\ \"acc_norm_stderr\": 0.026729499068349958\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.5172413793103449,\n \"acc_stderr\": 0.035158955511656986,\n\
\ \"acc_norm\": 0.5172413793103449,\n \"acc_norm_stderr\": 0.035158955511656986\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.65,\n \"acc_stderr\": 0.04793724854411019,\n \"acc_norm\"\
: 0.65,\n \"acc_norm_stderr\": 0.04793724854411019\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7393939393939394,\n \"acc_stderr\": 0.034277431758165236,\n\
\ \"acc_norm\": 0.7393939393939394,\n \"acc_norm_stderr\": 0.034277431758165236\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.7777777777777778,\n \"acc_stderr\": 0.02962022787479049,\n \"\
acc_norm\": 0.7777777777777778,\n \"acc_norm_stderr\": 0.02962022787479049\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.8704663212435233,\n \"acc_stderr\": 0.024233532297758723,\n\
\ \"acc_norm\": 0.8704663212435233,\n \"acc_norm_stderr\": 0.024233532297758723\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.617948717948718,\n \"acc_stderr\": 0.024635549163908234,\n \
\ \"acc_norm\": 0.617948717948718,\n \"acc_norm_stderr\": 0.024635549163908234\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.3148148148148148,\n \"acc_stderr\": 0.02831753349606649,\n \
\ \"acc_norm\": 0.3148148148148148,\n \"acc_norm_stderr\": 0.02831753349606649\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.6554621848739496,\n \"acc_stderr\": 0.030868682604121622,\n\
\ \"acc_norm\": 0.6554621848739496,\n \"acc_norm_stderr\": 0.030868682604121622\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.3708609271523179,\n \"acc_stderr\": 0.03943966699183629,\n \"\
acc_norm\": 0.3708609271523179,\n \"acc_norm_stderr\": 0.03943966699183629\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8128440366972477,\n \"acc_stderr\": 0.016722684526200144,\n \"\
acc_norm\": 0.8128440366972477,\n \"acc_norm_stderr\": 0.016722684526200144\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.48148148148148145,\n \"acc_stderr\": 0.03407632093854053,\n \"\
acc_norm\": 0.48148148148148145,\n \"acc_norm_stderr\": 0.03407632093854053\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.7745098039215687,\n \"acc_stderr\": 0.029331162294251735,\n \"\
acc_norm\": 0.7745098039215687,\n \"acc_norm_stderr\": 0.029331162294251735\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.7763713080168776,\n \"acc_stderr\": 0.027123298205229966,\n \
\ \"acc_norm\": 0.7763713080168776,\n \"acc_norm_stderr\": 0.027123298205229966\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6681614349775785,\n\
\ \"acc_stderr\": 0.03160295143776679,\n \"acc_norm\": 0.6681614349775785,\n\
\ \"acc_norm_stderr\": 0.03160295143776679\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.732824427480916,\n \"acc_stderr\": 0.038808483010823944,\n\
\ \"acc_norm\": 0.732824427480916,\n \"acc_norm_stderr\": 0.038808483010823944\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.8264462809917356,\n \"acc_stderr\": 0.0345727283691767,\n \"acc_norm\"\
: 0.8264462809917356,\n \"acc_norm_stderr\": 0.0345727283691767\n },\n\
\ \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7777777777777778,\n\
\ \"acc_stderr\": 0.040191074725573483,\n \"acc_norm\": 0.7777777777777778,\n\
\ \"acc_norm_stderr\": 0.040191074725573483\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7361963190184049,\n \"acc_stderr\": 0.03462419931615623,\n\
\ \"acc_norm\": 0.7361963190184049,\n \"acc_norm_stderr\": 0.03462419931615623\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.48214285714285715,\n\
\ \"acc_stderr\": 0.047427623612430116,\n \"acc_norm\": 0.48214285714285715,\n\
\ \"acc_norm_stderr\": 0.047427623612430116\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7281553398058253,\n \"acc_stderr\": 0.044052680241409216,\n\
\ \"acc_norm\": 0.7281553398058253,\n \"acc_norm_stderr\": 0.044052680241409216\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.9017094017094017,\n\
\ \"acc_stderr\": 0.019503444900757567,\n \"acc_norm\": 0.9017094017094017,\n\
\ \"acc_norm_stderr\": 0.019503444900757567\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.72,\n \"acc_stderr\": 0.045126085985421276,\n \
\ \"acc_norm\": 0.72,\n \"acc_norm_stderr\": 0.045126085985421276\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.80970625798212,\n\
\ \"acc_stderr\": 0.0140369458503814,\n \"acc_norm\": 0.80970625798212,\n\
\ \"acc_norm_stderr\": 0.0140369458503814\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.6878612716763006,\n \"acc_stderr\": 0.024946792225272314,\n\
\ \"acc_norm\": 0.6878612716763006,\n \"acc_norm_stderr\": 0.024946792225272314\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.39217877094972065,\n\
\ \"acc_stderr\": 0.01632906107320745,\n \"acc_norm\": 0.39217877094972065,\n\
\ \"acc_norm_stderr\": 0.01632906107320745\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.7581699346405228,\n \"acc_stderr\": 0.024518195641879334,\n\
\ \"acc_norm\": 0.7581699346405228,\n \"acc_norm_stderr\": 0.024518195641879334\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.707395498392283,\n\
\ \"acc_stderr\": 0.025839898334877983,\n \"acc_norm\": 0.707395498392283,\n\
\ \"acc_norm_stderr\": 0.025839898334877983\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.6944444444444444,\n \"acc_stderr\": 0.025630824975621344,\n\
\ \"acc_norm\": 0.6944444444444444,\n \"acc_norm_stderr\": 0.025630824975621344\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.46099290780141844,\n \"acc_stderr\": 0.02973659252642444,\n \
\ \"acc_norm\": 0.46099290780141844,\n \"acc_norm_stderr\": 0.02973659252642444\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4335071707953064,\n\
\ \"acc_stderr\": 0.012656810383983965,\n \"acc_norm\": 0.4335071707953064,\n\
\ \"acc_norm_stderr\": 0.012656810383983965\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.6470588235294118,\n \"acc_stderr\": 0.0290294228156814,\n\
\ \"acc_norm\": 0.6470588235294118,\n \"acc_norm_stderr\": 0.0290294228156814\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6437908496732027,\n \"acc_stderr\": 0.0193733324207245,\n \
\ \"acc_norm\": 0.6437908496732027,\n \"acc_norm_stderr\": 0.0193733324207245\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6909090909090909,\n\
\ \"acc_stderr\": 0.044262946482000985,\n \"acc_norm\": 0.6909090909090909,\n\
\ \"acc_norm_stderr\": 0.044262946482000985\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.7387755102040816,\n \"acc_stderr\": 0.028123429335142783,\n\
\ \"acc_norm\": 0.7387755102040816,\n \"acc_norm_stderr\": 0.028123429335142783\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.6666666666666666,\n\
\ \"acc_stderr\": 0.03333333333333335,\n \"acc_norm\": 0.6666666666666666,\n\
\ \"acc_norm_stderr\": 0.03333333333333335\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.85,\n \"acc_stderr\": 0.03588702812826368,\n \
\ \"acc_norm\": 0.85,\n \"acc_norm_stderr\": 0.03588702812826368\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5180722891566265,\n\
\ \"acc_stderr\": 0.03889951252827216,\n \"acc_norm\": 0.5180722891566265,\n\
\ \"acc_norm_stderr\": 0.03889951252827216\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8538011695906432,\n \"acc_stderr\": 0.027097290118070806,\n\
\ \"acc_norm\": 0.8538011695906432,\n \"acc_norm_stderr\": 0.027097290118070806\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.4039167686658507,\n\
\ \"mc1_stderr\": 0.01717727682258428,\n \"mc2\": 0.5687346538981385,\n\
\ \"mc2_stderr\": 0.015485344488808075\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7971586424625099,\n \"acc_stderr\": 0.011301439925936652\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.5610310841546626,\n \
\ \"acc_stderr\": 0.013669500369036204\n }\n}\n```"
repo_url: https://huggingface.co/diffnamehard/Mistral-CatMacaroni-slerp-uncensored
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_12_29T10_24_03.109443
path:
- '**/details_harness|arc:challenge|25_2023-12-29T10-24-03.109443.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-12-29T10-24-03.109443.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_12_29T10_24_03.109443
path:
- '**/details_harness|gsm8k|5_2023-12-29T10-24-03.109443.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-12-29T10-24-03.109443.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_12_29T10_24_03.109443
path:
- '**/details_harness|hellaswag|10_2023-12-29T10-24-03.109443.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-12-29T10-24-03.109443.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_12_29T10_24_03.109443
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-29T10-24-03.109443.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-29T10-24-03.109443.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-29T10-24-03.109443.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-29T10-24-03.109443.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-29T10-24-03.109443.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-29T10-24-03.109443.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-29T10-24-03.109443.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-29T10-24-03.109443.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-29T10-24-03.109443.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-29T10-24-03.109443.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-29T10-24-03.109443.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-29T10-24-03.109443.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-29T10-24-03.109443.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-29T10-24-03.109443.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-29T10-24-03.109443.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-29T10-24-03.109443.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-29T10-24-03.109443.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-29T10-24-03.109443.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-29T10-24-03.109443.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-29T10-24-03.109443.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-29T10-24-03.109443.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-29T10-24-03.109443.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-29T10-24-03.109443.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-29T10-24-03.109443.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-29T10-24-03.109443.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-29T10-24-03.109443.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-29T10-24-03.109443.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-29T10-24-03.109443.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-29T10-24-03.109443.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-29T10-24-03.109443.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-29T10-24-03.109443.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-29T10-24-03.109443.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-29T10-24-03.109443.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-29T10-24-03.109443.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-12-29T10-24-03.109443.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-29T10-24-03.109443.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-29T10-24-03.109443.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-29T10-24-03.109443.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-12-29T10-24-03.109443.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-12-29T10-24-03.109443.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-29T10-24-03.109443.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-29T10-24-03.109443.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-29T10-24-03.109443.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-29T10-24-03.109443.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-29T10-24-03.109443.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-29T10-24-03.109443.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-29T10-24-03.109443.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-29T10-24-03.109443.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-29T10-24-03.109443.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-29T10-24-03.109443.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-29T10-24-03.109443.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-29T10-24-03.109443.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-29T10-24-03.109443.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-12-29T10-24-03.109443.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-29T10-24-03.109443.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-12-29T10-24-03.109443.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-29T10-24-03.109443.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-29T10-24-03.109443.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-29T10-24-03.109443.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-29T10-24-03.109443.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-29T10-24-03.109443.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-29T10-24-03.109443.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-29T10-24-03.109443.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-29T10-24-03.109443.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-29T10-24-03.109443.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-29T10-24-03.109443.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-29T10-24-03.109443.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-29T10-24-03.109443.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-29T10-24-03.109443.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-29T10-24-03.109443.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-29T10-24-03.109443.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-29T10-24-03.109443.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-29T10-24-03.109443.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-29T10-24-03.109443.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-29T10-24-03.109443.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-29T10-24-03.109443.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-29T10-24-03.109443.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-29T10-24-03.109443.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-29T10-24-03.109443.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-29T10-24-03.109443.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-29T10-24-03.109443.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-29T10-24-03.109443.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-29T10-24-03.109443.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-29T10-24-03.109443.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-29T10-24-03.109443.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-29T10-24-03.109443.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-29T10-24-03.109443.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-29T10-24-03.109443.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-29T10-24-03.109443.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-29T10-24-03.109443.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-29T10-24-03.109443.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-12-29T10-24-03.109443.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-29T10-24-03.109443.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-29T10-24-03.109443.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-29T10-24-03.109443.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-12-29T10-24-03.109443.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-12-29T10-24-03.109443.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-29T10-24-03.109443.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-29T10-24-03.109443.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-29T10-24-03.109443.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-29T10-24-03.109443.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-29T10-24-03.109443.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-29T10-24-03.109443.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-29T10-24-03.109443.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-29T10-24-03.109443.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-29T10-24-03.109443.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-29T10-24-03.109443.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-29T10-24-03.109443.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-29T10-24-03.109443.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-29T10-24-03.109443.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-12-29T10-24-03.109443.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-29T10-24-03.109443.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-12-29T10-24-03.109443.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-29T10-24-03.109443.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_12_29T10_24_03.109443
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-29T10-24-03.109443.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-29T10-24-03.109443.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_12_29T10_24_03.109443
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-29T10-24-03.109443.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-29T10-24-03.109443.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_12_29T10_24_03.109443
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-29T10-24-03.109443.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-29T10-24-03.109443.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_12_29T10_24_03.109443
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-29T10-24-03.109443.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-29T10-24-03.109443.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_12_29T10_24_03.109443
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-29T10-24-03.109443.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-29T10-24-03.109443.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_12_29T10_24_03.109443
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-29T10-24-03.109443.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-29T10-24-03.109443.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_12_29T10_24_03.109443
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-29T10-24-03.109443.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-29T10-24-03.109443.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_12_29T10_24_03.109443
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-29T10-24-03.109443.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-29T10-24-03.109443.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_12_29T10_24_03.109443
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-29T10-24-03.109443.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-29T10-24-03.109443.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_12_29T10_24_03.109443
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-29T10-24-03.109443.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-29T10-24-03.109443.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_12_29T10_24_03.109443
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-29T10-24-03.109443.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-29T10-24-03.109443.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_12_29T10_24_03.109443
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-29T10-24-03.109443.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-29T10-24-03.109443.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_12_29T10_24_03.109443
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-29T10-24-03.109443.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-29T10-24-03.109443.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_12_29T10_24_03.109443
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-29T10-24-03.109443.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-29T10-24-03.109443.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_12_29T10_24_03.109443
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-29T10-24-03.109443.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-29T10-24-03.109443.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_12_29T10_24_03.109443
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-29T10-24-03.109443.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-29T10-24-03.109443.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_12_29T10_24_03.109443
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-29T10-24-03.109443.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-29T10-24-03.109443.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_12_29T10_24_03.109443
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-29T10-24-03.109443.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-29T10-24-03.109443.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_12_29T10_24_03.109443
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-29T10-24-03.109443.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-29T10-24-03.109443.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_12_29T10_24_03.109443
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-29T10-24-03.109443.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-29T10-24-03.109443.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_12_29T10_24_03.109443
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-29T10-24-03.109443.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-29T10-24-03.109443.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_12_29T10_24_03.109443
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-29T10-24-03.109443.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-29T10-24-03.109443.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_12_29T10_24_03.109443
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-29T10-24-03.109443.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-29T10-24-03.109443.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_12_29T10_24_03.109443
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-29T10-24-03.109443.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-29T10-24-03.109443.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_12_29T10_24_03.109443
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-29T10-24-03.109443.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-29T10-24-03.109443.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_12_29T10_24_03.109443
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-29T10-24-03.109443.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-29T10-24-03.109443.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_12_29T10_24_03.109443
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-29T10-24-03.109443.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-29T10-24-03.109443.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_12_29T10_24_03.109443
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-29T10-24-03.109443.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-29T10-24-03.109443.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_12_29T10_24_03.109443
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-29T10-24-03.109443.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-29T10-24-03.109443.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_12_29T10_24_03.109443
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-29T10-24-03.109443.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-29T10-24-03.109443.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_12_29T10_24_03.109443
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-29T10-24-03.109443.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-29T10-24-03.109443.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_12_29T10_24_03.109443
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-29T10-24-03.109443.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-29T10-24-03.109443.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_12_29T10_24_03.109443
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-29T10-24-03.109443.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-29T10-24-03.109443.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_12_29T10_24_03.109443
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-29T10-24-03.109443.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-29T10-24-03.109443.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_12_29T10_24_03.109443
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-12-29T10-24-03.109443.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-12-29T10-24-03.109443.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_12_29T10_24_03.109443
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-29T10-24-03.109443.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-29T10-24-03.109443.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_12_29T10_24_03.109443
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-29T10-24-03.109443.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-29T10-24-03.109443.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_12_29T10_24_03.109443
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-29T10-24-03.109443.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-29T10-24-03.109443.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_12_29T10_24_03.109443
path:
- '**/details_harness|hendrycksTest-management|5_2023-12-29T10-24-03.109443.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-12-29T10-24-03.109443.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_12_29T10_24_03.109443
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-12-29T10-24-03.109443.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-12-29T10-24-03.109443.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_12_29T10_24_03.109443
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-29T10-24-03.109443.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-29T10-24-03.109443.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_12_29T10_24_03.109443
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-29T10-24-03.109443.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-29T10-24-03.109443.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_12_29T10_24_03.109443
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-29T10-24-03.109443.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-29T10-24-03.109443.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_12_29T10_24_03.109443
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-29T10-24-03.109443.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-29T10-24-03.109443.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_12_29T10_24_03.109443
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-29T10-24-03.109443.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-29T10-24-03.109443.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_12_29T10_24_03.109443
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-29T10-24-03.109443.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-29T10-24-03.109443.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_12_29T10_24_03.109443
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-29T10-24-03.109443.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-29T10-24-03.109443.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_12_29T10_24_03.109443
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-29T10-24-03.109443.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-29T10-24-03.109443.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_12_29T10_24_03.109443
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-29T10-24-03.109443.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-29T10-24-03.109443.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_12_29T10_24_03.109443
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-29T10-24-03.109443.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-29T10-24-03.109443.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_12_29T10_24_03.109443
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-29T10-24-03.109443.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-29T10-24-03.109443.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_12_29T10_24_03.109443
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-29T10-24-03.109443.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-29T10-24-03.109443.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_12_29T10_24_03.109443
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-29T10-24-03.109443.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-29T10-24-03.109443.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_12_29T10_24_03.109443
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-12-29T10-24-03.109443.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-12-29T10-24-03.109443.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_12_29T10_24_03.109443
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-29T10-24-03.109443.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-29T10-24-03.109443.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_12_29T10_24_03.109443
path:
- '**/details_harness|hendrycksTest-virology|5_2023-12-29T10-24-03.109443.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-12-29T10-24-03.109443.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_12_29T10_24_03.109443
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-29T10-24-03.109443.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-29T10-24-03.109443.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_12_29T10_24_03.109443
path:
- '**/details_harness|truthfulqa:mc|0_2023-12-29T10-24-03.109443.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-12-29T10-24-03.109443.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_12_29T10_24_03.109443
path:
- '**/details_harness|winogrande|5_2023-12-29T10-24-03.109443.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-12-29T10-24-03.109443.parquet'
- config_name: results
data_files:
- split: 2023_12_29T10_24_03.109443
path:
- results_2023-12-29T10-24-03.109443.parquet
- split: latest
path:
- results_2023-12-29T10-24-03.109443.parquet
---
# Dataset Card for Evaluation run of diffnamehard/Mistral-CatMacaroni-slerp-uncensored
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [diffnamehard/Mistral-CatMacaroni-slerp-uncensored](https://huggingface.co/diffnamehard/Mistral-CatMacaroni-slerp-uncensored) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_diffnamehard__Mistral-CatMacaroni-slerp-uncensored",
"harness_winogrande_5",
split="train")
```
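The aggregated scores are also exposed through the `results` configuration listed in the config block above. A minimal sketch of loading them, relying only on the `results` config and `latest` split declared there:
```python
from datasets import load_dataset

# Aggregated metrics for the run; the "latest" split always points to
# the most recent evaluation (here 2023-12-29T10:24:03.109443).
results = load_dataset(
    "open-llm-leaderboard/details_diffnamehard__Mistral-CatMacaroni-slerp-uncensored",
    "results",
    split="latest",
)
```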
## Latest results
These are the [latest results from run 2023-12-29T10:24:03.109443](https://huggingface.co/datasets/open-llm-leaderboard/details_diffnamehard__Mistral-CatMacaroni-slerp-uncensored/blob/main/results_2023-12-29T10-24-03.109443.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each of them in the "results" configuration and in the "latest" split of each eval):
```python
{
"all": {
"acc": 0.6281191061942026,
"acc_stderr": 0.03270826089766551,
"acc_norm": 0.6305168591291603,
"acc_norm_stderr": 0.03336918504627038,
"mc1": 0.4039167686658507,
"mc1_stderr": 0.01717727682258428,
"mc2": 0.5687346538981385,
"mc2_stderr": 0.015485344488808075
},
"harness|arc:challenge|25": {
"acc": 0.5998293515358362,
"acc_stderr": 0.014317197787809167,
"acc_norm": 0.6424914675767918,
"acc_norm_stderr": 0.014005494275916576
},
"harness|hellaswag|10": {
"acc": 0.6401115315674168,
"acc_stderr": 0.004789865379084514,
"acc_norm": 0.8408683529177454,
"acc_norm_stderr": 0.0036505121583062755
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.33,
"acc_stderr": 0.04725815626252606,
"acc_norm": 0.33,
"acc_norm_stderr": 0.04725815626252606
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6148148148148148,
"acc_stderr": 0.04203921040156279,
"acc_norm": 0.6148148148148148,
"acc_norm_stderr": 0.04203921040156279
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6644736842105263,
"acc_stderr": 0.038424985593952694,
"acc_norm": 0.6644736842105263,
"acc_norm_stderr": 0.038424985593952694
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.61,
"acc_stderr": 0.04902071300001975,
"acc_norm": 0.61,
"acc_norm_stderr": 0.04902071300001975
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.720754716981132,
"acc_stderr": 0.027611163402399715,
"acc_norm": 0.720754716981132,
"acc_norm_stderr": 0.027611163402399715
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7083333333333334,
"acc_stderr": 0.03800968060554858,
"acc_norm": 0.7083333333333334,
"acc_norm_stderr": 0.03800968060554858
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.44,
"acc_stderr": 0.04988876515698589,
"acc_norm": 0.44,
"acc_norm_stderr": 0.04988876515698589
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.52,
"acc_stderr": 0.050211673156867795,
"acc_norm": 0.52,
"acc_norm_stderr": 0.050211673156867795
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.38,
"acc_stderr": 0.04878317312145632,
"acc_norm": 0.38,
"acc_norm_stderr": 0.04878317312145632
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6069364161849711,
"acc_stderr": 0.03724249595817731,
"acc_norm": 0.6069364161849711,
"acc_norm_stderr": 0.03724249595817731
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.39215686274509803,
"acc_stderr": 0.04858083574266345,
"acc_norm": 0.39215686274509803,
"acc_norm_stderr": 0.04858083574266345
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.77,
"acc_stderr": 0.042295258468165065,
"acc_norm": 0.77,
"acc_norm_stderr": 0.042295258468165065
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5617021276595745,
"acc_stderr": 0.03243618636108101,
"acc_norm": 0.5617021276595745,
"acc_norm_stderr": 0.03243618636108101
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.4298245614035088,
"acc_stderr": 0.04657047260594963,
"acc_norm": 0.4298245614035088,
"acc_norm_stderr": 0.04657047260594963
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.6137931034482759,
"acc_stderr": 0.04057324734419035,
"acc_norm": 0.6137931034482759,
"acc_norm_stderr": 0.04057324734419035
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.40476190476190477,
"acc_stderr": 0.025279850397404904,
"acc_norm": 0.40476190476190477,
"acc_norm_stderr": 0.025279850397404904
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.4126984126984127,
"acc_stderr": 0.04403438954768176,
"acc_norm": 0.4126984126984127,
"acc_norm_stderr": 0.04403438954768176
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.38,
"acc_stderr": 0.04878317312145632,
"acc_norm": 0.38,
"acc_norm_stderr": 0.04878317312145632
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.6709677419354839,
"acc_stderr": 0.026729499068349958,
"acc_norm": 0.6709677419354839,
"acc_norm_stderr": 0.026729499068349958
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5172413793103449,
"acc_stderr": 0.035158955511656986,
"acc_norm": 0.5172413793103449,
"acc_norm_stderr": 0.035158955511656986
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.65,
"acc_stderr": 0.04793724854411019,
"acc_norm": 0.65,
"acc_norm_stderr": 0.04793724854411019
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7393939393939394,
"acc_stderr": 0.034277431758165236,
"acc_norm": 0.7393939393939394,
"acc_norm_stderr": 0.034277431758165236
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7777777777777778,
"acc_stderr": 0.02962022787479049,
"acc_norm": 0.7777777777777778,
"acc_norm_stderr": 0.02962022787479049
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8704663212435233,
"acc_stderr": 0.024233532297758723,
"acc_norm": 0.8704663212435233,
"acc_norm_stderr": 0.024233532297758723
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.617948717948718,
"acc_stderr": 0.024635549163908234,
"acc_norm": 0.617948717948718,
"acc_norm_stderr": 0.024635549163908234
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.3148148148148148,
"acc_stderr": 0.02831753349606649,
"acc_norm": 0.3148148148148148,
"acc_norm_stderr": 0.02831753349606649
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6554621848739496,
"acc_stderr": 0.030868682604121622,
"acc_norm": 0.6554621848739496,
"acc_norm_stderr": 0.030868682604121622
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.3708609271523179,
"acc_stderr": 0.03943966699183629,
"acc_norm": 0.3708609271523179,
"acc_norm_stderr": 0.03943966699183629
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8128440366972477,
"acc_stderr": 0.016722684526200144,
"acc_norm": 0.8128440366972477,
"acc_norm_stderr": 0.016722684526200144
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.48148148148148145,
"acc_stderr": 0.03407632093854053,
"acc_norm": 0.48148148148148145,
"acc_norm_stderr": 0.03407632093854053
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7745098039215687,
"acc_stderr": 0.029331162294251735,
"acc_norm": 0.7745098039215687,
"acc_norm_stderr": 0.029331162294251735
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7763713080168776,
"acc_stderr": 0.027123298205229966,
"acc_norm": 0.7763713080168776,
"acc_norm_stderr": 0.027123298205229966
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6681614349775785,
"acc_stderr": 0.03160295143776679,
"acc_norm": 0.6681614349775785,
"acc_norm_stderr": 0.03160295143776679
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.732824427480916,
"acc_stderr": 0.038808483010823944,
"acc_norm": 0.732824427480916,
"acc_norm_stderr": 0.038808483010823944
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.8264462809917356,
"acc_stderr": 0.0345727283691767,
"acc_norm": 0.8264462809917356,
"acc_norm_stderr": 0.0345727283691767
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7777777777777778,
"acc_stderr": 0.040191074725573483,
"acc_norm": 0.7777777777777778,
"acc_norm_stderr": 0.040191074725573483
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7361963190184049,
"acc_stderr": 0.03462419931615623,
"acc_norm": 0.7361963190184049,
"acc_norm_stderr": 0.03462419931615623
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.48214285714285715,
"acc_stderr": 0.047427623612430116,
"acc_norm": 0.48214285714285715,
"acc_norm_stderr": 0.047427623612430116
},
"harness|hendrycksTest-management|5": {
"acc": 0.7281553398058253,
"acc_stderr": 0.044052680241409216,
"acc_norm": 0.7281553398058253,
"acc_norm_stderr": 0.044052680241409216
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.9017094017094017,
"acc_stderr": 0.019503444900757567,
"acc_norm": 0.9017094017094017,
"acc_norm_stderr": 0.019503444900757567
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.72,
"acc_stderr": 0.045126085985421276,
"acc_norm": 0.72,
"acc_norm_stderr": 0.045126085985421276
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.80970625798212,
"acc_stderr": 0.0140369458503814,
"acc_norm": 0.80970625798212,
"acc_norm_stderr": 0.0140369458503814
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.6878612716763006,
"acc_stderr": 0.024946792225272314,
"acc_norm": 0.6878612716763006,
"acc_norm_stderr": 0.024946792225272314
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.39217877094972065,
"acc_stderr": 0.01632906107320745,
"acc_norm": 0.39217877094972065,
"acc_norm_stderr": 0.01632906107320745
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7581699346405228,
"acc_stderr": 0.024518195641879334,
"acc_norm": 0.7581699346405228,
"acc_norm_stderr": 0.024518195641879334
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.707395498392283,
"acc_stderr": 0.025839898334877983,
"acc_norm": 0.707395498392283,
"acc_norm_stderr": 0.025839898334877983
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.6944444444444444,
"acc_stderr": 0.025630824975621344,
"acc_norm": 0.6944444444444444,
"acc_norm_stderr": 0.025630824975621344
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.46099290780141844,
"acc_stderr": 0.02973659252642444,
"acc_norm": 0.46099290780141844,
"acc_norm_stderr": 0.02973659252642444
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4335071707953064,
"acc_stderr": 0.012656810383983965,
"acc_norm": 0.4335071707953064,
"acc_norm_stderr": 0.012656810383983965
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6470588235294118,
"acc_stderr": 0.0290294228156814,
"acc_norm": 0.6470588235294118,
"acc_norm_stderr": 0.0290294228156814
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6437908496732027,
"acc_stderr": 0.0193733324207245,
"acc_norm": 0.6437908496732027,
"acc_norm_stderr": 0.0193733324207245
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6909090909090909,
"acc_stderr": 0.044262946482000985,
"acc_norm": 0.6909090909090909,
"acc_norm_stderr": 0.044262946482000985
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7387755102040816,
"acc_stderr": 0.028123429335142783,
"acc_norm": 0.7387755102040816,
"acc_norm_stderr": 0.028123429335142783
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.6666666666666666,
"acc_stderr": 0.03333333333333335,
"acc_norm": 0.6666666666666666,
"acc_norm_stderr": 0.03333333333333335
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.85,
"acc_stderr": 0.03588702812826368,
"acc_norm": 0.85,
"acc_norm_stderr": 0.03588702812826368
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5180722891566265,
"acc_stderr": 0.03889951252827216,
"acc_norm": 0.5180722891566265,
"acc_norm_stderr": 0.03889951252827216
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8538011695906432,
"acc_stderr": 0.027097290118070806,
"acc_norm": 0.8538011695906432,
"acc_norm_stderr": 0.027097290118070806
},
"harness|truthfulqa:mc|0": {
"mc1": 0.4039167686658507,
"mc1_stderr": 0.01717727682258428,
"mc2": 0.5687346538981385,
"mc2_stderr": 0.015485344488808075
},
"harness|winogrande|5": {
"acc": 0.7971586424625099,
"acc_stderr": 0.011301439925936652
},
"harness|gsm8k|5": {
"acc": 0.5610310841546626,
"acc_stderr": 0.013669500369036204
}
}
```
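The same numbers can be pulled programmatically from the linked JSON file. A sketch using `huggingface_hub` (the filename matches the run timestamp above; the top-level JSON layout is not asserted here, only printed):
```python
import json

from huggingface_hub import hf_hub_download

# Download the raw results JSON for the 2023-12-29 run.
path = hf_hub_download(
    repo_id="open-llm-leaderboard/details_diffnamehard__Mistral-CatMacaroni-slerp-uncensored",
    filename="results_2023-12-29T10-24-03.109443.json",
    repo_type="dataset",
)
with open(path) as f:
    results = json.load(f)

# Inspect the top-level keys before indexing into the structure.
print(list(results.keys()))
```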
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
avsolatorio/medi-data | ---
dataset_info:
features:
- name: query
dtype: string
- name: pos
dtype: string
- name: neg
dtype: string
- name: task_name
dtype: string
- name: query_instruct
dtype: string
- name: pos_instruct
dtype: string
- name: neg_instruct
dtype: string
splits:
- name: train
num_bytes: 2555303114
num_examples: 1435000
download_size: 1231001259
dataset_size: 2555303114
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# MEDI dataset
This dataset was used in the paper "GISTEmbed: Guided In-sample Selection of Training Negatives for Text Embedding Fine-tuning". Refer to https://arxiv.org/abs/2402.16829 for details.
The original dataset comes from the paper "One Embedder, Any Task: Instruction-Finetuned Text Embeddings" (https://arxiv.org/abs/2212.09741), which was used to train the INSTRUCTOR family of models (GitHub: https://github.com/xlang-ai/instructor-embedding).
The code for processing and publishing the raw data to HuggingFace Hub is available at https://github.com/avsolatorio/GISTEmbed.
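As a quick sanity check, the triples and their instruction prefixes can be inspected directly. A minimal sketch; the field names follow the `dataset_info` block above:
```python
from datasets import load_dataset

# Each row is a (query, pos, neg) triple plus per-field instruction
# prefixes and the originating task name.
medi = load_dataset("avsolatorio/medi-data", split="train")
row = medi[0]
print(row["task_name"])
print(row["query_instruct"], "|", row["query"])
print(row["pos_instruct"], "|", row["pos"][:100])  # truncated for display
```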
## Citation
**GISTEmbed**
```
@article{solatorio2024gistembed,
title={GISTEmbed: Guided In-sample Selection of Training Negatives for Text Embedding Fine-tuning},
author={Aivin V. Solatorio},
journal={arXiv preprint arXiv:2402.16829},
year={2024},
  url={https://arxiv.org/abs/2402.16829},
  eprint={2402.16829},
archivePrefix={arXiv},
primaryClass={cs.LG}
}
```
**INSTRUCTOR**
```
@inproceedings{INSTRUCTOR,
title={One Embedder, Any Task: Instruction-Finetuned Text Embeddings},
author={Su, Hongjin and Shi, Weijia and Kasai, Jungo and Wang, Yizhong and Hu, Yushi and Ostendorf, Mari and Yih, Wen-tau and Smith, Noah A. and Zettlemoyer, Luke and Yu, Tao},
url={https://arxiv.org/abs/2212.09741},
year={2022},
}
``` |
clarin-knext/arguana-pl-qrels | ---
language:
- pl
---
Part of **BEIR-PL: Zero Shot Information Retrieval Benchmark for the Polish Language**.
Link to arxiv: https://arxiv.org/pdf/2305.19840.pdf
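A minimal loading sketch (the card does not document configs or splits, so this just materializes whatever the default configuration exposes):
```python
from datasets import load_dataset

# Split names are not documented in this card; print the DatasetDict
# to discover them before indexing into a split.
qrels = load_dataset("clarin-knext/arguana-pl-qrels")
print(qrels)
```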
Contact: konrad.wojtasik@pwr.edu.pl |
VivendoDigital/belebele-chat-ita | ---
license: apache-2.0
dataset_info:
features:
- name: chat
list:
- name: content
dtype: string
- name: role
dtype: string
- name: formatted_chat
dtype: string
splits:
- name: train
num_bytes: 1297382
num_examples: 900
download_size: 498235
dataset_size: 1297382
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
Hantao/ChemReactionImageRE | ---
license: gpl-3.0
---
|
adityarra07/sollingen_data | ---
dataset_info:
features:
- name: id
dtype: string
- name: audio
dtype:
audio:
sampling_rate: 16000
- name: transcription
dtype: string
splits:
- name: train
num_bytes: 1174246362.25
num_examples: 4638
download_size: 1167082408
dataset_size: 1174246362.25
---
# Dataset Card for "sollingen_data"
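A minimal sketch of inspecting an example, based on the `dataset_info` above (16 kHz audio plus transcription):
```python
from datasets import load_dataset

# The audio column decodes to a dict with "array", "path" and
# "sampling_rate" (16000 per the dataset_info).
ds = load_dataset("adityarra07/sollingen_data", split="train")
sample = ds[0]
print(sample["id"], sample["transcription"])
print(sample["audio"]["sampling_rate"])
```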
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Gummybear05/E10_Yfreq_speed | ---
dataset_info:
features:
- name: audio
struct:
- name: array
sequence: float64
- name: path
dtype: string
- name: sample_rate
dtype: int64
- name: text
dtype: string
- name: scriptId
dtype: int64
- name: fileNm
dtype: string
- name: recrdTime
dtype: float64
- name: recrdQuality
dtype: int64
- name: recrdDt
dtype: string
- name: scriptSetNo
dtype: string
- name: recrdEnvrn
dtype: string
- name: colctUnitCode
dtype: string
- name: cityCode
dtype: string
- name: recrdUnit
dtype: string
- name: convrsThema
dtype: string
- name: gender
dtype: string
- name: recorderId
dtype: string
- name: age
dtype: int64
splits:
- name: train
num_bytes: 11044951918
num_examples: 12401
download_size: 7866642397
dataset_size: 11044951918
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|