datasetId | card |
|---|---|
open-llm-leaderboard/details_Locutusque__TinyMistral-248M-Instruct | ---
pretty_name: Evaluation run of Locutusque/TinyMistral-248M-Instruct
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [Locutusque/TinyMistral-248M-Instruct](https://huggingface.co/Locutusque/TinyMistral-248M-Instruct)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Locutusque__TinyMistral-248M-Instruct\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-12-06T16:40:16.358250](https://huggingface.co/datasets/open-llm-leaderboard/details_Locutusque__TinyMistral-248M-Instruct/blob/main/results_2023-12-06T16-40-16.358250.json) (note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each one in the results and in the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.251231011993023,\n\
\ \"acc_stderr\": 0.030796208549621222,\n \"acc_norm\": 0.252037607935898,\n\
\ \"acc_norm_stderr\": 0.03161677046697385,\n \"mc1\": 0.22399020807833536,\n\
\ \"mc1_stderr\": 0.014594964329474203,\n \"mc2\": 0.419357246718368,\n\
\ \"mc2_stderr\": 0.015180505292617188\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.19965870307167236,\n \"acc_stderr\": 0.011681625756888674,\n\
\ \"acc_norm\": 0.2431740614334471,\n \"acc_norm_stderr\": 0.012536554144587089\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.27165903206532566,\n\
\ \"acc_stderr\": 0.004439059440526251,\n \"acc_norm\": 0.27524397530372435,\n\
\ \"acc_norm_stderr\": 0.004457243336616505\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.25,\n \"acc_stderr\": 0.04351941398892446,\n \
\ \"acc_norm\": 0.25,\n \"acc_norm_stderr\": 0.04351941398892446\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.24444444444444444,\n\
\ \"acc_stderr\": 0.03712537833614865,\n \"acc_norm\": 0.24444444444444444,\n\
\ \"acc_norm_stderr\": 0.03712537833614865\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.2565789473684211,\n \"acc_stderr\": 0.035541803680256896,\n\
\ \"acc_norm\": 0.2565789473684211,\n \"acc_norm_stderr\": 0.035541803680256896\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.19,\n\
\ \"acc_stderr\": 0.03942772444036623,\n \"acc_norm\": 0.19,\n \
\ \"acc_norm_stderr\": 0.03942772444036623\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.27169811320754716,\n \"acc_stderr\": 0.027377706624670713,\n\
\ \"acc_norm\": 0.27169811320754716,\n \"acc_norm_stderr\": 0.027377706624670713\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.2708333333333333,\n\
\ \"acc_stderr\": 0.03716177437566017,\n \"acc_norm\": 0.2708333333333333,\n\
\ \"acc_norm_stderr\": 0.03716177437566017\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.26,\n \"acc_stderr\": 0.04408440022768081,\n \
\ \"acc_norm\": 0.26,\n \"acc_norm_stderr\": 0.04408440022768081\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.22,\n \"acc_stderr\": 0.0416333199893227,\n \"acc_norm\": 0.22,\n\
\ \"acc_norm_stderr\": 0.0416333199893227\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.24,\n \"acc_stderr\": 0.042923469599092816,\n \
\ \"acc_norm\": 0.24,\n \"acc_norm_stderr\": 0.042923469599092816\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.2254335260115607,\n\
\ \"acc_stderr\": 0.03186209851641143,\n \"acc_norm\": 0.2254335260115607,\n\
\ \"acc_norm_stderr\": 0.03186209851641143\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.23529411764705882,\n \"acc_stderr\": 0.042207736591714506,\n\
\ \"acc_norm\": 0.23529411764705882,\n \"acc_norm_stderr\": 0.042207736591714506\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.25,\n \"acc_stderr\": 0.04351941398892446,\n \"acc_norm\": 0.25,\n\
\ \"acc_norm_stderr\": 0.04351941398892446\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.3148936170212766,\n \"acc_stderr\": 0.030363582197238167,\n\
\ \"acc_norm\": 0.3148936170212766,\n \"acc_norm_stderr\": 0.030363582197238167\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.2982456140350877,\n\
\ \"acc_stderr\": 0.04303684033537315,\n \"acc_norm\": 0.2982456140350877,\n\
\ \"acc_norm_stderr\": 0.04303684033537315\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.22758620689655173,\n \"acc_stderr\": 0.03493950380131184,\n\
\ \"acc_norm\": 0.22758620689655173,\n \"acc_norm_stderr\": 0.03493950380131184\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.24603174603174602,\n \"acc_stderr\": 0.022182037202948368,\n \"\
acc_norm\": 0.24603174603174602,\n \"acc_norm_stderr\": 0.022182037202948368\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.21428571428571427,\n\
\ \"acc_stderr\": 0.03670066451047181,\n \"acc_norm\": 0.21428571428571427,\n\
\ \"acc_norm_stderr\": 0.03670066451047181\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \
\ \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.27419354838709675,\n\
\ \"acc_stderr\": 0.025378139970885203,\n \"acc_norm\": 0.27419354838709675,\n\
\ \"acc_norm_stderr\": 0.025378139970885203\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.2955665024630542,\n \"acc_stderr\": 0.032104944337514575,\n\
\ \"acc_norm\": 0.2955665024630542,\n \"acc_norm_stderr\": 0.032104944337514575\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.32,\n \"acc_stderr\": 0.046882617226215034,\n \"acc_norm\"\
: 0.32,\n \"acc_norm_stderr\": 0.046882617226215034\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.26666666666666666,\n \"acc_stderr\": 0.034531318018854146,\n\
\ \"acc_norm\": 0.26666666666666666,\n \"acc_norm_stderr\": 0.034531318018854146\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.2474747474747475,\n \"acc_stderr\": 0.030746300742124505,\n \"\
acc_norm\": 0.2474747474747475,\n \"acc_norm_stderr\": 0.030746300742124505\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.29015544041450775,\n \"acc_stderr\": 0.03275264467791516,\n\
\ \"acc_norm\": 0.29015544041450775,\n \"acc_norm_stderr\": 0.03275264467791516\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.26153846153846155,\n \"acc_stderr\": 0.022282141204204426,\n\
\ \"acc_norm\": 0.26153846153846155,\n \"acc_norm_stderr\": 0.022282141204204426\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.2518518518518518,\n \"acc_stderr\": 0.02646611753895991,\n \
\ \"acc_norm\": 0.2518518518518518,\n \"acc_norm_stderr\": 0.02646611753895991\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.22268907563025211,\n \"acc_stderr\": 0.027025433498882367,\n\
\ \"acc_norm\": 0.22268907563025211,\n \"acc_norm_stderr\": 0.027025433498882367\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.2119205298013245,\n \"acc_stderr\": 0.033367670865679766,\n \"\
acc_norm\": 0.2119205298013245,\n \"acc_norm_stderr\": 0.033367670865679766\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.26422018348623855,\n \"acc_stderr\": 0.018904164171510196,\n \"\
acc_norm\": 0.26422018348623855,\n \"acc_norm_stderr\": 0.018904164171510196\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.2222222222222222,\n \"acc_stderr\": 0.028353212866863448,\n \"\
acc_norm\": 0.2222222222222222,\n \"acc_norm_stderr\": 0.028353212866863448\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.29411764705882354,\n \"acc_stderr\": 0.03198001660115069,\n \"\
acc_norm\": 0.29411764705882354,\n \"acc_norm_stderr\": 0.03198001660115069\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.29535864978902954,\n \"acc_stderr\": 0.0296963387134229,\n \
\ \"acc_norm\": 0.29535864978902954,\n \"acc_norm_stderr\": 0.0296963387134229\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.2600896860986547,\n\
\ \"acc_stderr\": 0.029442495585857476,\n \"acc_norm\": 0.2600896860986547,\n\
\ \"acc_norm_stderr\": 0.029442495585857476\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.20610687022900764,\n \"acc_stderr\": 0.03547771004159462,\n\
\ \"acc_norm\": 0.20610687022900764,\n \"acc_norm_stderr\": 0.03547771004159462\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.19008264462809918,\n \"acc_stderr\": 0.03581796951709282,\n \"\
acc_norm\": 0.19008264462809918,\n \"acc_norm_stderr\": 0.03581796951709282\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.2777777777777778,\n\
\ \"acc_stderr\": 0.04330043749650742,\n \"acc_norm\": 0.2777777777777778,\n\
\ \"acc_norm_stderr\": 0.04330043749650742\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.2331288343558282,\n \"acc_stderr\": 0.0332201579577674,\n\
\ \"acc_norm\": 0.2331288343558282,\n \"acc_norm_stderr\": 0.0332201579577674\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.30357142857142855,\n\
\ \"acc_stderr\": 0.04364226155841044,\n \"acc_norm\": 0.30357142857142855,\n\
\ \"acc_norm_stderr\": 0.04364226155841044\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.3106796116504854,\n \"acc_stderr\": 0.04582124160161549,\n\
\ \"acc_norm\": 0.3106796116504854,\n \"acc_norm_stderr\": 0.04582124160161549\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.24786324786324787,\n\
\ \"acc_stderr\": 0.0282863240755644,\n \"acc_norm\": 0.24786324786324787,\n\
\ \"acc_norm_stderr\": 0.0282863240755644\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.25,\n \"acc_stderr\": 0.04351941398892446,\n \
\ \"acc_norm\": 0.25,\n \"acc_norm_stderr\": 0.04351941398892446\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.29118773946360155,\n\
\ \"acc_stderr\": 0.01624608706970139,\n \"acc_norm\": 0.29118773946360155,\n\
\ \"acc_norm_stderr\": 0.01624608706970139\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.1994219653179191,\n \"acc_stderr\": 0.02151190065425255,\n\
\ \"acc_norm\": 0.1994219653179191,\n \"acc_norm_stderr\": 0.02151190065425255\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.24692737430167597,\n\
\ \"acc_stderr\": 0.014422292204808835,\n \"acc_norm\": 0.24692737430167597,\n\
\ \"acc_norm_stderr\": 0.014422292204808835\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.27124183006535946,\n \"acc_stderr\": 0.025457756696667895,\n\
\ \"acc_norm\": 0.27124183006535946,\n \"acc_norm_stderr\": 0.025457756696667895\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.21864951768488747,\n\
\ \"acc_stderr\": 0.02347558141786111,\n \"acc_norm\": 0.21864951768488747,\n\
\ \"acc_norm_stderr\": 0.02347558141786111\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.25308641975308643,\n \"acc_stderr\": 0.024191808600713,\n\
\ \"acc_norm\": 0.25308641975308643,\n \"acc_norm_stderr\": 0.024191808600713\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.2624113475177305,\n \"acc_stderr\": 0.02624492034984301,\n \
\ \"acc_norm\": 0.2624113475177305,\n \"acc_norm_stderr\": 0.02624492034984301\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.23272490221642764,\n\
\ \"acc_stderr\": 0.010792595553888479,\n \"acc_norm\": 0.23272490221642764,\n\
\ \"acc_norm_stderr\": 0.010792595553888479\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.20588235294117646,\n \"acc_stderr\": 0.02456220431414232,\n\
\ \"acc_norm\": 0.20588235294117646,\n \"acc_norm_stderr\": 0.02456220431414232\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.24183006535947713,\n \"acc_stderr\": 0.017322789207784326,\n \
\ \"acc_norm\": 0.24183006535947713,\n \"acc_norm_stderr\": 0.017322789207784326\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.21818181818181817,\n\
\ \"acc_stderr\": 0.03955932861795833,\n \"acc_norm\": 0.21818181818181817,\n\
\ \"acc_norm_stderr\": 0.03955932861795833\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.24897959183673468,\n \"acc_stderr\": 0.02768297952296023,\n\
\ \"acc_norm\": 0.24897959183673468,\n \"acc_norm_stderr\": 0.02768297952296023\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.263681592039801,\n\
\ \"acc_stderr\": 0.031157150869355568,\n \"acc_norm\": 0.263681592039801,\n\
\ \"acc_norm_stderr\": 0.031157150869355568\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.26,\n \"acc_stderr\": 0.04408440022768078,\n \
\ \"acc_norm\": 0.26,\n \"acc_norm_stderr\": 0.04408440022768078\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.1927710843373494,\n\
\ \"acc_stderr\": 0.03070982405056527,\n \"acc_norm\": 0.1927710843373494,\n\
\ \"acc_norm_stderr\": 0.03070982405056527\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.2222222222222222,\n \"acc_stderr\": 0.03188578017686398,\n\
\ \"acc_norm\": 0.2222222222222222,\n \"acc_norm_stderr\": 0.03188578017686398\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.22399020807833536,\n\
\ \"mc1_stderr\": 0.014594964329474203,\n \"mc2\": 0.419357246718368,\n\
\ \"mc2_stderr\": 0.015180505292617188\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.5019731649565904,\n \"acc_stderr\": 0.014052376259225629\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.0,\n \"acc_stderr\"\
: 0.0\n }\n}\n```"
repo_url: https://huggingface.co/Locutusque/TinyMistral-248M-Instruct
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_12_06T16_40_16.358250
path:
- '**/details_harness|arc:challenge|25_2023-12-06T16-40-16.358250.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-12-06T16-40-16.358250.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_12_06T16_40_16.358250
path:
- '**/details_harness|gsm8k|5_2023-12-06T16-40-16.358250.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-12-06T16-40-16.358250.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_12_06T16_40_16.358250
path:
- '**/details_harness|hellaswag|10_2023-12-06T16-40-16.358250.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-12-06T16-40-16.358250.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_12_06T16_40_16.358250
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-06T16-40-16.358250.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-06T16-40-16.358250.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-06T16-40-16.358250.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-06T16-40-16.358250.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-06T16-40-16.358250.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-06T16-40-16.358250.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-06T16-40-16.358250.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-06T16-40-16.358250.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-06T16-40-16.358250.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-06T16-40-16.358250.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-06T16-40-16.358250.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-06T16-40-16.358250.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-06T16-40-16.358250.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-06T16-40-16.358250.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-06T16-40-16.358250.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-06T16-40-16.358250.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-06T16-40-16.358250.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-06T16-40-16.358250.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-06T16-40-16.358250.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-06T16-40-16.358250.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-06T16-40-16.358250.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-06T16-40-16.358250.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-06T16-40-16.358250.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-06T16-40-16.358250.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-06T16-40-16.358250.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-06T16-40-16.358250.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-06T16-40-16.358250.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-06T16-40-16.358250.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-06T16-40-16.358250.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-06T16-40-16.358250.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-06T16-40-16.358250.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-06T16-40-16.358250.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-06T16-40-16.358250.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-06T16-40-16.358250.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-12-06T16-40-16.358250.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-06T16-40-16.358250.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-06T16-40-16.358250.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-06T16-40-16.358250.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-12-06T16-40-16.358250.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-12-06T16-40-16.358250.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-06T16-40-16.358250.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-06T16-40-16.358250.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-06T16-40-16.358250.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-06T16-40-16.358250.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-06T16-40-16.358250.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-06T16-40-16.358250.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-06T16-40-16.358250.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-06T16-40-16.358250.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-06T16-40-16.358250.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-06T16-40-16.358250.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-06T16-40-16.358250.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-06T16-40-16.358250.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-06T16-40-16.358250.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-12-06T16-40-16.358250.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-06T16-40-16.358250.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-12-06T16-40-16.358250.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-06T16-40-16.358250.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-06T16-40-16.358250.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-06T16-40-16.358250.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-06T16-40-16.358250.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-06T16-40-16.358250.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-06T16-40-16.358250.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-06T16-40-16.358250.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-06T16-40-16.358250.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-06T16-40-16.358250.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-06T16-40-16.358250.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-06T16-40-16.358250.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-06T16-40-16.358250.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-06T16-40-16.358250.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-06T16-40-16.358250.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-06T16-40-16.358250.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-06T16-40-16.358250.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-06T16-40-16.358250.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-06T16-40-16.358250.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-06T16-40-16.358250.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-06T16-40-16.358250.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-06T16-40-16.358250.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-06T16-40-16.358250.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-06T16-40-16.358250.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-06T16-40-16.358250.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-06T16-40-16.358250.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-06T16-40-16.358250.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-06T16-40-16.358250.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-06T16-40-16.358250.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-06T16-40-16.358250.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-06T16-40-16.358250.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-06T16-40-16.358250.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-06T16-40-16.358250.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-06T16-40-16.358250.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-06T16-40-16.358250.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-06T16-40-16.358250.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-12-06T16-40-16.358250.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-06T16-40-16.358250.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-06T16-40-16.358250.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-06T16-40-16.358250.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-12-06T16-40-16.358250.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-12-06T16-40-16.358250.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-06T16-40-16.358250.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-06T16-40-16.358250.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-06T16-40-16.358250.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-06T16-40-16.358250.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-06T16-40-16.358250.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-06T16-40-16.358250.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-06T16-40-16.358250.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-06T16-40-16.358250.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-06T16-40-16.358250.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-06T16-40-16.358250.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-06T16-40-16.358250.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-06T16-40-16.358250.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-06T16-40-16.358250.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-12-06T16-40-16.358250.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-06T16-40-16.358250.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-12-06T16-40-16.358250.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-06T16-40-16.358250.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_12_06T16_40_16.358250
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-06T16-40-16.358250.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-06T16-40-16.358250.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_12_06T16_40_16.358250
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-06T16-40-16.358250.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-06T16-40-16.358250.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_12_06T16_40_16.358250
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-06T16-40-16.358250.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-06T16-40-16.358250.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_12_06T16_40_16.358250
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-06T16-40-16.358250.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-06T16-40-16.358250.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_12_06T16_40_16.358250
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-06T16-40-16.358250.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-06T16-40-16.358250.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_12_06T16_40_16.358250
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-06T16-40-16.358250.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-06T16-40-16.358250.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_12_06T16_40_16.358250
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-06T16-40-16.358250.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-06T16-40-16.358250.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_12_06T16_40_16.358250
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-06T16-40-16.358250.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-06T16-40-16.358250.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_12_06T16_40_16.358250
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-06T16-40-16.358250.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-06T16-40-16.358250.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_12_06T16_40_16.358250
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-06T16-40-16.358250.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-06T16-40-16.358250.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_12_06T16_40_16.358250
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-06T16-40-16.358250.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-06T16-40-16.358250.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_12_06T16_40_16.358250
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-06T16-40-16.358250.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-06T16-40-16.358250.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_12_06T16_40_16.358250
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-06T16-40-16.358250.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-06T16-40-16.358250.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_12_06T16_40_16.358250
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-06T16-40-16.358250.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-06T16-40-16.358250.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_12_06T16_40_16.358250
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-06T16-40-16.358250.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-06T16-40-16.358250.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_12_06T16_40_16.358250
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-06T16-40-16.358250.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-06T16-40-16.358250.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_12_06T16_40_16.358250
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-06T16-40-16.358250.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-06T16-40-16.358250.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_12_06T16_40_16.358250
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-06T16-40-16.358250.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-06T16-40-16.358250.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_12_06T16_40_16.358250
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-06T16-40-16.358250.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-06T16-40-16.358250.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_12_06T16_40_16.358250
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-06T16-40-16.358250.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-06T16-40-16.358250.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_12_06T16_40_16.358250
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-06T16-40-16.358250.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-06T16-40-16.358250.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_12_06T16_40_16.358250
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-06T16-40-16.358250.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-06T16-40-16.358250.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_12_06T16_40_16.358250
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-06T16-40-16.358250.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-06T16-40-16.358250.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_12_06T16_40_16.358250
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-06T16-40-16.358250.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-06T16-40-16.358250.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_12_06T16_40_16.358250
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-06T16-40-16.358250.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-06T16-40-16.358250.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_12_06T16_40_16.358250
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-06T16-40-16.358250.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-06T16-40-16.358250.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_12_06T16_40_16.358250
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-06T16-40-16.358250.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-06T16-40-16.358250.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_12_06T16_40_16.358250
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-06T16-40-16.358250.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-06T16-40-16.358250.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_12_06T16_40_16.358250
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-06T16-40-16.358250.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-06T16-40-16.358250.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_12_06T16_40_16.358250
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-06T16-40-16.358250.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-06T16-40-16.358250.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_12_06T16_40_16.358250
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-06T16-40-16.358250.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-06T16-40-16.358250.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_12_06T16_40_16.358250
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-06T16-40-16.358250.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-06T16-40-16.358250.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_12_06T16_40_16.358250
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-06T16-40-16.358250.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-06T16-40-16.358250.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_12_06T16_40_16.358250
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-06T16-40-16.358250.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-06T16-40-16.358250.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_12_06T16_40_16.358250
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-12-06T16-40-16.358250.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-12-06T16-40-16.358250.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_12_06T16_40_16.358250
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-06T16-40-16.358250.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-06T16-40-16.358250.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_12_06T16_40_16.358250
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-06T16-40-16.358250.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-06T16-40-16.358250.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_12_06T16_40_16.358250
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-06T16-40-16.358250.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-06T16-40-16.358250.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_12_06T16_40_16.358250
path:
- '**/details_harness|hendrycksTest-management|5_2023-12-06T16-40-16.358250.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-12-06T16-40-16.358250.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_12_06T16_40_16.358250
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-12-06T16-40-16.358250.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-12-06T16-40-16.358250.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_12_06T16_40_16.358250
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-06T16-40-16.358250.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-06T16-40-16.358250.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_12_06T16_40_16.358250
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-06T16-40-16.358250.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-06T16-40-16.358250.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_12_06T16_40_16.358250
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-06T16-40-16.358250.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-06T16-40-16.358250.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_12_06T16_40_16.358250
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-06T16-40-16.358250.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-06T16-40-16.358250.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_12_06T16_40_16.358250
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-06T16-40-16.358250.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-06T16-40-16.358250.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_12_06T16_40_16.358250
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-06T16-40-16.358250.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-06T16-40-16.358250.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_12_06T16_40_16.358250
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-06T16-40-16.358250.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-06T16-40-16.358250.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_12_06T16_40_16.358250
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-06T16-40-16.358250.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-06T16-40-16.358250.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_12_06T16_40_16.358250
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-06T16-40-16.358250.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-06T16-40-16.358250.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_12_06T16_40_16.358250
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-06T16-40-16.358250.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-06T16-40-16.358250.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_12_06T16_40_16.358250
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-06T16-40-16.358250.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-06T16-40-16.358250.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_12_06T16_40_16.358250
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-06T16-40-16.358250.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-06T16-40-16.358250.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_12_06T16_40_16.358250
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-06T16-40-16.358250.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-06T16-40-16.358250.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_12_06T16_40_16.358250
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-12-06T16-40-16.358250.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-12-06T16-40-16.358250.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_12_06T16_40_16.358250
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-06T16-40-16.358250.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-06T16-40-16.358250.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_12_06T16_40_16.358250
path:
- '**/details_harness|hendrycksTest-virology|5_2023-12-06T16-40-16.358250.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-12-06T16-40-16.358250.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_12_06T16_40_16.358250
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-06T16-40-16.358250.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-06T16-40-16.358250.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_12_06T16_40_16.358250
path:
- '**/details_harness|truthfulqa:mc|0_2023-12-06T16-40-16.358250.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-12-06T16-40-16.358250.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_12_06T16_40_16.358250
path:
- '**/details_harness|winogrande|5_2023-12-06T16-40-16.358250.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-12-06T16-40-16.358250.parquet'
- config_name: results
data_files:
- split: 2023_12_06T16_40_16.358250
path:
- results_2023-12-06T16-40-16.358250.parquet
- split: latest
path:
- results_2023-12-06T16-40-16.358250.parquet
---
# Dataset Card for Evaluation run of Locutusque/TinyMistral-248M-Instruct
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/Locutusque/TinyMistral-248M-Instruct
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [Locutusque/TinyMistral-248M-Instruct](https://huggingface.co/Locutusque/TinyMistral-248M-Instruct) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_Locutusque__TinyMistral-248M-Instruct",
"harness_winogrande_5",
split="train")
```
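As the configs in this card's metadata show, each configuration also exposes a "latest" split that points at the most recent timestamped run, so the details can be loaded without knowing the run timestamp. A minimal sketch:
```python
from datasets import load_dataset

# "latest" is an alias for the most recent timestamped run split
data = load_dataset(
    "open-llm-leaderboard/details_Locutusque__TinyMistral-248M-Instruct",
    "harness_winogrande_5",
    split="latest",
)
print(data)
```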
## Latest results
These are the [latest results from run 2023-12-06T16:40:16.358250](https://huggingface.co/datasets/open-llm-leaderboard/details_Locutusque__TinyMistral-248M-Instruct/blob/main/results_2023-12-06T16-40-16.358250.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each one in the results and in the "latest" split for each eval):
```python
{
"all": {
"acc": 0.251231011993023,
"acc_stderr": 0.030796208549621222,
"acc_norm": 0.252037607935898,
"acc_norm_stderr": 0.03161677046697385,
"mc1": 0.22399020807833536,
"mc1_stderr": 0.014594964329474203,
"mc2": 0.419357246718368,
"mc2_stderr": 0.015180505292617188
},
"harness|arc:challenge|25": {
"acc": 0.19965870307167236,
"acc_stderr": 0.011681625756888674,
"acc_norm": 0.2431740614334471,
"acc_norm_stderr": 0.012536554144587089
},
"harness|hellaswag|10": {
"acc": 0.27165903206532566,
"acc_stderr": 0.004439059440526251,
"acc_norm": 0.27524397530372435,
"acc_norm_stderr": 0.004457243336616505
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.25,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.25,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.24444444444444444,
"acc_stderr": 0.03712537833614865,
"acc_norm": 0.24444444444444444,
"acc_norm_stderr": 0.03712537833614865
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.2565789473684211,
"acc_stderr": 0.035541803680256896,
"acc_norm": 0.2565789473684211,
"acc_norm_stderr": 0.035541803680256896
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.19,
"acc_stderr": 0.03942772444036623,
"acc_norm": 0.19,
"acc_norm_stderr": 0.03942772444036623
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.27169811320754716,
"acc_stderr": 0.027377706624670713,
"acc_norm": 0.27169811320754716,
"acc_norm_stderr": 0.027377706624670713
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.2708333333333333,
"acc_stderr": 0.03716177437566017,
"acc_norm": 0.2708333333333333,
"acc_norm_stderr": 0.03716177437566017
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.26,
"acc_stderr": 0.04408440022768081,
"acc_norm": 0.26,
"acc_norm_stderr": 0.04408440022768081
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.22,
"acc_stderr": 0.0416333199893227,
"acc_norm": 0.22,
"acc_norm_stderr": 0.0416333199893227
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.24,
"acc_stderr": 0.042923469599092816,
"acc_norm": 0.24,
"acc_norm_stderr": 0.042923469599092816
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.2254335260115607,
"acc_stderr": 0.03186209851641143,
"acc_norm": 0.2254335260115607,
"acc_norm_stderr": 0.03186209851641143
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.23529411764705882,
"acc_stderr": 0.042207736591714506,
"acc_norm": 0.23529411764705882,
"acc_norm_stderr": 0.042207736591714506
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.25,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.25,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.3148936170212766,
"acc_stderr": 0.030363582197238167,
"acc_norm": 0.3148936170212766,
"acc_norm_stderr": 0.030363582197238167
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.2982456140350877,
"acc_stderr": 0.04303684033537315,
"acc_norm": 0.2982456140350877,
"acc_norm_stderr": 0.04303684033537315
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.22758620689655173,
"acc_stderr": 0.03493950380131184,
"acc_norm": 0.22758620689655173,
"acc_norm_stderr": 0.03493950380131184
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.24603174603174602,
"acc_stderr": 0.022182037202948368,
"acc_norm": 0.24603174603174602,
"acc_norm_stderr": 0.022182037202948368
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.21428571428571427,
"acc_stderr": 0.03670066451047181,
"acc_norm": 0.21428571428571427,
"acc_norm_stderr": 0.03670066451047181
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.27419354838709675,
"acc_stderr": 0.025378139970885203,
"acc_norm": 0.27419354838709675,
"acc_norm_stderr": 0.025378139970885203
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.2955665024630542,
"acc_stderr": 0.032104944337514575,
"acc_norm": 0.2955665024630542,
"acc_norm_stderr": 0.032104944337514575
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.32,
"acc_stderr": 0.046882617226215034,
"acc_norm": 0.32,
"acc_norm_stderr": 0.046882617226215034
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.26666666666666666,
"acc_stderr": 0.034531318018854146,
"acc_norm": 0.26666666666666666,
"acc_norm_stderr": 0.034531318018854146
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.2474747474747475,
"acc_stderr": 0.030746300742124505,
"acc_norm": 0.2474747474747475,
"acc_norm_stderr": 0.030746300742124505
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.29015544041450775,
"acc_stderr": 0.03275264467791516,
"acc_norm": 0.29015544041450775,
"acc_norm_stderr": 0.03275264467791516
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.26153846153846155,
"acc_stderr": 0.022282141204204426,
"acc_norm": 0.26153846153846155,
"acc_norm_stderr": 0.022282141204204426
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.2518518518518518,
"acc_stderr": 0.02646611753895991,
"acc_norm": 0.2518518518518518,
"acc_norm_stderr": 0.02646611753895991
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.22268907563025211,
"acc_stderr": 0.027025433498882367,
"acc_norm": 0.22268907563025211,
"acc_norm_stderr": 0.027025433498882367
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.2119205298013245,
"acc_stderr": 0.033367670865679766,
"acc_norm": 0.2119205298013245,
"acc_norm_stderr": 0.033367670865679766
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.26422018348623855,
"acc_stderr": 0.018904164171510196,
"acc_norm": 0.26422018348623855,
"acc_norm_stderr": 0.018904164171510196
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.2222222222222222,
"acc_stderr": 0.028353212866863448,
"acc_norm": 0.2222222222222222,
"acc_norm_stderr": 0.028353212866863448
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.29411764705882354,
"acc_stderr": 0.03198001660115069,
"acc_norm": 0.29411764705882354,
"acc_norm_stderr": 0.03198001660115069
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.29535864978902954,
"acc_stderr": 0.0296963387134229,
"acc_norm": 0.29535864978902954,
"acc_norm_stderr": 0.0296963387134229
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.2600896860986547,
"acc_stderr": 0.029442495585857476,
"acc_norm": 0.2600896860986547,
"acc_norm_stderr": 0.029442495585857476
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.20610687022900764,
"acc_stderr": 0.03547771004159462,
"acc_norm": 0.20610687022900764,
"acc_norm_stderr": 0.03547771004159462
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.19008264462809918,
"acc_stderr": 0.03581796951709282,
"acc_norm": 0.19008264462809918,
"acc_norm_stderr": 0.03581796951709282
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.2777777777777778,
"acc_stderr": 0.04330043749650742,
"acc_norm": 0.2777777777777778,
"acc_norm_stderr": 0.04330043749650742
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.2331288343558282,
"acc_stderr": 0.0332201579577674,
"acc_norm": 0.2331288343558282,
"acc_norm_stderr": 0.0332201579577674
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.30357142857142855,
"acc_stderr": 0.04364226155841044,
"acc_norm": 0.30357142857142855,
"acc_norm_stderr": 0.04364226155841044
},
"harness|hendrycksTest-management|5": {
"acc": 0.3106796116504854,
"acc_stderr": 0.04582124160161549,
"acc_norm": 0.3106796116504854,
"acc_norm_stderr": 0.04582124160161549
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.24786324786324787,
"acc_stderr": 0.0282863240755644,
"acc_norm": 0.24786324786324787,
"acc_norm_stderr": 0.0282863240755644
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.25,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.25,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.29118773946360155,
"acc_stderr": 0.01624608706970139,
"acc_norm": 0.29118773946360155,
"acc_norm_stderr": 0.01624608706970139
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.1994219653179191,
"acc_stderr": 0.02151190065425255,
"acc_norm": 0.1994219653179191,
"acc_norm_stderr": 0.02151190065425255
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.24692737430167597,
"acc_stderr": 0.014422292204808835,
"acc_norm": 0.24692737430167597,
"acc_norm_stderr": 0.014422292204808835
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.27124183006535946,
"acc_stderr": 0.025457756696667895,
"acc_norm": 0.27124183006535946,
"acc_norm_stderr": 0.025457756696667895
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.21864951768488747,
"acc_stderr": 0.02347558141786111,
"acc_norm": 0.21864951768488747,
"acc_norm_stderr": 0.02347558141786111
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.25308641975308643,
"acc_stderr": 0.024191808600713,
"acc_norm": 0.25308641975308643,
"acc_norm_stderr": 0.024191808600713
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.2624113475177305,
"acc_stderr": 0.02624492034984301,
"acc_norm": 0.2624113475177305,
"acc_norm_stderr": 0.02624492034984301
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.23272490221642764,
"acc_stderr": 0.010792595553888479,
"acc_norm": 0.23272490221642764,
"acc_norm_stderr": 0.010792595553888479
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.20588235294117646,
"acc_stderr": 0.02456220431414232,
"acc_norm": 0.20588235294117646,
"acc_norm_stderr": 0.02456220431414232
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.24183006535947713,
"acc_stderr": 0.017322789207784326,
"acc_norm": 0.24183006535947713,
"acc_norm_stderr": 0.017322789207784326
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.21818181818181817,
"acc_stderr": 0.03955932861795833,
"acc_norm": 0.21818181818181817,
"acc_norm_stderr": 0.03955932861795833
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.24897959183673468,
"acc_stderr": 0.02768297952296023,
"acc_norm": 0.24897959183673468,
"acc_norm_stderr": 0.02768297952296023
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.263681592039801,
"acc_stderr": 0.031157150869355568,
"acc_norm": 0.263681592039801,
"acc_norm_stderr": 0.031157150869355568
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.26,
"acc_stderr": 0.04408440022768078,
"acc_norm": 0.26,
"acc_norm_stderr": 0.04408440022768078
},
"harness|hendrycksTest-virology|5": {
"acc": 0.1927710843373494,
"acc_stderr": 0.03070982405056527,
"acc_norm": 0.1927710843373494,
"acc_norm_stderr": 0.03070982405056527
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.2222222222222222,
"acc_stderr": 0.03188578017686398,
"acc_norm": 0.2222222222222222,
"acc_norm_stderr": 0.03188578017686398
},
"harness|truthfulqa:mc|0": {
"mc1": 0.22399020807833536,
"mc1_stderr": 0.014594964329474203,
"mc2": 0.419357246718368,
"mc2_stderr": 0.015180505292617188
},
"harness|winogrande|5": {
"acc": 0.5019731649565904,
"acc_stderr": 0.014052376259225629
},
"harness|gsm8k|5": {
"acc": 0.0,
"acc_stderr": 0.0
}
}
```
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
Porameht/spoonerism-kumpun-th-18up | ---
license: apache-2.0
---
|
Neurogpt/autotrain-data-stroke-classifier | ---
task_categories:
- image-classification
---
# AutoTrain Dataset for project: stroke-classifier
## Dataset Description
This dataset has been automatically processed by AutoTrain for project stroke-classifier.
### Languages
The BCP-47 code for the dataset's language is unk.
## Dataset Structure
### Data Instances
A sample from this dataset looks as follows:
```json
[
{
"image": "<233x197 L PIL image>",
"target": 0
},
{
"image": "<233x197 L PIL image>",
"target": 0
}
]
```
### Dataset Fields
The dataset has the following fields (also called "features"):
```json
{
"image": "Image(decode=True, id=None)",
"target": "ClassLabel(names=['notStroke', 'stroke'], id=None)"
}
```
### Dataset Splits
This dataset is split into a train and validation split. The split sizes are as follows:
| Split name | Num samples |
| ------------ | ------------------- |
| train | 1600 |
| valid | 945 |
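For convenience, a minimal loading sketch with the `datasets` library (assuming the standard AutoTrain repo layout and the split names listed above):
```python
from datasets import load_dataset

# Split names taken from the table above
train_ds = load_dataset("Neurogpt/autotrain-data-stroke-classifier", split="train")
valid_ds = load_dataset("Neurogpt/autotrain-data-stroke-classifier", split="valid")

example = train_ds[0]
print(example["image"].size, example["target"])  # PIL image and its class label index
```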
|
CyberHarem/destroyer_hime_kantaicollection | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of destroyer_hime/駆逐棲姫 (Kantai Collection)
This is the dataset of destroyer_hime/駆逐棲姫 (Kantai Collection), containing 34 images and their tags.
The core tags of this character are `long_hair, side_ponytail, white_hair, white_skin, colored_skin, hat, purple_eyes, pale_skin`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan, ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:----------|:---------------------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 34 | 40.43 MiB | [Download](https://huggingface.co/datasets/CyberHarem/destroyer_hime_kantaicollection/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 34 | 29.07 MiB | [Download](https://huggingface.co/datasets/CyberHarem/destroyer_hime_kantaicollection/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 77 | 54.00 MiB | [Download](https://huggingface.co/datasets/CyberHarem/destroyer_hime_kantaicollection/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 34 | 38.57 MiB | [Download](https://huggingface.co/datasets/CyberHarem/destroyer_hime_kantaicollection/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 77 | 67.21 MiB | [Download](https://huggingface.co/datasets/CyberHarem/destroyer_hime_kantaicollection/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for loading with [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html). If you need it, just run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/destroyer_hime_kantaicollection',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering results; some outfits may be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 34 |  |  |  |  |  | abyssal_ship, 1girl, serafuku, solo, skirt, sleeveless, bare_shoulders, choker, midriff, navel, black_gloves, looking_at_viewer, amputee, neckerchief |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | abyssal_ship | 1girl | serafuku | solo | skirt | sleeveless | bare_shoulders | choker | midriff | navel | black_gloves | looking_at_viewer | amputee | neckerchief |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:---------------|:--------|:-----------|:-------|:--------|:-------------|:-----------------|:---------|:----------|:--------|:---------------|:--------------------|:----------|:--------------|
| 0 | 34 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | X |
|
autoevaluate/autoeval-eval-cnn_dailymail-3.0.0-e1b364-31627144974 | ---
type: predictions
tags:
- autotrain
- evaluation
datasets:
- cnn_dailymail
eval_info:
task: summarization
model: ARTeLab/it5-summarization-fanpage
metrics: []
dataset_name: cnn_dailymail
dataset_config: 3.0.0
dataset_split: test
col_mapping:
text: article
target: highlights
---
# Dataset Card for AutoTrain Evaluator
This repository contains model predictions generated by [AutoTrain](https://huggingface.co/autotrain) for the following task and dataset:
* Task: Summarization
* Model: ARTeLab/it5-summarization-fanpage
* Dataset: cnn_dailymail
* Config: 3.0.0
* Split: test
To run new evaluation jobs, visit Hugging Face's [automatic model evaluator](https://huggingface.co/spaces/autoevaluate/model-evaluator).
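The stored predictions can be inspected like any other dataset repo; a minimal sketch (the exact columns depend on the evaluation job):
```python
from datasets import load_dataset

# Load whatever splits the evaluator wrote to this repository
preds = load_dataset("autoevaluate/autoeval-eval-cnn_dailymail-3.0.0-e1b364-31627144974")
print(preds)  # lists the available splits and their columns
```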
## Contributions
Thanks to [@sr5434](https://huggingface.co/sr5434) for evaluating this model. |
Mitsuki-Sakamoto/alpaca_farm-alpaca_instructions_gen_eval | ---
dataset_info:
- config_name: checkpoint-888
features:
- name: instruction
dtype: string
- name: input
dtype: string
- name: output
dtype: string
- name: generator
dtype: string
- name: sample_mode
dtype: string
- name: dataset
dtype: string
- name: datasplit
dtype: string
- name: prompt_format
dtype: string
- name: reward
dtype: float64
splits:
- name: preference
num_bytes: 1497159
num_examples: 2000
- name: val
num_bytes: 185147
num_examples: 200
download_size: 582102
dataset_size: 1682306
- config_name: pythia-1.4b_alpaca_farm_instructions_sft_constant_pa
features:
- name: instruction
dtype: string
- name: input
dtype: string
- name: output
dtype: string
- name: generator
dtype: string
- name: sample_mode
dtype: string
- name: dataset
dtype: string
- name: datasplit
dtype: string
- name: prompt_format
dtype: string
- name: reward
dtype: float64
splits:
- name: preference
num_bytes: 1497159
num_examples: 2000
download_size: 491513
dataset_size: 1497159
- config_name: pythia-1.4b_alpaca_farm_instructions_sft_constant_pa-checkpoint-7500
features:
- name: instruction
dtype: string
- name: input
dtype: string
- name: output
dtype: string
- name: generator
dtype: string
- name: sample_mode
dtype: string
- name: dataset
dtype: string
- name: datasplit
dtype: string
- name: prompt_format
dtype: string
- name: reward
dtype: float64
splits:
- name: preference
num_bytes: 1497159
num_examples: 2000
- name: val
num_bytes: 318995
num_examples: 200
download_size: 637987
dataset_size: 1816154
- config_name: pythia-1.4b_alpaca_farm_instructions_sft_constant_pa_42dot_70m-checkpoint-50
features:
- name: instruction
dtype: string
- name: input
dtype: string
- name: output
dtype: string
- name: generator
dtype: string
- name: sample_mode
dtype: string
- name: dataset
dtype: string
- name: datasplit
dtype: string
- name: prompt_format
dtype: string
- name: reward
dtype: float64
splits:
- name: preference
num_bytes: 1615352
num_examples: 2000
download_size: 514068
dataset_size: 1615352
- config_name: pythia-1.4b_alpaca_farm_instructions_sft_constant_pa_gold-checkpoint-390
features:
- name: instruction
dtype: string
- name: input
dtype: string
- name: output
dtype: string
- name: generator
dtype: string
- name: sample_mode
dtype: string
- name: dataset
dtype: string
- name: datasplit
dtype: string
- name: prompt_format
dtype: string
- name: reward
dtype: float64
splits:
- name: preference
num_bytes: 1530946
num_examples: 2000
download_size: 443131
dataset_size: 1530946
- config_name: pythia-1.4b_alpaca_farm_instructions_sft_constant_pa_gold-checkpoint-78
features:
- name: instruction
dtype: string
- name: input
dtype: string
- name: output
dtype: string
- name: generator
dtype: string
- name: sample_mode
dtype: string
- name: dataset
dtype: string
- name: datasplit
dtype: string
- name: prompt_format
dtype: string
- name: reward
dtype: float64
splits:
- name: preference
num_bytes: 1610230
num_examples: 2000
download_size: 518445
dataset_size: 1610230
- config_name: pythia-1.4b_alpaca_farm_instructions_sft_constant_pa_gold_kl_0.1
features:
- name: instruction
dtype: string
- name: input
dtype: string
- name: output
dtype: string
- name: generator
dtype: string
- name: sample_mode
dtype: string
- name: dataset
dtype: string
- name: datasplit
dtype: string
- name: prompt_format
dtype: string
- name: reward
dtype: float64
splits:
- name: preference
num_bytes: 1497159
num_examples: 2000
- name: val
num_bytes: 142096
num_examples: 200
download_size: 543947
dataset_size: 1639255
- config_name: pythia-1.4b_alpaca_farm_instructions_sft_constant_pa_gpt4_preference_70m-checkpoint-50
features:
- name: instruction
dtype: string
- name: input
dtype: string
- name: output
dtype: string
- name: generator
dtype: string
- name: sample_mode
dtype: string
- name: dataset
dtype: string
- name: datasplit
dtype: string
- name: prompt_format
dtype: string
- name: reward
dtype: float64
splits:
- name: preference
num_bytes: 1622870
num_examples: 2000
download_size: 521608
dataset_size: 1622870
- config_name: pythia-1.4b_alpaca_farm_instructions_sft_constant_pa_no_sft_70m-checkpoint-50
features:
- name: instruction
dtype: string
- name: input
dtype: string
- name: output
dtype: string
- name: generator
dtype: string
- name: sample_mode
dtype: string
- name: dataset
dtype: string
- name: datasplit
dtype: string
- name: prompt_format
dtype: string
- name: reward
dtype: float64
splits:
- name: preference
num_bytes: 1627410
num_examples: 2000
download_size: 523238
dataset_size: 1627410
- config_name: pythia-1.4b_alpaca_farm_instructions_sft_constant_pa_self-checkpoint-390
features:
- name: instruction
dtype: string
- name: input
dtype: string
- name: output
dtype: string
- name: generator
dtype: string
- name: sample_mode
dtype: string
- name: dataset
dtype: string
- name: datasplit
dtype: string
- name: prompt_format
dtype: string
splits:
- name: preference
num_bytes: 1538308
num_examples: 2000
download_size: 134225
dataset_size: 1538308
- config_name: pythia-1.4b_alpaca_farm_instructions_sft_constant_pa_self-checkpoint-78
features:
- name: instruction
dtype: string
- name: input
dtype: string
- name: output
dtype: string
- name: generator
dtype: string
- name: sample_mode
dtype: string
- name: dataset
dtype: string
- name: datasplit
dtype: string
- name: prompt_format
dtype: string
splits:
- name: preference
num_bytes: 1538994
num_examples: 2000
download_size: 283626
dataset_size: 1538994
- config_name: pythia-1.4b_alpaca_farm_instructions_sft_constant_pa_self_160m_kl_0.1_seed_0-checkpoint-154
features:
- name: instruction
dtype: string
- name: input
dtype: string
- name: output
dtype: string
- name: generator
dtype: string
- name: sample_mode
dtype: string
- name: dataset
dtype: string
- name: datasplit
dtype: string
- name: prompt_format
dtype: string
- name: reward
dtype: float64
splits:
- name: preference
num_bytes: 1497159
num_examples: 2000
- name: val
num_bytes: 160694
num_examples: 200
download_size: 551625
dataset_size: 1657853
- config_name: pythia-1.4b_alpaca_farm_instructions_sft_constant_pa_self_70m-checkpoint-100
features:
- name: instruction
dtype: string
- name: input
dtype: string
- name: output
dtype: string
- name: generator
dtype: string
- name: sample_mode
dtype: string
- name: dataset
dtype: string
- name: datasplit
dtype: string
- name: prompt_format
dtype: string
splits:
- name: preference
num_bytes: 1661679
num_examples: 2000
download_size: 370873
dataset_size: 1661679
- config_name: pythia-1.4b_alpaca_farm_instructions_sft_constant_pa_self_70m-checkpoint-25
features:
- name: instruction
dtype: string
- name: input
dtype: string
- name: output
dtype: string
- name: generator
dtype: string
- name: sample_mode
dtype: string
- name: dataset
dtype: string
- name: datasplit
dtype: string
- name: prompt_format
dtype: string
splits:
- name: preference
num_bytes: 1571411
num_examples: 2000
download_size: 498862
dataset_size: 1571411
- config_name: pythia-1.4b_alpaca_farm_instructions_sft_constant_pa_self_70m-checkpoint-50
features:
- name: instruction
dtype: string
- name: input
dtype: string
- name: output
dtype: string
- name: generator
dtype: string
- name: sample_mode
dtype: string
- name: dataset
dtype: string
- name: datasplit
dtype: string
- name: prompt_format
dtype: string
- name: reward
dtype: float64
splits:
- name: preference
num_bytes: 1645073
num_examples: 2000
download_size: 530040
dataset_size: 1645073
- config_name: pythia-1.4b_alpaca_farm_instructions_sft_constant_pa_self_70m-checkpoint-75
features:
- name: instruction
dtype: string
- name: input
dtype: string
- name: output
dtype: string
- name: generator
dtype: string
- name: sample_mode
dtype: string
- name: dataset
dtype: string
- name: datasplit
dtype: string
- name: prompt_format
dtype: string
splits:
- name: preference
num_bytes: 1683575
num_examples: 2000
download_size: 489782
dataset_size: 1683575
configs:
- config_name: checkpoint-888
data_files:
- split: val
path: checkpoint-888/val-*
- config_name: pythia-1.4b_alpaca_farm_instructions_sft_constant_pa
data_files:
- split: preference
path: pythia-1.4b_alpaca_farm_instructions_sft_constant_pa/preference-*
- config_name: pythia-1.4b_alpaca_farm_instructions_sft_constant_pa-checkpoint-7500
data_files:
- split: val
path: pythia-1.4b_alpaca_farm_instructions_sft_constant_pa-checkpoint-7500/val-*
- config_name: pythia-1.4b_alpaca_farm_instructions_sft_constant_pa_42dot_70m-checkpoint-50
data_files:
- split: preference
path: pythia-1.4b_alpaca_farm_instructions_sft_constant_pa_42dot_70m-checkpoint-50/preference-*
- config_name: pythia-1.4b_alpaca_farm_instructions_sft_constant_pa_gold-checkpoint-390
data_files:
- split: preference
path: pythia-1.4b_alpaca_farm_instructions_sft_constant_pa_gold-checkpoint-390/preference-*
- config_name: pythia-1.4b_alpaca_farm_instructions_sft_constant_pa_gold-checkpoint-78
data_files:
- split: preference
path: pythia-1.4b_alpaca_farm_instructions_sft_constant_pa_gold-checkpoint-78/preference-*
- config_name: pythia-1.4b_alpaca_farm_instructions_sft_constant_pa_gold_kl_0.1
data_files:
- split: val
path: pythia-1.4b_alpaca_farm_instructions_sft_constant_pa_gold_kl_0.1/val-*
- config_name: pythia-1.4b_alpaca_farm_instructions_sft_constant_pa_gpt4_preference_70m-checkpoint-50
data_files:
- split: preference
path: pythia-1.4b_alpaca_farm_instructions_sft_constant_pa_gpt4_preference_70m-checkpoint-50/preference-*
- config_name: pythia-1.4b_alpaca_farm_instructions_sft_constant_pa_no_sft_70m-checkpoint-50
data_files:
- split: preference
path: pythia-1.4b_alpaca_farm_instructions_sft_constant_pa_no_sft_70m-checkpoint-50/preference-*
- config_name: pythia-1.4b_alpaca_farm_instructions_sft_constant_pa_self-checkpoint-390
data_files:
- split: preference
path: pythia-1.4b_alpaca_farm_instructions_sft_constant_pa_self-checkpoint-390/preference-*
- config_name: pythia-1.4b_alpaca_farm_instructions_sft_constant_pa_self-checkpoint-78
data_files:
- split: preference
path: pythia-1.4b_alpaca_farm_instructions_sft_constant_pa_self-checkpoint-78/preference-*
- config_name: pythia-1.4b_alpaca_farm_instructions_sft_constant_pa_self_160m_kl_0.1_seed_0-checkpoint-154
data_files:
- split: val
path: pythia-1.4b_alpaca_farm_instructions_sft_constant_pa_self_160m_kl_0.1_seed_0-checkpoint-154/val-*
- config_name: pythia-1.4b_alpaca_farm_instructions_sft_constant_pa_self_70m-checkpoint-100
data_files:
- split: preference
path: pythia-1.4b_alpaca_farm_instructions_sft_constant_pa_self_70m-checkpoint-100/preference-*
- config_name: pythia-1.4b_alpaca_farm_instructions_sft_constant_pa_self_70m-checkpoint-25
data_files:
- split: preference
path: pythia-1.4b_alpaca_farm_instructions_sft_constant_pa_self_70m-checkpoint-25/preference-*
- config_name: pythia-1.4b_alpaca_farm_instructions_sft_constant_pa_self_70m-checkpoint-50
data_files:
- split: preference
path: pythia-1.4b_alpaca_farm_instructions_sft_constant_pa_self_70m-checkpoint-50/preference-*
- config_name: pythia-1.4b_alpaca_farm_instructions_sft_constant_pa_self_70m-checkpoint-75
data_files:
- split: preference
path: pythia-1.4b_alpaca_farm_instructions_sft_constant_pa_self_70m-checkpoint-75/preference-*
---
# Dataset Card for "alpaca_farm-alpaca_instructions_gen_eval"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
crumb/c4-subset-for-truthfulqa | ---
dataset_info:
features:
- name: text
dtype: string
- name: score
dtype: float64
splits:
- name: train
num_bytes: 577836714
num_examples: 321153
download_size: 352256147
dataset_size: 577836714
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "c4-subset-for-truthfulqa"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Abrumu/Fashion_controlnet_dataset_V3 | ---
dataset_info:
features:
- name: target
dtype: image
- name: mask
dtype: image
- name: cloth
dtype: image
- name: control
dtype: image
- name: prompt
dtype: string
- name: CLIP_captions
dtype: string
splits:
- name: train
num_bytes: 7964862365.0
num_examples: 11647
download_size: 7944023014
dataset_size: 7964862365.0
---
# Dataset Card for "Fashion_controlnet_dataset_V3"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
CyberHarem/kinugasa_kantaicollection | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of kinugasa (Kantai Collection)
This is the dataset of kinugasa (Kantai Collection), containing 500 images and their tags.
The core tags of this character are `green_eyes, grey_hair, antenna_hair, breasts, long_hair, medium_breasts`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan, ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:---------------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 500 | 696.09 MiB | [Download](https://huggingface.co/datasets/CyberHarem/kinugasa_kantaicollection/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 500 | 398.94 MiB | [Download](https://huggingface.co/datasets/CyberHarem/kinugasa_kantaicollection/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 1215 | 833.81 MiB | [Download](https://huggingface.co/datasets/CyberHarem/kinugasa_kantaicollection/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 500 | 621.31 MiB | [Download](https://huggingface.co/datasets/CyberHarem/kinugasa_kantaicollection/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 1215 | 1.16 GiB | [Download](https://huggingface.co/datasets/CyberHarem/kinugasa_kantaicollection/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for loading with [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html). If you need it, just run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/kinugasa_kantaicollection',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering results; some outfits may be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:----------------------------------|:----------------------------------|:----------------------------------|:----------------------------------|:----------------------------------|:-----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 5 |  |  |  |  |  | 1girl, beach, blue_sky, cleavage, day, looking_at_viewer, navel, ocean, outdoors, solo, yellow_bikini, cowboy_shot, horizon, smile, cloud, standing, medium_hair, sand |
| 1 | 5 |  |  |  |  |  | 1girl, blue_sky, cloud, day, horizon, navel, ocean, open_mouth, outdoors, smile, solo, standing, water, barefoot, beach, looking_at_viewer, yellow_bikini, cleavage, hair_tie, medium_hair, running, feet_out_of_frame |
| 2 | 5 |  |  |  |  |  | 1girl, blue_sky, cloud, cowboy_shot, day, frilled_bikini, looking_at_viewer, official_alternate_costume, outdoors, short_hair, short_twintails, side-tie_bikini_bottom, solo, white_shirt, beachball, blue_bikini, floral_print, large_breasts, ocean, smile, standing, tied_shirt |
| 3 | 5 |  |  |  |  |  | 1girl, looking_at_viewer, navel, solo, alternate_costume, cleavage, full_body, side-tie_bikini_bottom, standing, yellow_bikini, barefoot, gold_bikini, large_breasts, open_mouth |
| 4 | 5 |  |  |  |  |  | 1girl, beachball, cleavage, looking_at_viewer, official_alternate_costume, sarong, solo, yellow_bikini, floral_print, navel, single_braid, collarbone, hair_over_shoulder, hair_tie, open_mouth, full_body, large_breasts, medium_hair, sandals, side-tie_bikini_bottom, sitting |
| 5 | 8 |  |  |  |  |  | 1girl, bikini, collarbone, looking_at_viewer, bangs, cleavage, simple_background, solo, smile, alternate_costume, yellow_background, barefoot, full_body, standing, upper_body |
| 6 | 5 |  |  |  |  |  | 1girl, alternate_costume, full_body, solo, yellow_shirt, hair_tie, long_sleeves, looking_at_viewer, red_footwear, sneakers, standing, white_skirt, one_side_up, open_mouth, smile, bangs, pink_background, shorts, simple_background, white_background |
| 7 | 8 |  |  |  |  |  | 1girl, alternate_costume, black_footwear, black_pantyhose, full_body, simple_background, sweater, white_background, solo, standing, long_sleeves, smile, looking_at_viewer, white_coat, black_skirt, high_heels, dress, fur-trimmed_coat, holding, open_mouth, scrunchie, shoes |
| 8 | 9 |  |  |  |  |  | 1girl, black_shirt, simple_background, official_alternate_costume, polka_dot_shirt, white_background, green_skirt, jacket, coat, smile, full_body, medium_hair, solo_focus, standing |
| 9 | 9 |  |  |  |  |  | 1girl, serafuku, short_sleeves, upper_body, blue_sailor_collar, looking_at_viewer, one_side_up, solo, white_background, simple_background, yellow_necktie, smile, gloves, open_mouth, blush, neckerchief |
| 10 | 21 |  |  |  |  |  | pleated_skirt, serafuku, yellow_necktie, 1girl, hair_tie, looking_at_viewer, solo, purple_skirt, simple_background, smile, one_side_up, purple_sailor_collar, black_thighhighs, white_background, black_gloves, blue_skirt |
| 11 | 26 |  |  |  |  |  | 1girl, alternate_costume, detached_collar, rabbit_ears, looking_at_viewer, playboy_bunny, simple_background, fake_animal_ears, solo, wrist_cuffs, bowtie, cleavage, strapless_leotard, white_background, black_pantyhose, open_mouth, cowboy_shot |
| 12 | 5 |  |  |  |  |  | 1girl, cleavage, navel, solo, simple_background, underwear_only, yellow_bra, collarbone, looking_at_viewer, lying, white_background, yellow_panties, arms_up, bangs, blush, medium_hair, smile |
| 13 | 5 |  |  |  |  |  | blush, large_breasts, nipples, nude, solo_focus, 1boy, 1girl, hair_tie, hetero, mosaic_censoring, sweat, navel, pussy, smile, breast_grab, collarbone, grabbing, lying, one_side_up, open_mouth, penis, tears, trembling |
| 14 | 9 |  |  |  |  |  | 1girl, obi, solo, alternate_costume, floral_print, looking_at_viewer, smile, wide_sleeves, open_mouth, print_kimono, hair_ornament, long_sleeves, ahoge, alternate_hairstyle, flower, new_year |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | beach | blue_sky | cleavage | day | looking_at_viewer | navel | ocean | outdoors | solo | yellow_bikini | cowboy_shot | horizon | smile | cloud | standing | medium_hair | sand | open_mouth | water | barefoot | hair_tie | running | feet_out_of_frame | frilled_bikini | official_alternate_costume | short_hair | short_twintails | side-tie_bikini_bottom | white_shirt | beachball | blue_bikini | floral_print | large_breasts | tied_shirt | alternate_costume | full_body | gold_bikini | sarong | single_braid | collarbone | hair_over_shoulder | sandals | sitting | bikini | bangs | simple_background | yellow_background | upper_body | yellow_shirt | long_sleeves | red_footwear | sneakers | white_skirt | one_side_up | pink_background | shorts | white_background | black_footwear | black_pantyhose | sweater | white_coat | black_skirt | high_heels | dress | fur-trimmed_coat | holding | scrunchie | shoes | black_shirt | polka_dot_shirt | green_skirt | jacket | coat | solo_focus | serafuku | short_sleeves | blue_sailor_collar | yellow_necktie | gloves | blush | neckerchief | pleated_skirt | purple_skirt | purple_sailor_collar | black_thighhighs | black_gloves | blue_skirt | detached_collar | rabbit_ears | playboy_bunny | fake_animal_ears | wrist_cuffs | bowtie | strapless_leotard | underwear_only | yellow_bra | lying | yellow_panties | arms_up | nipples | nude | 1boy | hetero | mosaic_censoring | sweat | pussy | breast_grab | grabbing | penis | tears | trembling | obi | wide_sleeves | print_kimono | hair_ornament | ahoge | alternate_hairstyle | flower | new_year |
|----:|----------:|:----------------------------------|:----------------------------------|:----------------------------------|:----------------------------------|:----------------------------------|:--------|:--------|:-----------|:-----------|:------|:--------------------|:--------|:--------|:-----------|:-------|:----------------|:--------------|:----------|:--------|:--------|:-----------|:--------------|:-------|:-------------|:--------|:-----------|:-----------|:----------|:--------------------|:-----------------|:-----------------------------|:-------------|:------------------|:-------------------------|:--------------|:------------|:--------------|:---------------|:----------------|:-------------|:--------------------|:------------|:--------------|:---------|:---------------|:-------------|:---------------------|:----------|:----------|:---------|:--------|:--------------------|:--------------------|:-------------|:---------------|:---------------|:---------------|:-----------|:--------------|:--------------|:------------------|:---------|:-------------------|:-----------------|:------------------|:----------|:-------------|:--------------|:-------------|:--------|:-------------------|:----------|:------------|:--------|:--------------|:------------------|:--------------|:---------|:-------|:-------------|:-----------|:----------------|:---------------------|:-----------------|:---------|:--------|:--------------|:----------------|:---------------|:-----------------------|:-------------------|:---------------|:-------------|:------------------|:--------------|:----------------|:-------------------|:--------------|:---------|:--------------------|:-----------------|:-------------|:--------|:-----------------|:----------|:----------|:-------|:-------|:---------|:-------------------|:--------|:--------|:--------------|:-----------|:--------|:--------|:------------|:------|:---------------|:---------------|:----------------|:--------|:----------------------|:---------|:-----------|
| 0 | 5 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 1 | 5 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | | X | X | X | X | X | | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 2 | 5 |  |  |  |  |  | X | | X | | X | X | | X | X | X | | X | | X | X | X | | | | | | | | | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 3 | 5 |  |  |  |  |  | X | | | X | | X | X | | | X | X | | | | | X | | | X | | X | | | | | | | | X | | | | | X | | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 4 | 5 |  |  |  |  |  | X | | | X | | X | X | | | X | X | | | | | | X | | X | | | X | | | | X | | | X | | X | | X | X | | | X | | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 5 | 8 |  |  |  |  |  | X | | | X | | X | | | | X | | | | X | | X | | | | | X | | | | | | | | | | | | | | | X | X | | | | X | | | | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 6 | 5 |  |  |  |  |  | X | | | | | X | | | | X | | | | X | | X | | | X | | | X | | | | | | | | | | | | | | X | X | | | | | | | | | X | X | | | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 7 | 8 |  |  |  |  |  | X | | | | | X | | | | X | | | | X | | X | | | X | | | | | | | | | | | | | | | | | X | X | | | | | | | | | | X | | | | X | | | | | | | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 8 | 9 |  |  |  |  |  | X | | | | | | | | | | | | | X | | X | X | | | | | | | | | X | | | | | | | | | | | X | | | | | | | | | | X | | | | | | | | | | | X | | | | | | | | | | | | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 9 | 9 |  |  |  |  |  | X | | | | | X | | | | X | | | | X | | | | | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | X | | X | | | | | | X | | | X | | | | | | | | | | | | | | | | | | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 10 | 21 |  |  |  |  |  | X | | | | | X | | | | X | | | | X | | | | | | | | X | | | | | | | | | | | | | | | | | | | | | | | | | X | | | | | | | | X | | | X | | | | | | | | | | | | | | | | | | X | | | X | | | | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 11 | 26 |  |  |  |  |  | X | | | X | | X | | | | X | | X | | | | | | | X | | | | | | | | | | | | | | | | | X | | | | | | | | | | | X | | | | | | | | | | | X | | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | |
| 12 | 5 |  |  |  |  |  | X | | | X | | X | X | | | X | | | | X | | | X | | | | | | | | | | | | | | | | | | | | | | | | X | | | | | X | X | | | | | | | | | | | X | | | | | | | | | | | | | | | | | | | | | | | X | | | | | | | | | | | | | | | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | |
| 13 | 5 |  |  |  |  |  | X | | | | | | X | | | | | | | X | | | | | X | | | X | | | | | | | | | | | | X | | | | | | | X | | | | | | | | | | | | | | X | | | | | | | | | | | | | | | | | | | | X | | | | | | X | | | | | | | | | | | | | | | | | X | | | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | |
| 14 | 9 |  |  |  |  |  | X | | | | | X | | | | X | | | | X | | | | | X | | | | | | | | | | | | | | X | | | X | | | | | | | | | | | | | | | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | X | X | X | X | X | X | X | X |
|
HiTZ/AbstRCT-ES | ---
license: cc-by-nc-sa-4.0
task_categories:
- token-classification
language:
- es
tags:
- biology
- medical
pretty_name: AbstRCT-ES
dataset_info:
- config_name: es
  data_files:
  - split: neoplasm_train
    path: es/neoplasm_train-*
  - split: neoplasm_dev
    path: es/neoplasm_dev-*
  - split: neoplasm_test
    path: es/neoplasm_test-*
  - split: glaucoma_test
    path: es/glaucoma_test-*
  - split: mixed_test
    path: es/mixed_test-*
---
<p align="center">
<br>
<img src="http://www.ixa.eus/sites/default/files/anitdote.png" style="width: 30%;">
<h2 align="center">AbstRCT-ES</h2>
<br>
We translated the [AbstRCT English Argument Mining Dataset](https://gitlab.com/tomaye/abstrct) with DeepL to generate a parallel Spanish version;
labels were projected using [Easy Label Projection](https://github.com/ikergarcia1996/Easy-Label-Projection) and then manually corrected.
- 📖 Paper: [Crosslingual Argument Mining in the Medical Domain](https://arxiv.org/abs/2301.10527)
- 🌐 Project Website: [https://univ-cotedazur.eu/antidote](https://univ-cotedazur.eu/antidote)
- Code: [https://github.com/ragerri/abstrct-projections/tree/final](https://github.com/ragerri/abstrct-projections/tree/final)
- Funding: CHIST-ERA XAI 2019 call. Antidote (PCI2020-120717-2) funded by MCIN/AEI /10.13039/501100011033 and by European Union NextGenerationEU/PRTR
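A minimal loading sketch, using the config and split names declared in the metadata above:
```python
from datasets import load_dataset

# Splits: neoplasm_{train,dev,test}, glaucoma_test, mixed_test (see the metadata above)
train = load_dataset("HiTZ/AbstRCT-ES", "es", split="neoplasm_train")
glaucoma_test = load_dataset("HiTZ/AbstRCT-ES", "es", split="glaucoma_test")
print(train[0])
```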
## Labels
```python
{
"O": 0,
"B-Claim": 1,
"I-Claim": 2,
"B-Premise": 3,
"I-Premise": 4,
}
```
A `claim` is a concluding statement made by the author about the outcome of the study. In the medical domain it may be an assertion of a diagnosis or a treatment.
A `premise` corresponds to an observation or measurement in the study (ground truth), which supports or attacks another argument component, usually a claim.
Importantly, premises are observed facts and are therefore credible without further evidence.
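As a minimal illustration of the BIO scheme above (the sentence is hypothetical, not drawn from the corpus):
```python
label2id = {"O": 0, "B-Claim": 1, "I-Claim": 2, "B-Premise": 3, "I-Premise": 4}

# Hypothetical tokenized fragment: a premise (observed result) followed by a claim
tokens = ["La", "mortalidad", "se", "redujo", "un", "20%", ".",
          "El", "tratamiento", "es", "eficaz", "."]
labels = ["B-Premise", "I-Premise", "I-Premise", "I-Premise", "I-Premise", "I-Premise", "O",
          "B-Claim", "I-Claim", "I-Claim", "I-Claim", "O"]

ids = [label2id[tag] for tag in labels]
print(list(zip(tokens, ids)))
```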
## Citation
```bibtex
@misc{yeginbergen2024crosslingual,
title={Cross-lingual Argument Mining in the Medical Domain},
author={Anar Yeginbergen and Rodrigo Agerri},
year={2024},
eprint={2301.10527},
archivePrefix={arXiv},
primaryClass={cs.CL}
}
``` |
CyberHarem/caenis_fgo | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of Caenis/カイニス/凯妮斯 (Fate/Grand Order)
This is the dataset of Caenis/カイニス/凯妮斯 (Fate/Grand Order), containing 500 images and their tags.
The core tags of this character are `white_hair, animal_ears, blue_eyes, dark_skin, breasts, dark-skinned_female, large_breasts, long_hair, hair_intakes, bangs, hairband`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan, ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 500 | 644.91 MiB | [Download](https://huggingface.co/datasets/CyberHarem/caenis_fgo/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 500 | 377.17 MiB | [Download](https://huggingface.co/datasets/CyberHarem/caenis_fgo/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 1236 | 807.44 MiB | [Download](https://huggingface.co/datasets/CyberHarem/caenis_fgo/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 500 | 577.59 MiB | [Download](https://huggingface.co/datasets/CyberHarem/caenis_fgo/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 1236 | 1.09 GiB | [Download](https://huggingface.co/datasets/CyberHarem/caenis_fgo/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for loading with [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html). If you need it, just run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/caenis_fgo',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering results; some outfits may be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:----------------------------------|:----------------------------------|:----------------------------------|:----------------------------------|:----------------------------------|:-------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 5 |  |  |  |  |  | 1girl, body_markings, navel, solo, tattoo, black_gloves, elbow_gloves, looking_at_viewer, muscular_female, ponytail, thighhighs, abs, black_bikini, cleavage, sitting |
| 1 | 43 |  |  |  |  |  | 1girl, body_markings, solo, headpiece, tattoo, elbow_gloves, pauldrons, navel, spear, shield, faulds, black_gloves, looking_at_viewer, gauntlets, highleg_bikini, black_thighhighs, black_bikini, cleavage, waist_cape, red_cape, ponytail, thighs, grin, abs |
| 2 | 8 |  |  |  |  |  | 1girl, headpiece, looking_at_viewer, solo, pauldrons, tattoo, body_markings, shield, spear, white_background, bikini, cleavage, grin, open_mouth |
| 3 | 11 |  |  |  |  |  | 1girl, solo, black_bikini, looking_at_viewer, tattoo, body_markings, navel, cleavage, white_background, simple_background, abs, smile, bare_shoulders, dog_tags, highleg_bikini |
| 4 | 5 |  |  |  |  |  | 1girl, bare_shoulders, black_bikini, body_markings, cleavage, eyewear_on_head, grin, looking_at_viewer, solo, sunglasses, collarbone, very_long_hair, black_hairband, thighs, white_nails, wristband |
| 5 | 22 |  |  |  |  |  | 1girl, bare_shoulders, black_bikini, black_hairband, cleavage, collarbone, eyewear_on_head, solo, sunglasses, blue_sky, body_markings, cloud, looking_at_viewer, navel, very_long_hair, day, thighs, smile, white_nails, wristband, beach, ocean, bracelet, covered_nipples, open_mouth, outdoors, thigh_strap |
| 6 | 36 |  |  |  |  |  | 1girl, black_bikini, body_markings, tattoo, denim_shorts, navel, solo, looking_at_viewer, cleavage, highleg_bikini, collarbone, dog_tags, cutoffs, short_shorts, white_jacket, belt, open_jacket, jewelry, long_sleeves, single_thighhigh, smile, off_shoulder, white_nails |
| 7 | 6 |  |  |  |  |  | 1girl, navel, nipples, solo, tattoo, body_markings, completely_nude, collarbone, looking_at_viewer, simple_background, smile |
| 8 | 5 |  |  |  |  |  | bar_censor, elbow_gloves, sweat, 2girls, black_gloves, blonde_hair, interracial, navel, testicles, thighhighs, blush, bottomless, erection, futa_with_futa, large_penis, multiple_penises, tattoo, white_background, cum, futa_with_female, kneeling, muscular_female, smile, standing_sex, stomach_bulge, vaginal |
| 9 | 19 |  |  |  |  |  | rabbit_ears, fake_animal_ears, playboy_bunny, 1girl, bowtie, cleavage, detached_collar, looking_at_viewer, white_leotard, wrist_cuffs, red_pantyhose, solo, fishnet_pantyhose, highleg_leotard, black_gloves, very_long_hair, tail, thighs, strapless_leotard |
| 10 | 5 |  |  |  |  |  | pleated_skirt, school_uniform, 1girl, bag, cellphone, necktie, solo, collared_shirt, looking_at_viewer, single_thighhigh, thighs, white_nails, white_shirt, black_skirt, blue_skirt, cardigan, guitar_case, long_sleeves, underwear, very_long_hair, wristband, yellow_sweater |
| 11 | 7 |  |  |  |  |  | 1girl, juliet_sleeves, maid_headdress, solo, enmaided, maid_apron, black_dress, boots, braid, full_body, looking_at_viewer, clenched_teeth, smile, very_long_hair |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | body_markings | navel | solo | tattoo | black_gloves | elbow_gloves | looking_at_viewer | muscular_female | ponytail | thighhighs | abs | black_bikini | cleavage | sitting | headpiece | pauldrons | spear | shield | faulds | gauntlets | highleg_bikini | black_thighhighs | waist_cape | red_cape | thighs | grin | white_background | bikini | open_mouth | simple_background | smile | bare_shoulders | dog_tags | eyewear_on_head | sunglasses | collarbone | very_long_hair | black_hairband | white_nails | wristband | blue_sky | cloud | day | beach | ocean | bracelet | covered_nipples | outdoors | thigh_strap | denim_shorts | cutoffs | short_shorts | white_jacket | belt | open_jacket | jewelry | long_sleeves | single_thighhigh | off_shoulder | nipples | completely_nude | bar_censor | sweat | 2girls | blonde_hair | interracial | testicles | blush | bottomless | erection | futa_with_futa | large_penis | multiple_penises | cum | futa_with_female | kneeling | standing_sex | stomach_bulge | vaginal | rabbit_ears | fake_animal_ears | playboy_bunny | bowtie | detached_collar | white_leotard | wrist_cuffs | red_pantyhose | fishnet_pantyhose | highleg_leotard | tail | strapless_leotard | pleated_skirt | school_uniform | bag | cellphone | necktie | collared_shirt | white_shirt | black_skirt | blue_skirt | cardigan | guitar_case | underwear | yellow_sweater | juliet_sleeves | maid_headdress | enmaided | maid_apron | black_dress | boots | braid | full_body | clenched_teeth |
|----:|----------:|:----------------------------------|:----------------------------------|:----------------------------------|:----------------------------------|:----------------------------------|:--------|:----------------|:--------|:-------|:---------|:---------------|:---------------|:--------------------|:------------------|:-----------|:-------------|:------|:---------------|:-----------|:----------|:------------|:------------|:--------|:---------|:---------|:------------|:-----------------|:-------------------|:-------------|:-----------|:---------|:-------|:-------------------|:---------|:-------------|:--------------------|:--------|:-----------------|:-----------|:------------------|:-------------|:-------------|:-----------------|:-----------------|:--------------|:------------|:-----------|:--------|:------|:--------|:--------|:-----------|:------------------|:-----------|:--------------|:---------------|:----------|:---------------|:---------------|:-------|:--------------|:----------|:---------------|:-------------------|:---------------|:----------|:------------------|:-------------|:--------|:---------|:--------------|:--------------|:------------|:--------|:-------------|:-----------|:-----------------|:--------------|:-------------------|:------|:-------------------|:-----------|:---------------|:----------------|:----------|:--------------|:-------------------|:----------------|:---------|:------------------|:----------------|:--------------|:----------------|:--------------------|:------------------|:-------|:--------------------|:----------------|:-----------------|:------|:------------|:----------|:-----------------|:--------------|:--------------|:-------------|:-----------|:--------------|:------------|:-----------------|:-----------------|:-----------------|:-----------|:-------------|:--------------|:--------|:--------|:------------|:-----------------|
| 0 | 5 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 1 | 43 |  |  |  |  |  | X | X | X | X | X | X | X | X | | X | | X | X | X | | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 2 | 8 |  |  |  |  |  | X | X | | X | X | | | X | | | | | | X | | X | X | X | X | | | | | | | | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 3 | 11 |  |  |  |  |  | X | X | X | X | X | | | X | | | | X | X | X | | | | | | | | X | | | | | | X | | | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 4 | 5 |  |  |  |  |  | X | X | | X | | | | X | | | | | X | X | | | | | | | | | | | | X | X | | | | | | X | | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 5 | 22 |  |  |  |  |  | X | X | X | X | | | | X | | | | | X | X | | | | | | | | | | | | X | | | | X | | X | X | | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 6 | 36 |  |  |  |  |  | X | X | X | X | X | | | X | | | | | X | X | | | | | | | | X | | | | | | | | | | X | | X | | | X | | | X | | | | | | | | | | | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 7 | 6 |  |  |  |  |  | X | X | X | X | X | | | X | | | | | | | | | | | | | | | | | | | | | | | X | X | | | | | X | | | | | | | | | | | | | | | | | | | | | | | | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 8 | 5 |  |  |  |  |  | | | X | | X | X | X | | X | | X | | | | | | | | | | | | | | | | | X | | | | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 9 | 19 |  |  |  |  |  | X | | | X | | X | | X | | | | | | X | | | | | | | | | | | | X | | | | | | | | | | | | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | |
| 10 | 5 |  |  |  |  |  | X | | | X | | | | X | | | | | | | | | | | | | | | | | | X | | | | | | | | | | | | X | | X | X | | | | | | | | | | | | | | | | | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | |
| 11 | 7 |  |  |  |  |  | X | | | X | | | | X | | | | | | | | | | | | | | | | | | | | | | | | X | | | | | | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | X | X | X | X | X | X | X | X | X |
|
ccibeekeoc42/DollyHHRLHF_igbo | ---
language:
- en
- ig
license: apache-2.0
tags:
- machine-translation
- low-resource-languages
- igbo
- English
---
# DollyHHRLHF English-Igbo Parallel Corpus
## Description
TBD
## Composition
TBD
## Usage
TBD
## Acknowledgments
TBD
## License
The translated datasets are released under Apache 2.0, consistent with the original TinyStories dataset's licensing terms. Please refer to Microsoft's official release for further details on the licensing of the TinyStories dataset.
## About the Authors
[Christopher Ibe](https://www.linkedin.com/in/christopher-ibe-ekeocha/) and [Okezie Okoye](https://www.linkedin.com/in/okezie-okoye-43432b62/) continue to lead Hypa AI towards new frontiers in AI translation. Their dedication to leveraging advanced AI for genuine understanding and connection across language barriers is what sets Hypa AI apart in the field of artificial intelligence.
*Hypa AI* remains steadfast in its mission to pioneer intelligent solutions that are not just technologically advanced but are also culturally aware, ensuring that the future of AI is as diverse and inclusive as the world it serves.
*AfroVoices*, a subsidiary of Hypa AI, is dedicated to amplifying African voices, languages, and cultures in the intelligence age. Focused on bridging the digital representation gap, AfroVoices curates datasets and resources for African languages, promoting inclusivity and cultural appreciation in AI technologies. Their mission goes beyond technological innovation, aiming to celebrate the richness of African linguistic diversity on a global stage. |
hippocrates/HoC_1shot_train | ---
dataset_info:
features:
- name: id
dtype: string
- name: conversations
list:
- name: from
dtype: string
- name: value
dtype: string
- name: text
dtype: string
splits:
- name: train
num_bytes: 7626108
num_examples: 1108
- name: valid
num_bytes: 1074483
num_examples: 157
- name: test
num_bytes: 2154888
num_examples: 315
download_size: 3384419
dataset_size: 10855479
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: valid
path: data/valid-*
- split: test
path: data/test-*
---
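A minimal loading sketch, based on the features and splits declared above:
```python
from datasets import load_dataset

# Splits declared above: train / valid / test
ds = load_dataset("hippocrates/HoC_1shot_train", split="train")
example = ds[0]
# "conversations" is a list of {"from": ..., "value": ...} turns
for turn in example["conversations"]:
    print(turn["from"], ":", turn["value"][:80])
```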
|
spaablauw/FloralMarble_dataset | ---
license: wtfpl
---
35 dataset images for FloralMarble. I originally created an embedding for statues and busts on a colored background, then mixed it with various other embeddings, resulting in this dataset.
Trained for 500 epochs/steps (with 35 images and an effective batch of 7 × 5 = 35, one optimizer step covers exactly one epoch), 4 vectors, batch size of 7, 5 gradient-accumulation steps, learning rate of 0.0025:250,0.001:500.
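The learning-rate string is the piecewise `rate:until_step` syntax; below is a rough sketch of how such a schedule resolves per step (illustrative only, with a hypothetical helper, assuming each rate applies up to its listed step):
```python
def lr_at_step(schedule: str, step: int) -> float:
    """Resolve a piecewise 'rate:until_step' schedule, e.g. '0.0025:250,0.001:500'."""
    pairs = [pair.split(":") for pair in schedule.split(",")]
    for rate, until in pairs:
        if step <= int(until):
            return float(rate)
    return float(pairs[-1][0])  # past the last boundary, keep the final rate

print(lr_at_step("0.0025:250,0.001:500", 100))  # 0.0025
print(lr_at_step("0.0025:250,0.001:500", 400))  # 0.001
```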





|
open-llm-leaderboard/details_lodrick-the-lafted__Platyboros-Instruct-7B | ---
pretty_name: Evaluation run of lodrick-the-lafted/Platyboros-Instruct-7B
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [lodrick-the-lafted/Platyboros-Instruct-7B](https://huggingface.co/lodrick-the-lafted/Platyboros-Instruct-7B)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configuration, each one coresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_lodrick-the-lafted__Platyboros-Instruct-7B\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-02-22T13:53:29.006472](https://huggingface.co/datasets/open-llm-leaderboard/details_lodrick-the-lafted__Platyboros-Instruct-7B/blob/main/results_2024-02-22T13-53-29.006472.json)(note\
\ that their might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6191831487178028,\n\
\ \"acc_stderr\": 0.03277312394406138,\n \"acc_norm\": 0.6232527655604743,\n\
\ \"acc_norm_stderr\": 0.03343651852840389,\n \"mc1\": 0.4369645042839657,\n\
\ \"mc1_stderr\": 0.01736384450319598,\n \"mc2\": 0.6091776098105923,\n\
\ \"mc2_stderr\": 0.015467584736294537\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.5486348122866894,\n \"acc_stderr\": 0.01454210456995527,\n\
\ \"acc_norm\": 0.5776450511945392,\n \"acc_norm_stderr\": 0.014434138713379977\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6351324437363075,\n\
\ \"acc_stderr\": 0.004804091708812544,\n \"acc_norm\": 0.8259310894244174,\n\
\ \"acc_norm_stderr\": 0.0037839381501516165\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695236,\n \
\ \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695236\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.562962962962963,\n\
\ \"acc_stderr\": 0.04284958639753401,\n \"acc_norm\": 0.562962962962963,\n\
\ \"acc_norm_stderr\": 0.04284958639753401\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.6644736842105263,\n \"acc_stderr\": 0.03842498559395268,\n\
\ \"acc_norm\": 0.6644736842105263,\n \"acc_norm_stderr\": 0.03842498559395268\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.56,\n\
\ \"acc_stderr\": 0.04988876515698589,\n \"acc_norm\": 0.56,\n \
\ \"acc_norm_stderr\": 0.04988876515698589\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.6792452830188679,\n \"acc_stderr\": 0.028727502957880274,\n\
\ \"acc_norm\": 0.6792452830188679,\n \"acc_norm_stderr\": 0.028727502957880274\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7083333333333334,\n\
\ \"acc_stderr\": 0.038009680605548594,\n \"acc_norm\": 0.7083333333333334,\n\
\ \"acc_norm_stderr\": 0.038009680605548594\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.45,\n \"acc_stderr\": 0.05,\n \"acc_norm\"\
: 0.45,\n \"acc_norm_stderr\": 0.05\n },\n \"harness|hendrycksTest-college_computer_science|5\"\
: {\n \"acc\": 0.5,\n \"acc_stderr\": 0.050251890762960605,\n \
\ \"acc_norm\": 0.5,\n \"acc_norm_stderr\": 0.050251890762960605\n \
\ },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.38,\n\
\ \"acc_stderr\": 0.04878317312145634,\n \"acc_norm\": 0.38,\n \
\ \"acc_norm_stderr\": 0.04878317312145634\n },\n \"harness|hendrycksTest-college_medicine|5\"\
: {\n \"acc\": 0.5953757225433526,\n \"acc_stderr\": 0.03742461193887248,\n\
\ \"acc_norm\": 0.5953757225433526,\n \"acc_norm_stderr\": 0.03742461193887248\n\
\ },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.43137254901960786,\n\
\ \"acc_stderr\": 0.04928099597287533,\n \"acc_norm\": 0.43137254901960786,\n\
\ \"acc_norm_stderr\": 0.04928099597287533\n },\n \"harness|hendrycksTest-computer_security|5\"\
: {\n \"acc\": 0.77,\n \"acc_stderr\": 0.042295258468165065,\n \
\ \"acc_norm\": 0.77,\n \"acc_norm_stderr\": 0.042295258468165065\n \
\ },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\":\
\ 0.5276595744680851,\n \"acc_stderr\": 0.03263597118409769,\n \"\
acc_norm\": 0.5276595744680851,\n \"acc_norm_stderr\": 0.03263597118409769\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.45614035087719296,\n\
\ \"acc_stderr\": 0.046854730419077895,\n \"acc_norm\": 0.45614035087719296,\n\
\ \"acc_norm_stderr\": 0.046854730419077895\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5793103448275863,\n \"acc_stderr\": 0.0411391498118926,\n\
\ \"acc_norm\": 0.5793103448275863,\n \"acc_norm_stderr\": 0.0411391498118926\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.4021164021164021,\n \"acc_stderr\": 0.025253032554997695,\n \"\
acc_norm\": 0.4021164021164021,\n \"acc_norm_stderr\": 0.025253032554997695\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.4603174603174603,\n\
\ \"acc_stderr\": 0.04458029125470973,\n \"acc_norm\": 0.4603174603174603,\n\
\ \"acc_norm_stderr\": 0.04458029125470973\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.33,\n \"acc_stderr\": 0.047258156262526045,\n \
\ \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.047258156262526045\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\"\
: 0.7064516129032258,\n \"acc_stderr\": 0.02590608702131929,\n \"\
acc_norm\": 0.7064516129032258,\n \"acc_norm_stderr\": 0.02590608702131929\n\
\ },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\"\
: 0.47783251231527096,\n \"acc_stderr\": 0.035145285621750094,\n \"\
acc_norm\": 0.47783251231527096,\n \"acc_norm_stderr\": 0.035145285621750094\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.65,\n \"acc_stderr\": 0.047937248544110196,\n \"acc_norm\"\
: 0.65,\n \"acc_norm_stderr\": 0.047937248544110196\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7636363636363637,\n \"acc_stderr\": 0.03317505930009181,\n\
\ \"acc_norm\": 0.7636363636363637,\n \"acc_norm_stderr\": 0.03317505930009181\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.803030303030303,\n \"acc_stderr\": 0.028335609732463362,\n \"\
acc_norm\": 0.803030303030303,\n \"acc_norm_stderr\": 0.028335609732463362\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.8497409326424871,\n \"acc_stderr\": 0.025787723180723872,\n\
\ \"acc_norm\": 0.8497409326424871,\n \"acc_norm_stderr\": 0.025787723180723872\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.6051282051282051,\n \"acc_stderr\": 0.02478431694215639,\n \
\ \"acc_norm\": 0.6051282051282051,\n \"acc_norm_stderr\": 0.02478431694215639\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.34814814814814815,\n \"acc_stderr\": 0.029045600290616258,\n \
\ \"acc_norm\": 0.34814814814814815,\n \"acc_norm_stderr\": 0.029045600290616258\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.6470588235294118,\n \"acc_stderr\": 0.031041941304059285,\n\
\ \"acc_norm\": 0.6470588235294118,\n \"acc_norm_stderr\": 0.031041941304059285\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.37748344370860926,\n \"acc_stderr\": 0.03958027231121569,\n \"\
acc_norm\": 0.37748344370860926,\n \"acc_norm_stderr\": 0.03958027231121569\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8091743119266055,\n \"acc_stderr\": 0.016847676400091095,\n \"\
acc_norm\": 0.8091743119266055,\n \"acc_norm_stderr\": 0.016847676400091095\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.49537037037037035,\n \"acc_stderr\": 0.03409825519163572,\n \"\
acc_norm\": 0.49537037037037035,\n \"acc_norm_stderr\": 0.03409825519163572\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.7941176470588235,\n \"acc_stderr\": 0.028379449451588663,\n \"\
acc_norm\": 0.7941176470588235,\n \"acc_norm_stderr\": 0.028379449451588663\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.7890295358649789,\n \"acc_stderr\": 0.02655837250266192,\n \
\ \"acc_norm\": 0.7890295358649789,\n \"acc_norm_stderr\": 0.02655837250266192\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6457399103139013,\n\
\ \"acc_stderr\": 0.03210062154134987,\n \"acc_norm\": 0.6457399103139013,\n\
\ \"acc_norm_stderr\": 0.03210062154134987\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.732824427480916,\n \"acc_stderr\": 0.03880848301082394,\n\
\ \"acc_norm\": 0.732824427480916,\n \"acc_norm_stderr\": 0.03880848301082394\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.8181818181818182,\n \"acc_stderr\": 0.03520893951097653,\n \"\
acc_norm\": 0.8181818181818182,\n \"acc_norm_stderr\": 0.03520893951097653\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7407407407407407,\n\
\ \"acc_stderr\": 0.04236511258094633,\n \"acc_norm\": 0.7407407407407407,\n\
\ \"acc_norm_stderr\": 0.04236511258094633\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7361963190184049,\n \"acc_stderr\": 0.03462419931615623,\n\
\ \"acc_norm\": 0.7361963190184049,\n \"acc_norm_stderr\": 0.03462419931615623\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.38392857142857145,\n\
\ \"acc_stderr\": 0.04616143075028547,\n \"acc_norm\": 0.38392857142857145,\n\
\ \"acc_norm_stderr\": 0.04616143075028547\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7766990291262136,\n \"acc_stderr\": 0.04123553189891431,\n\
\ \"acc_norm\": 0.7766990291262136,\n \"acc_norm_stderr\": 0.04123553189891431\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8931623931623932,\n\
\ \"acc_stderr\": 0.020237149008990922,\n \"acc_norm\": 0.8931623931623932,\n\
\ \"acc_norm_stderr\": 0.020237149008990922\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.7,\n \"acc_stderr\": 0.046056618647183814,\n \
\ \"acc_norm\": 0.7,\n \"acc_norm_stderr\": 0.046056618647183814\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.7739463601532567,\n\
\ \"acc_stderr\": 0.014957458504335833,\n \"acc_norm\": 0.7739463601532567,\n\
\ \"acc_norm_stderr\": 0.014957458504335833\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.6907514450867052,\n \"acc_stderr\": 0.024883140570071762,\n\
\ \"acc_norm\": 0.6907514450867052,\n \"acc_norm_stderr\": 0.024883140570071762\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.3240223463687151,\n\
\ \"acc_stderr\": 0.01565254249642112,\n \"acc_norm\": 0.3240223463687151,\n\
\ \"acc_norm_stderr\": 0.01565254249642112\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.696078431372549,\n \"acc_stderr\": 0.026336613469046633,\n\
\ \"acc_norm\": 0.696078431372549,\n \"acc_norm_stderr\": 0.026336613469046633\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7106109324758842,\n\
\ \"acc_stderr\": 0.025755865922632938,\n \"acc_norm\": 0.7106109324758842,\n\
\ \"acc_norm_stderr\": 0.025755865922632938\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.6790123456790124,\n \"acc_stderr\": 0.025976566010862744,\n\
\ \"acc_norm\": 0.6790123456790124,\n \"acc_norm_stderr\": 0.025976566010862744\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.45390070921985815,\n \"acc_stderr\": 0.029700453247291463,\n \
\ \"acc_norm\": 0.45390070921985815,\n \"acc_norm_stderr\": 0.029700453247291463\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.45436766623207303,\n\
\ \"acc_stderr\": 0.012716941720734804,\n \"acc_norm\": 0.45436766623207303,\n\
\ \"acc_norm_stderr\": 0.012716941720734804\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.6433823529411765,\n \"acc_stderr\": 0.029097209568411952,\n\
\ \"acc_norm\": 0.6433823529411765,\n \"acc_norm_stderr\": 0.029097209568411952\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6225490196078431,\n \"acc_stderr\": 0.019610851474880293,\n \
\ \"acc_norm\": 0.6225490196078431,\n \"acc_norm_stderr\": 0.019610851474880293\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6909090909090909,\n\
\ \"acc_stderr\": 0.044262946482000985,\n \"acc_norm\": 0.6909090909090909,\n\
\ \"acc_norm_stderr\": 0.044262946482000985\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.6857142857142857,\n \"acc_stderr\": 0.029719329422417475,\n\
\ \"acc_norm\": 0.6857142857142857,\n \"acc_norm_stderr\": 0.029719329422417475\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8009950248756219,\n\
\ \"acc_stderr\": 0.028231365092758406,\n \"acc_norm\": 0.8009950248756219,\n\
\ \"acc_norm_stderr\": 0.028231365092758406\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.87,\n \"acc_stderr\": 0.033799766898963086,\n \
\ \"acc_norm\": 0.87,\n \"acc_norm_stderr\": 0.033799766898963086\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5240963855421686,\n\
\ \"acc_stderr\": 0.03887971849597264,\n \"acc_norm\": 0.5240963855421686,\n\
\ \"acc_norm_stderr\": 0.03887971849597264\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8070175438596491,\n \"acc_stderr\": 0.030267457554898458,\n\
\ \"acc_norm\": 0.8070175438596491,\n \"acc_norm_stderr\": 0.030267457554898458\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.4369645042839657,\n\
\ \"mc1_stderr\": 0.01736384450319598,\n \"mc2\": 0.6091776098105923,\n\
\ \"mc2_stderr\": 0.015467584736294537\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7813733228097869,\n \"acc_stderr\": 0.011616198215773223\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.43669446550416985,\n \
\ \"acc_stderr\": 0.013661649780905488\n }\n}\n```"
repo_url: https://huggingface.co/lodrick-the-lafted/Platyboros-Instruct-7B
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_02_22T13_53_29.006472
path:
- '**/details_harness|arc:challenge|25_2024-02-22T13-53-29.006472.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-02-22T13-53-29.006472.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_02_22T13_53_29.006472
path:
- '**/details_harness|gsm8k|5_2024-02-22T13-53-29.006472.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-02-22T13-53-29.006472.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_02_22T13_53_29.006472
path:
- '**/details_harness|hellaswag|10_2024-02-22T13-53-29.006472.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-02-22T13-53-29.006472.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_02_22T13_53_29.006472
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-22T13-53-29.006472.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-22T13-53-29.006472.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-22T13-53-29.006472.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-22T13-53-29.006472.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-22T13-53-29.006472.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-22T13-53-29.006472.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-22T13-53-29.006472.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-22T13-53-29.006472.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-22T13-53-29.006472.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-22T13-53-29.006472.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-22T13-53-29.006472.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-22T13-53-29.006472.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-22T13-53-29.006472.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-22T13-53-29.006472.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-22T13-53-29.006472.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-22T13-53-29.006472.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-22T13-53-29.006472.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-22T13-53-29.006472.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-22T13-53-29.006472.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-22T13-53-29.006472.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-22T13-53-29.006472.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-22T13-53-29.006472.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-22T13-53-29.006472.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-22T13-53-29.006472.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-22T13-53-29.006472.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-22T13-53-29.006472.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-22T13-53-29.006472.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-22T13-53-29.006472.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-22T13-53-29.006472.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-22T13-53-29.006472.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-22T13-53-29.006472.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-22T13-53-29.006472.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-22T13-53-29.006472.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-22T13-53-29.006472.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-02-22T13-53-29.006472.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-22T13-53-29.006472.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-22T13-53-29.006472.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-22T13-53-29.006472.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-02-22T13-53-29.006472.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-02-22T13-53-29.006472.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-22T13-53-29.006472.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-22T13-53-29.006472.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-22T13-53-29.006472.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-22T13-53-29.006472.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-22T13-53-29.006472.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-22T13-53-29.006472.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-22T13-53-29.006472.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-22T13-53-29.006472.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-22T13-53-29.006472.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-22T13-53-29.006472.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-22T13-53-29.006472.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-22T13-53-29.006472.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-22T13-53-29.006472.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-02-22T13-53-29.006472.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-22T13-53-29.006472.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-02-22T13-53-29.006472.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-22T13-53-29.006472.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-22T13-53-29.006472.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-22T13-53-29.006472.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-22T13-53-29.006472.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-22T13-53-29.006472.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-22T13-53-29.006472.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-22T13-53-29.006472.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-22T13-53-29.006472.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-22T13-53-29.006472.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-22T13-53-29.006472.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-22T13-53-29.006472.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-22T13-53-29.006472.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-22T13-53-29.006472.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-22T13-53-29.006472.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-22T13-53-29.006472.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-22T13-53-29.006472.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-22T13-53-29.006472.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-22T13-53-29.006472.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-22T13-53-29.006472.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-22T13-53-29.006472.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-22T13-53-29.006472.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-22T13-53-29.006472.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-22T13-53-29.006472.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-22T13-53-29.006472.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-22T13-53-29.006472.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-22T13-53-29.006472.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-22T13-53-29.006472.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-22T13-53-29.006472.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-22T13-53-29.006472.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-22T13-53-29.006472.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-22T13-53-29.006472.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-22T13-53-29.006472.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-22T13-53-29.006472.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-22T13-53-29.006472.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-22T13-53-29.006472.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-02-22T13-53-29.006472.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-22T13-53-29.006472.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-22T13-53-29.006472.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-22T13-53-29.006472.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-02-22T13-53-29.006472.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-02-22T13-53-29.006472.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-22T13-53-29.006472.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-22T13-53-29.006472.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-22T13-53-29.006472.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-22T13-53-29.006472.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-22T13-53-29.006472.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-22T13-53-29.006472.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-22T13-53-29.006472.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-22T13-53-29.006472.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-22T13-53-29.006472.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-22T13-53-29.006472.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-22T13-53-29.006472.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-22T13-53-29.006472.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-22T13-53-29.006472.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-02-22T13-53-29.006472.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-22T13-53-29.006472.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-02-22T13-53-29.006472.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-22T13-53-29.006472.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_02_22T13_53_29.006472
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-22T13-53-29.006472.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-22T13-53-29.006472.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_02_22T13_53_29.006472
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-22T13-53-29.006472.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-22T13-53-29.006472.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_02_22T13_53_29.006472
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-22T13-53-29.006472.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-22T13-53-29.006472.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_02_22T13_53_29.006472
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-22T13-53-29.006472.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-22T13-53-29.006472.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_02_22T13_53_29.006472
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-22T13-53-29.006472.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-22T13-53-29.006472.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_02_22T13_53_29.006472
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-22T13-53-29.006472.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-22T13-53-29.006472.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_02_22T13_53_29.006472
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-22T13-53-29.006472.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-22T13-53-29.006472.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_02_22T13_53_29.006472
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-22T13-53-29.006472.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-22T13-53-29.006472.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_02_22T13_53_29.006472
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-22T13-53-29.006472.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-22T13-53-29.006472.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_02_22T13_53_29.006472
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-22T13-53-29.006472.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-22T13-53-29.006472.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_02_22T13_53_29.006472
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-22T13-53-29.006472.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-22T13-53-29.006472.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_02_22T13_53_29.006472
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-22T13-53-29.006472.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-22T13-53-29.006472.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_02_22T13_53_29.006472
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-22T13-53-29.006472.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-22T13-53-29.006472.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_02_22T13_53_29.006472
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-22T13-53-29.006472.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-22T13-53-29.006472.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_02_22T13_53_29.006472
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-22T13-53-29.006472.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-22T13-53-29.006472.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_02_22T13_53_29.006472
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-22T13-53-29.006472.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-22T13-53-29.006472.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_02_22T13_53_29.006472
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-22T13-53-29.006472.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-22T13-53-29.006472.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_02_22T13_53_29.006472
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-22T13-53-29.006472.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-22T13-53-29.006472.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_02_22T13_53_29.006472
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-22T13-53-29.006472.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-22T13-53-29.006472.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_02_22T13_53_29.006472
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-22T13-53-29.006472.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-22T13-53-29.006472.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_02_22T13_53_29.006472
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-22T13-53-29.006472.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-22T13-53-29.006472.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_02_22T13_53_29.006472
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-22T13-53-29.006472.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-22T13-53-29.006472.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_02_22T13_53_29.006472
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-22T13-53-29.006472.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-22T13-53-29.006472.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_02_22T13_53_29.006472
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-22T13-53-29.006472.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-22T13-53-29.006472.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_02_22T13_53_29.006472
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-22T13-53-29.006472.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-22T13-53-29.006472.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_02_22T13_53_29.006472
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-22T13-53-29.006472.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-22T13-53-29.006472.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_02_22T13_53_29.006472
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-22T13-53-29.006472.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-22T13-53-29.006472.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_02_22T13_53_29.006472
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-22T13-53-29.006472.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-22T13-53-29.006472.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_02_22T13_53_29.006472
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-22T13-53-29.006472.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-22T13-53-29.006472.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_02_22T13_53_29.006472
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-22T13-53-29.006472.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-22T13-53-29.006472.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_02_22T13_53_29.006472
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-22T13-53-29.006472.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-22T13-53-29.006472.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_02_22T13_53_29.006472
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-22T13-53-29.006472.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-22T13-53-29.006472.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_02_22T13_53_29.006472
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-22T13-53-29.006472.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-22T13-53-29.006472.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_02_22T13_53_29.006472
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-22T13-53-29.006472.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-22T13-53-29.006472.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_02_22T13_53_29.006472
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-02-22T13-53-29.006472.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-02-22T13-53-29.006472.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_02_22T13_53_29.006472
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-22T13-53-29.006472.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-22T13-53-29.006472.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_02_22T13_53_29.006472
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-22T13-53-29.006472.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-22T13-53-29.006472.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_02_22T13_53_29.006472
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-22T13-53-29.006472.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-22T13-53-29.006472.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_02_22T13_53_29.006472
path:
- '**/details_harness|hendrycksTest-management|5_2024-02-22T13-53-29.006472.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-02-22T13-53-29.006472.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_02_22T13_53_29.006472
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-02-22T13-53-29.006472.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-02-22T13-53-29.006472.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_02_22T13_53_29.006472
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-22T13-53-29.006472.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-22T13-53-29.006472.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_02_22T13_53_29.006472
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-22T13-53-29.006472.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-22T13-53-29.006472.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_02_22T13_53_29.006472
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-22T13-53-29.006472.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-22T13-53-29.006472.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_02_22T13_53_29.006472
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-22T13-53-29.006472.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-22T13-53-29.006472.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_02_22T13_53_29.006472
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-22T13-53-29.006472.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-22T13-53-29.006472.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_02_22T13_53_29.006472
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-22T13-53-29.006472.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-22T13-53-29.006472.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_02_22T13_53_29.006472
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-22T13-53-29.006472.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-22T13-53-29.006472.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_02_22T13_53_29.006472
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-22T13-53-29.006472.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-22T13-53-29.006472.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_02_22T13_53_29.006472
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-22T13-53-29.006472.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-22T13-53-29.006472.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_02_22T13_53_29.006472
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-22T13-53-29.006472.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-22T13-53-29.006472.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_02_22T13_53_29.006472
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-22T13-53-29.006472.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-22T13-53-29.006472.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_02_22T13_53_29.006472
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-22T13-53-29.006472.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-22T13-53-29.006472.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_02_22T13_53_29.006472
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-22T13-53-29.006472.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-22T13-53-29.006472.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_02_22T13_53_29.006472
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-02-22T13-53-29.006472.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-02-22T13-53-29.006472.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_02_22T13_53_29.006472
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-22T13-53-29.006472.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-22T13-53-29.006472.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_02_22T13_53_29.006472
path:
- '**/details_harness|hendrycksTest-virology|5_2024-02-22T13-53-29.006472.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-02-22T13-53-29.006472.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_02_22T13_53_29.006472
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-22T13-53-29.006472.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-22T13-53-29.006472.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_02_22T13_53_29.006472
path:
- '**/details_harness|truthfulqa:mc|0_2024-02-22T13-53-29.006472.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-02-22T13-53-29.006472.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_02_22T13_53_29.006472
path:
- '**/details_harness|winogrande|5_2024-02-22T13-53-29.006472.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-02-22T13-53-29.006472.parquet'
- config_name: results
data_files:
- split: 2024_02_22T13_53_29.006472
path:
- results_2024-02-22T13-53-29.006472.parquet
- split: latest
path:
- results_2024-02-22T13-53-29.006472.parquet
---
# Dataset Card for Evaluation run of lodrick-the-lafted/Platyboros-Instruct-7B
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [lodrick-the-lafted/Platyboros-Instruct-7B](https://huggingface.co/lodrick-the-lafted/Platyboros-Instruct-7B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can, for instance, do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_lodrick-the-lafted__Platyboros-Instruct-7B",
"harness_winogrande_5",
split="train")
```
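The aggregated metrics described above live in the "results" configuration; they can be loaded the same way (the config and split names are taken from this card's metadata):
```python
from datasets import load_dataset

# "latest" always resolves to the most recent results file for this run.
results = load_dataset(
    "open-llm-leaderboard/details_lodrick-the-lafted__Platyboros-Instruct-7B",
    "results",
    split="latest",
)
print(results[0])
```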
## Latest results
These are the [latest results from run 2024-02-22T13:53:29.006472](https://huggingface.co/datasets/open-llm-leaderboard/details_lodrick-the-lafted__Platyboros-Instruct-7B/blob/main/results_2024-02-22T13-53-29.006472.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each of them in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6191831487178028,
"acc_stderr": 0.03277312394406138,
"acc_norm": 0.6232527655604743,
"acc_norm_stderr": 0.03343651852840389,
"mc1": 0.4369645042839657,
"mc1_stderr": 0.01736384450319598,
"mc2": 0.6091776098105923,
"mc2_stderr": 0.015467584736294537
},
"harness|arc:challenge|25": {
"acc": 0.5486348122866894,
"acc_stderr": 0.01454210456995527,
"acc_norm": 0.5776450511945392,
"acc_norm_stderr": 0.014434138713379977
},
"harness|hellaswag|10": {
"acc": 0.6351324437363075,
"acc_stderr": 0.004804091708812544,
"acc_norm": 0.8259310894244174,
"acc_norm_stderr": 0.0037839381501516165
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695236,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695236
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.562962962962963,
"acc_stderr": 0.04284958639753401,
"acc_norm": 0.562962962962963,
"acc_norm_stderr": 0.04284958639753401
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6644736842105263,
"acc_stderr": 0.03842498559395268,
"acc_norm": 0.6644736842105263,
"acc_norm_stderr": 0.03842498559395268
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.56,
"acc_stderr": 0.04988876515698589,
"acc_norm": 0.56,
"acc_norm_stderr": 0.04988876515698589
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6792452830188679,
"acc_stderr": 0.028727502957880274,
"acc_norm": 0.6792452830188679,
"acc_norm_stderr": 0.028727502957880274
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7083333333333334,
"acc_stderr": 0.038009680605548594,
"acc_norm": 0.7083333333333334,
"acc_norm_stderr": 0.038009680605548594
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.45,
"acc_stderr": 0.05,
"acc_norm": 0.45,
"acc_norm_stderr": 0.05
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.5,
"acc_stderr": 0.050251890762960605,
"acc_norm": 0.5,
"acc_norm_stderr": 0.050251890762960605
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.38,
"acc_stderr": 0.04878317312145634,
"acc_norm": 0.38,
"acc_norm_stderr": 0.04878317312145634
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.5953757225433526,
"acc_stderr": 0.03742461193887248,
"acc_norm": 0.5953757225433526,
"acc_norm_stderr": 0.03742461193887248
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.43137254901960786,
"acc_stderr": 0.04928099597287533,
"acc_norm": 0.43137254901960786,
"acc_norm_stderr": 0.04928099597287533
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.77,
"acc_stderr": 0.042295258468165065,
"acc_norm": 0.77,
"acc_norm_stderr": 0.042295258468165065
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5276595744680851,
"acc_stderr": 0.03263597118409769,
"acc_norm": 0.5276595744680851,
"acc_norm_stderr": 0.03263597118409769
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.45614035087719296,
"acc_stderr": 0.046854730419077895,
"acc_norm": 0.45614035087719296,
"acc_norm_stderr": 0.046854730419077895
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5793103448275863,
"acc_stderr": 0.0411391498118926,
"acc_norm": 0.5793103448275863,
"acc_norm_stderr": 0.0411391498118926
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.4021164021164021,
"acc_stderr": 0.025253032554997695,
"acc_norm": 0.4021164021164021,
"acc_norm_stderr": 0.025253032554997695
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.4603174603174603,
"acc_stderr": 0.04458029125470973,
"acc_norm": 0.4603174603174603,
"acc_norm_stderr": 0.04458029125470973
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.33,
"acc_stderr": 0.047258156262526045,
"acc_norm": 0.33,
"acc_norm_stderr": 0.047258156262526045
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7064516129032258,
"acc_stderr": 0.02590608702131929,
"acc_norm": 0.7064516129032258,
"acc_norm_stderr": 0.02590608702131929
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.47783251231527096,
"acc_stderr": 0.035145285621750094,
"acc_norm": 0.47783251231527096,
"acc_norm_stderr": 0.035145285621750094
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.65,
"acc_stderr": 0.047937248544110196,
"acc_norm": 0.65,
"acc_norm_stderr": 0.047937248544110196
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7636363636363637,
"acc_stderr": 0.03317505930009181,
"acc_norm": 0.7636363636363637,
"acc_norm_stderr": 0.03317505930009181
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.803030303030303,
"acc_stderr": 0.028335609732463362,
"acc_norm": 0.803030303030303,
"acc_norm_stderr": 0.028335609732463362
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8497409326424871,
"acc_stderr": 0.025787723180723872,
"acc_norm": 0.8497409326424871,
"acc_norm_stderr": 0.025787723180723872
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6051282051282051,
"acc_stderr": 0.02478431694215639,
"acc_norm": 0.6051282051282051,
"acc_norm_stderr": 0.02478431694215639
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.34814814814814815,
"acc_stderr": 0.029045600290616258,
"acc_norm": 0.34814814814814815,
"acc_norm_stderr": 0.029045600290616258
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6470588235294118,
"acc_stderr": 0.031041941304059285,
"acc_norm": 0.6470588235294118,
"acc_norm_stderr": 0.031041941304059285
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.37748344370860926,
"acc_stderr": 0.03958027231121569,
"acc_norm": 0.37748344370860926,
"acc_norm_stderr": 0.03958027231121569
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8091743119266055,
"acc_stderr": 0.016847676400091095,
"acc_norm": 0.8091743119266055,
"acc_norm_stderr": 0.016847676400091095
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.49537037037037035,
"acc_stderr": 0.03409825519163572,
"acc_norm": 0.49537037037037035,
"acc_norm_stderr": 0.03409825519163572
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7941176470588235,
"acc_stderr": 0.028379449451588663,
"acc_norm": 0.7941176470588235,
"acc_norm_stderr": 0.028379449451588663
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7890295358649789,
"acc_stderr": 0.02655837250266192,
"acc_norm": 0.7890295358649789,
"acc_norm_stderr": 0.02655837250266192
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6457399103139013,
"acc_stderr": 0.03210062154134987,
"acc_norm": 0.6457399103139013,
"acc_norm_stderr": 0.03210062154134987
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.732824427480916,
"acc_stderr": 0.03880848301082394,
"acc_norm": 0.732824427480916,
"acc_norm_stderr": 0.03880848301082394
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.8181818181818182,
"acc_stderr": 0.03520893951097653,
"acc_norm": 0.8181818181818182,
"acc_norm_stderr": 0.03520893951097653
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7407407407407407,
"acc_stderr": 0.04236511258094633,
"acc_norm": 0.7407407407407407,
"acc_norm_stderr": 0.04236511258094633
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7361963190184049,
"acc_stderr": 0.03462419931615623,
"acc_norm": 0.7361963190184049,
"acc_norm_stderr": 0.03462419931615623
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.38392857142857145,
"acc_stderr": 0.04616143075028547,
"acc_norm": 0.38392857142857145,
"acc_norm_stderr": 0.04616143075028547
},
"harness|hendrycksTest-management|5": {
"acc": 0.7766990291262136,
"acc_stderr": 0.04123553189891431,
"acc_norm": 0.7766990291262136,
"acc_norm_stderr": 0.04123553189891431
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8931623931623932,
"acc_stderr": 0.020237149008990922,
"acc_norm": 0.8931623931623932,
"acc_norm_stderr": 0.020237149008990922
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.7,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.7,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.7739463601532567,
"acc_stderr": 0.014957458504335833,
"acc_norm": 0.7739463601532567,
"acc_norm_stderr": 0.014957458504335833
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.6907514450867052,
"acc_stderr": 0.024883140570071762,
"acc_norm": 0.6907514450867052,
"acc_norm_stderr": 0.024883140570071762
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.3240223463687151,
"acc_stderr": 0.01565254249642112,
"acc_norm": 0.3240223463687151,
"acc_norm_stderr": 0.01565254249642112
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.696078431372549,
"acc_stderr": 0.026336613469046633,
"acc_norm": 0.696078431372549,
"acc_norm_stderr": 0.026336613469046633
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7106109324758842,
"acc_stderr": 0.025755865922632938,
"acc_norm": 0.7106109324758842,
"acc_norm_stderr": 0.025755865922632938
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.6790123456790124,
"acc_stderr": 0.025976566010862744,
"acc_norm": 0.6790123456790124,
"acc_norm_stderr": 0.025976566010862744
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.45390070921985815,
"acc_stderr": 0.029700453247291463,
"acc_norm": 0.45390070921985815,
"acc_norm_stderr": 0.029700453247291463
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.45436766623207303,
"acc_stderr": 0.012716941720734804,
"acc_norm": 0.45436766623207303,
"acc_norm_stderr": 0.012716941720734804
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6433823529411765,
"acc_stderr": 0.029097209568411952,
"acc_norm": 0.6433823529411765,
"acc_norm_stderr": 0.029097209568411952
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6225490196078431,
"acc_stderr": 0.019610851474880293,
"acc_norm": 0.6225490196078431,
"acc_norm_stderr": 0.019610851474880293
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6909090909090909,
"acc_stderr": 0.044262946482000985,
"acc_norm": 0.6909090909090909,
"acc_norm_stderr": 0.044262946482000985
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.6857142857142857,
"acc_stderr": 0.029719329422417475,
"acc_norm": 0.6857142857142857,
"acc_norm_stderr": 0.029719329422417475
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8009950248756219,
"acc_stderr": 0.028231365092758406,
"acc_norm": 0.8009950248756219,
"acc_norm_stderr": 0.028231365092758406
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.87,
"acc_stderr": 0.033799766898963086,
"acc_norm": 0.87,
"acc_norm_stderr": 0.033799766898963086
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5240963855421686,
"acc_stderr": 0.03887971849597264,
"acc_norm": 0.5240963855421686,
"acc_norm_stderr": 0.03887971849597264
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8070175438596491,
"acc_stderr": 0.030267457554898458,
"acc_norm": 0.8070175438596491,
"acc_norm_stderr": 0.030267457554898458
},
"harness|truthfulqa:mc|0": {
"mc1": 0.4369645042839657,
"mc1_stderr": 0.01736384450319598,
"mc2": 0.6091776098105923,
"mc2_stderr": 0.015467584736294537
},
"harness|winogrande|5": {
"acc": 0.7813733228097869,
"acc_stderr": 0.011616198215773223
},
"harness|gsm8k|5": {
"acc": 0.43669446550416985,
"acc_stderr": 0.013661649780905488
}
}
```
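For orientation, the top-level "all" block is consistent with an unweighted mean over the per-task metrics; a minimal sketch of that aggregation (assuming the dict above is bound to a variable `results`; this mirrors, but is not, the leaderboard's own code):
```python
# Average the per-task accuracies, skipping the precomputed "all" entry.
# Assumes the JSON above has been loaded into `results` (e.g. via json.loads).
per_task = {name: m for name, m in results.items() if name != "all"}

accs = [m["acc"] for m in per_task.values() if "acc" in m]
print(sum(accs) / len(accs))  # should land close to results["all"]["acc"]
```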
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
lmms-lab/GQA | ---
license: mit
dataset_info:
- config_name: challenge_all_images
features:
- name: id
dtype: string
- name: image
dtype: image
splits:
- name: challenge
num_bytes: 261636425.25
num_examples: 1590
download_size: 261271928
dataset_size: 261636425.25
- config_name: challenge_all_instructions
features:
- name: id
dtype: string
- name: imageId
dtype: string
- name: question
dtype: string
- name: isBalanced
dtype: bool
splits:
- name: challenge
num_bytes: 50797705
num_examples: 713449
download_size: 19869828
dataset_size: 50797705
- config_name: challenge_balanced_images
features:
- name: id
dtype: string
- name: image
dtype: image
splits:
- name: challenge
num_bytes: 261636425.25
num_examples: 1590
download_size: 261333538
dataset_size: 261636425.25
- config_name: challenge_balanced_instructions
features:
- name: id
dtype: string
- name: imageId
dtype: string
- name: question
dtype: string
- name: isBalanced
dtype: bool
splits:
- name: challenge
num_bytes: 3523973
num_examples: 50726
download_size: 1787024
dataset_size: 3523973
- config_name: submission_all_images
features:
- name: id
dtype: string
- name: image
dtype: image
splits:
- name: submission
num_bytes: 2314978438.875
num_examples: 15545
download_size: 2309217874
dataset_size: 2314978438.875
- config_name: submission_all_instructions
features:
- name: id
dtype: string
- name: imageId
dtype: string
- name: question
dtype: string
- name: isBalanced
dtype: bool
splits:
- name: submission
num_bytes: 298875520
num_examples: 4237524
download_size: 121458425
dataset_size: 298875520
- config_name: test_all_images
features:
- name: id
dtype: string
- name: image
dtype: image
splits:
- name: test
num_bytes: 492571840.875
num_examples: 2993
download_size: 491611526
dataset_size: 492571840.875
- config_name: test_all_instructions
features:
- name: id
dtype: string
- name: imageId
dtype: string
- name: question
dtype: string
- name: isBalanced
dtype: bool
splits:
- name: test
num_bytes: 95588974
num_examples: 1340048
download_size: 39561711
dataset_size: 95588974
- config_name: test_balanced_images
features:
- name: id
dtype: string
- name: image
dtype: image
splits:
- name: test
num_bytes: 491210370.625
num_examples: 2987
download_size: 490293506
dataset_size: 491210370.625
- config_name: test_balanced_instructions
features:
- name: id
dtype: string
- name: imageId
dtype: string
- name: question
dtype: string
- name: isBalanced
dtype: bool
splits:
- name: test
num_bytes: 6622775
num_examples: 95336
download_size: 3401070
dataset_size: 6622775
- config_name: testdev_all_images
features:
- name: id
dtype: string
- name: image
dtype: image
splits:
- name: testdev
num_bytes: 65779269.0
num_examples: 398
download_size: 65670255
dataset_size: 65779269.0
- config_name: testdev_all_instructions
features:
- name: id
dtype: string
- name: imageId
dtype: string
- name: question
dtype: string
- name: answer
dtype: string
- name: fullAnswer
dtype: string
- name: isBalanced
dtype: bool
- name: groups
struct:
- name: global
dtype: string
- name: local
dtype: string
- name: entailed
dtype: string
- name: equivalent
dtype: string
- name: types
struct:
- name: structural
dtype: string
- name: semantic
dtype: string
- name: detailed
dtype: string
- name: annotations
sequence:
- name: question
struct:
- name: objectId
dtype: string
- name: value
dtype: string
- name: answer
struct:
- name: objectId
dtype: string
- name: value
dtype: string
- name: fullAnswer
struct:
- name: objectId
dtype: string
- name: value
dtype: string
- name: semantic
list:
- name: operation
dtype: string
- name: argument
dtype: string
- name: dependencies
sequence: int32
- name: semanticStr
dtype: string
splits:
- name: testdev
num_bytes: 86970760
num_examples: 172174
download_size: 23385535
dataset_size: 86970760
- config_name: testdev_balanced_images
features:
- name: id
dtype: string
- name: image
dtype: image
splits:
- name: testdev
num_bytes: 65779269.0
num_examples: 398
download_size: 65647745
dataset_size: 65779269.0
- config_name: testdev_balanced_instructions
features:
- name: id
dtype: string
- name: imageId
dtype: string
- name: question
dtype: string
- name: answer
dtype: string
- name: fullAnswer
dtype: string
- name: isBalanced
dtype: bool
- name: groups
struct:
- name: global
dtype: string
- name: local
dtype: string
- name: entailed
dtype: string
- name: equivalent
dtype: string
- name: types
struct:
- name: structural
dtype: string
- name: semantic
dtype: string
- name: detailed
dtype: string
- name: annotations
sequence:
- name: question
struct:
- name: objectId
dtype: string
- name: value
dtype: string
- name: answer
struct:
- name: objectId
dtype: string
- name: value
dtype: string
- name: fullAnswer
struct:
- name: objectId
dtype: string
- name: value
dtype: string
- name: semantic
list:
- name: operation
dtype: string
- name: argument
dtype: string
- name: dependencies
sequence: int32
- name: semanticStr
dtype: string
splits:
- name: testdev
num_bytes: 6113469
num_examples: 12578
download_size: 2090335
dataset_size: 6113469
- config_name: train_all_images
features:
- name: id
dtype: string
- name: image
dtype: image
splits:
- name: train
num_bytes: 10509758457.0
num_examples: 74256
download_size: 10480239090
dataset_size: 10509758457.0
- config_name: train_all_instructions
features:
- name: id
dtype: string
- name: imageId
dtype: string
- name: question
dtype: string
- name: answer
dtype: string
- name: fullAnswer
dtype: string
- name: isBalanced
dtype: bool
- name: groups
struct:
- name: global
dtype: string
- name: local
dtype: string
- name: entailed
dtype: string
- name: equivalent
dtype: string
- name: types
struct:
- name: structural
dtype: string
- name: semantic
dtype: string
- name: detailed
dtype: string
- name: annotations
sequence:
- name: question
struct:
- name: objectId
dtype: string
- name: value
dtype: string
- name: answer
struct:
- name: objectId
dtype: string
- name: value
dtype: string
- name: fullAnswer
struct:
- name: objectId
dtype: string
- name: value
dtype: string
- name: semantic
list:
- name: operation
dtype: string
- name: argument
dtype: string
- name: dependencies
sequence: int32
- name: semanticStr
dtype: string
splits:
- name: train
num_bytes: 6891129609
num_examples: 14305356
download_size: 1874173198
dataset_size: 6891129609
- config_name: train_balanced_images
features:
- name: id
dtype: string
- name: image
dtype: image
splits:
- name: train
num_bytes: 10200292415.5
num_examples: 72140
download_size: 10171627271
dataset_size: 10200292415.5
- config_name: train_balanced_instructions
features:
- name: id
dtype: string
- name: imageId
dtype: string
- name: question
dtype: string
- name: answer
dtype: string
- name: fullAnswer
dtype: string
- name: isBalanced
dtype: bool
- name: groups
struct:
- name: global
dtype: string
- name: local
dtype: string
- name: entailed
dtype: string
- name: equivalent
dtype: string
- name: types
struct:
- name: structural
dtype: string
- name: semantic
dtype: string
- name: detailed
dtype: string
- name: annotations
sequence:
- name: question
struct:
- name: objectId
dtype: string
- name: value
dtype: string
- name: answer
struct:
- name: objectId
dtype: string
- name: value
dtype: string
- name: fullAnswer
struct:
- name: objectId
dtype: string
- name: value
dtype: string
- name: semantic
list:
- name: operation
dtype: string
- name: argument
dtype: string
- name: dependencies
sequence: int32
- name: semanticStr
dtype: string
splits:
- name: train
num_bytes: 460429581
num_examples: 943000
download_size: 183979778
dataset_size: 460429581
- config_name: val_all_images
features:
- name: id
dtype: string
- name: image
dtype: image
splits:
- name: val
num_bytes: 1494990904.5
num_examples: 10564
download_size: 1490744689
dataset_size: 1494990904.5
- config_name: val_all_instructions
features:
- name: id
dtype: string
- name: imageId
dtype: string
- name: question
dtype: string
- name: answer
dtype: string
- name: fullAnswer
dtype: string
- name: isBalanced
dtype: bool
- name: groups
struct:
- name: global
dtype: string
- name: local
dtype: string
- name: entailed
dtype: string
- name: equivalent
dtype: string
- name: types
struct:
- name: structural
dtype: string
- name: semantic
dtype: string
- name: detailed
dtype: string
- name: annotations
sequence:
- name: question
struct:
- name: objectId
dtype: string
- name: value
dtype: string
- name: answer
struct:
- name: objectId
dtype: string
- name: value
dtype: string
- name: fullAnswer
struct:
- name: objectId
dtype: string
- name: value
dtype: string
- name: semantic
list:
- name: operation
dtype: string
- name: argument
dtype: string
- name: dependencies
sequence: int32
- name: semanticStr
dtype: string
splits:
- name: val
num_bytes: 967338322
num_examples: 2011853
download_size: 266476025
dataset_size: 967338322
- config_name: val_balanced_images
features:
- name: id
dtype: string
- name: image
dtype: image
splits:
- name: val
num_bytes: 1447074448.75
num_examples: 10234
download_size: 1443033919
dataset_size: 1447074448.75
- config_name: val_balanced_instructions
features:
- name: id
dtype: string
- name: imageId
dtype: string
- name: question
dtype: string
- name: answer
dtype: string
- name: fullAnswer
dtype: string
- name: isBalanced
dtype: bool
- name: groups
struct:
- name: global
dtype: string
- name: local
dtype: string
- name: entailed
dtype: string
- name: equivalent
dtype: string
- name: types
struct:
- name: structural
dtype: string
- name: semantic
dtype: string
- name: detailed
dtype: string
- name: annotations
sequence:
- name: question
struct:
- name: objectId
dtype: string
- name: value
dtype: string
- name: answer
struct:
- name: objectId
dtype: string
- name: value
dtype: string
- name: fullAnswer
struct:
- name: objectId
dtype: string
- name: value
dtype: string
- name: semantic
list:
- name: operation
dtype: string
- name: argument
dtype: string
- name: dependencies
sequence: int32
- name: semanticStr
dtype: string
splits:
- name: val
num_bytes: 64498952
num_examples: 132062
download_size: 25794272
dataset_size: 64498952
configs:
- config_name: challenge_all_images
data_files:
- split: challenge
path: challenge_all_images/challenge-*
- config_name: challenge_all_instructions
data_files:
- split: challenge
path: challenge_all_instructions/challenge-*
- config_name: challenge_balanced_images
data_files:
- split: challenge
path: challenge_balanced_images/challenge-*
- config_name: challenge_balanced_instructions
data_files:
- split: challenge
path: challenge_balanced_instructions/challenge-*
- config_name: submission_all_images
data_files:
- split: submission
path: submission_all_images/submission-*
- config_name: submission_all_instructions
data_files:
- split: submission
path: submission_all_instructions/submission-*
- config_name: test_all_images
data_files:
- split: test
path: test_all_images/test-*
- config_name: test_all_instructions
data_files:
- split: test
path: test_all_instructions/test-*
- config_name: test_balanced_images
data_files:
- split: test
path: test_balanced_images/test-*
- config_name: test_balanced_instructions
data_files:
- split: test
path: test_balanced_instructions/test-*
- config_name: testdev_all_images
data_files:
- split: testdev
path: testdev_all_images/testdev-*
- config_name: testdev_all_instructions
data_files:
- split: testdev
path: testdev_all_instructions/testdev-*
- config_name: testdev_balanced_images
data_files:
- split: testdev
path: testdev_balanced_images/testdev-*
- config_name: testdev_balanced_instructions
data_files:
- split: testdev
path: testdev_balanced_instructions/testdev-*
- config_name: train_all_images
data_files:
- split: train
path: train_all_images/train-*
- config_name: train_all_instructions
data_files:
- split: train
path: train_all_instructions/train-*
- config_name: train_balanced_images
data_files:
- split: train
path: train_balanced_images/train-*
- config_name: train_balanced_instructions
data_files:
- split: train
path: train_balanced_instructions/train-*
- config_name: val_all_images
data_files:
- split: val
path: val_all_images/val-*
- config_name: val_all_instructions
data_files:
- split: val
path: val_all_instructions/val-*
- config_name: val_balanced_images
data_files:
- split: val
path: val_balanced_images/val-*
- config_name: val_balanced_instructions
data_files:
- split: val
path: val_balanced_instructions/val-*
---
<p align="center" width="100%">
<img src="https://i.postimg.cc/g0QRgMVv/WX20240228-113337-2x.png" width="100%" height="80%">
</p>
# Large-scale Multi-modality Models Evaluation Suite
> Accelerating the development of large-scale multi-modality models (LMMs) with `lmms-eval`
🏠 [Homepage](https://lmms-lab.github.io/) | 📚 [Documentation](docs/README.md) | 🤗 [Huggingface Datasets](https://huggingface.co/lmms-lab)
# This Dataset
This is a formatted version of [GQA](https://cs.stanford.edu/people/dorarad/gqa/about.html). It is used in our `lmms-eval` pipeline to allow for one-click evaluations of large multi-modality models.
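For example, you can load one of the configurations listed in the metadata above directly with `datasets` — a minimal sketch (any other `split_balance_type` config works the same way):
```python
from datasets import load_dataset

# Question/answer configs and image configs are paired by image id.
questions = load_dataset("lmms-lab/GQA", "testdev_balanced_instructions", split="testdev")
print(questions[0]["question"], "->", questions[0]["answer"])

images = load_dataset("lmms-lab/GQA", "testdev_balanced_images", split="testdev")
print(images[0]["id"], images[0]["image"].size)  # each record holds a PIL image
```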
```
@inproceedings{hudson2019gqa,
title={Gqa: A new dataset for real-world visual reasoning and compositional question answering},
author={Hudson, Drew A and Manning, Christopher D},
booktitle={Proceedings of the IEEE/CVF conference on computer vision and pattern recognition},
pages={6700--6709},
year={2019}
}
``` |
tubasid/toy-car-annotation-YOLO | ---
license: apache-2.0
task_categories:
- image-classification
language:
- en
tags:
- yolo
- opensource
- computervision
- imageprocessing
- yolov3
- yolov4
- labelimg
pretty_name: ToyCarAnnotation
size_categories:
- n<1K
---
Hey everyone,
In my final year project, I created a **Smart Traffic Management System**.
The project was to manage traffic lights' delays based on the number of vehicles on the road.
I made everything work using a Raspberry Pi and pre-recorded videos, but since it was a "final year project", it needed to be tested by changing videos frequently, which was a bit of a hassle. Collecting tons of videos and loading them onto the Pi was not too hard, but it would have cost time, changing the video names in the code every time. Also, it was not possible to implement it for real *(unless the govt. had permitted me, hehe)*. So I chose to showcase my work by making a beautiful prototype.
[](https://postimg.cc/jDBySvRP)
I know, the image isn't so appealing, I apologise for that, but you got the idea, right?
I placed my cars on tracks and took real-time video of the lanes from the two cameras attached to two big sticks.
***Why only two cameras when there are four roads?***
Raspberry Pi supports only two cameras. In my case, the indexes were 0 and 2.
But to make things work as I had planned, I cropped images for each lane.
***What does it mean?***
Let us take one camera and the respective two roads as an example.
I took the real-time video and performed image framing on it. Since the roads beneath the cars were supposed to be still *(obvio, cars move, not roads :>)*, I performed image framing after every 2 seconds of the video. The frames were cropped and then saved on the Pi: I resized each image, found the coordinates at which the two roads separated, cropped the image at those coordinates, and got 2 images of 2 separate roads from 1 camera.
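Roughly, the cropping loop looked like this — a minimal sketch, where the camera index and the split x-coordinate are placeholders (your values will differ):
```python
import time
import cv2

SPLIT_X = 320                 # hypothetical x-coordinate where the two roads separate
FRAME_W, FRAME_H = 640, 480   # hypothetical working resolution

cap = cv2.VideoCapture(0)     # one of the Pi's camera indexes (0 or 2 in my case)

frame_no = 0
while cap.isOpened():
    ok, frame = cap.read()
    if not ok:
        break
    frame = cv2.resize(frame, (FRAME_W, FRAME_H))
    # One camera sees two roads, so slice the frame at the split coordinate.
    cv2.imwrite(f"road_a_{frame_no}.jpg", frame[:, :SPLIT_X])
    cv2.imwrite(f"road_b_{frame_no}.jpg", frame[:, SPLIT_X:])
    frame_no += 1
    time.sleep(2)             # the roads stay still, so a frame every 2 s is enough

cap.release()
```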
Finally, I ran my code and found it could only detect a few cars. I thought real and toy cars looked quite similar, but the model didn't think the same. My YOLO weight file was trained on real cars, and now I had to do the training again.
I looked for datasets already available but couldn't find any. So I decided to make one.
I collected images from different web sources and performed the most important task on each of them: ***ANNOTATION***, using LabelImg.
I separately annotated around 1000 images in YOLO format, did all the processing, and created this dataset. Usually, for YOLO especially, you can get pictures on the internet but not the text files. You have to individually annotate all of them. It takes time, and there isn't any tool to do it in bulk, because you have to tell it exactly where the cars are in each picture. Maybe in the future LabelImg will get updated with some machine learning algorithm for detecting and annotating images automatically (who knows).
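For anyone new to the format: a YOLO label is just a text file with one line per object, holding the class id and a normalized bounding box. A tiny sketch of reading one (the numbers are hypothetical):
```python
# One line per object in a YOLO .txt label file:
#   <class_id> <x_center> <y_center> <width> <height>   (coordinates normalized to [0, 1])
line = "0 0.512 0.634 0.210 0.180"   # hypothetical single-car annotation
cls_id, xc, yc, w, h = line.split()
print(f"class {cls_id}: box centered at ({float(xc):.2f}, {float(yc):.2f}), "
      f"covering {float(w):.0%} x {float(h):.0%} of the image")
```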
So here it is for your help.
I will be adding the notebook as well in some time.
Any questions? Drop them below. Do like if it's helpful.
***You can find me on:***
[https://www.github.com/tubasid](https://www.github.com/tubasid)
[https://www.linkedin.com/in/tubasid](https://www.linkedin.com/in/tubasid)
[https://www.twitter.com/in/tubaasid](https://www.twitter.com/in/tubaasid)
[https://www.discord.com/channels/@tubasid](https://www.discord.com/channels/@tubasid)
Until next post.
***TubaSid***
|
botbot-ai/biology-ptbr | ---
license: cc-by-nc-4.0
language:
- pt
tags:
- instruction-finetuning
pretty_name: CAMEL Biology PTBR
task_categories:
- text-generation
---
## Translation of the CAMEL Biology dataset into Portuguese (PT-BR) using NLLB 3.3B.
# **CAMEL: Communicative Agents for “Mind” Exploration of Large Scale Language Model Society**
- **Github:** https://github.com/lightaime/camel
- **Website:** https://www.camel-ai.org/
- **Arxiv Paper:** https://arxiv.org/abs/2303.17760
## Dataset Summary
The biology dataset is composed of 20K problem-solution pairs obtained using GPT-4. The problem-solution pairs were generated from 25 biology topics, with 25 subtopics for each topic and 32 problems for each (topic, subtopic) pair.
We provide the data in `biology.zip`.
## Data Fields
**The data fields for files in `biology.zip` are as follows:**
* `role_1`: assistant role
* `topic`: biology topic
* `sub_topic`: biology subtopic belonging to topic
* `message_1`: refers to the problem the assistant is asked to solve.
* `message_2`: refers to the solution provided by the assistant.
**Download in Python**
```python
from huggingface_hub import hf_hub_download
hf_hub_download(repo_id="camel-ai/biology", repo_type="dataset", filename="biology.zip",
local_dir="datasets/", local_dir_use_symlinks=False)
```
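Once downloaded, the archive can be read directly. A minimal sketch, assuming the zip holds one JSON file per problem-solution pair with the fields listed above (this mirrors the original CAMEL release; adjust if the layout differs):
```python
import json
import zipfile

with zipfile.ZipFile("datasets/biology.zip") as zf:
    # Assumption: one JSON file per problem-solution pair, as in the original CAMEL release.
    json_names = [n for n in zf.namelist() if n.endswith(".json")]
    with zf.open(json_names[0]) as f:
        pair = json.load(f)

# Fields follow the schema above: role_1, topic, sub_topic, message_1, message_2.
print(pair["topic"], "/", pair["sub_topic"])
print(pair["message_1"])
```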
### Citation
```
@misc{li2023camel,
title={CAMEL: Communicative Agents for "Mind" Exploration of Large Scale Language Model Society},
author={Guohao Li and Hasan Abed Al Kader Hammoud and Hani Itani and Dmitrii Khizbullin and Bernard Ghanem},
year={2023},
eprint={2303.17760},
archivePrefix={arXiv},
primaryClass={cs.AI}
}
```
## Disclaimer:
This data was synthetically generated by GPT-4 and might contain incorrect information. The dataset is provided for research purposes only.
|
open-llm-leaderboard/details_chargoddard__Chronorctypus-Limarobormes-13b | ---
pretty_name: Evaluation run of chargoddard/Chronorctypus-Limarobormes-13b
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [chargoddard/Chronorctypus-Limarobormes-13b](https://huggingface.co/chargoddard/Chronorctypus-Limarobormes-13b)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 3 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_chargoddard__Chronorctypus-Limarobormes-13b\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-10-17T10:27:33.460587](https://huggingface.co/datasets/open-llm-leaderboard/details_chargoddard__Chronorctypus-Limarobormes-13b/blob/main/results_2023-10-17T10-27-33.460587.json) (note\
\ that there might be results for other tasks in the repo if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.05169882550335571,\n\
\ \"em_stderr\": 0.0022675304823078276,\n \"f1\": 0.17888317953020105,\n\
\ \"f1_stderr\": 0.0028882183973903902,\n \"acc\": 0.39147173871286817,\n\
\ \"acc_stderr\": 0.008785918503769254\n },\n \"harness|drop|3\": {\n\
\ \"em\": 0.05169882550335571,\n \"em_stderr\": 0.0022675304823078276,\n\
\ \"f1\": 0.17888317953020105,\n \"f1_stderr\": 0.0028882183973903902\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.03866565579984837,\n \
\ \"acc_stderr\": 0.005310583162098035\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.744277821625888,\n \"acc_stderr\": 0.012261253845440473\n\
\ }\n}\n```"
repo_url: https://huggingface.co/chargoddard/Chronorctypus-Limarobormes-13b
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_drop_3
data_files:
- split: 2023_10_17T10_27_33.460587
path:
- '**/details_harness|drop|3_2023-10-17T10-27-33.460587.parquet'
- split: latest
path:
- '**/details_harness|drop|3_2023-10-17T10-27-33.460587.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_10_17T10_27_33.460587
path:
- '**/details_harness|gsm8k|5_2023-10-17T10-27-33.460587.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-10-17T10-27-33.460587.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_10_17T10_27_33.460587
path:
- '**/details_harness|winogrande|5_2023-10-17T10-27-33.460587.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-10-17T10-27-33.460587.parquet'
- config_name: results
data_files:
- split: 2023_10_17T10_27_33.460587
path:
- results_2023-10-17T10-27-33.460587.parquet
- split: latest
path:
- results_2023-10-17T10-27-33.460587.parquet
---
# Dataset Card for Evaluation run of chargoddard/Chronorctypus-Limarobormes-13b
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/chargoddard/Chronorctypus-Limarobormes-13b
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [chargoddard/Chronorctypus-Limarobormes-13b](https://huggingface.co/chargoddard/Chronorctypus-Limarobormes-13b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 3 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_chargoddard__Chronorctypus-Limarobormes-13b",
"harness_winogrande_5",
split="train")
```
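The aggregated metrics can be loaded the same way via the "results" configuration mentioned above (its `latest` split is declared in the metadata):
```python
from datasets import load_dataset

# "results" stores the aggregated metrics; the "latest" split points to the newest run.
results = load_dataset(
    "open-llm-leaderboard/details_chargoddard__Chronorctypus-Limarobormes-13b",
    "results",
    split="latest",
)
```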
## Latest results
These are the [latest results from run 2023-10-17T10:27:33.460587](https://huggingface.co/datasets/open-llm-leaderboard/details_chargoddard__Chronorctypus-Limarobormes-13b/blob/main/results_2023-10-17T10-27-33.460587.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"em": 0.05169882550335571,
"em_stderr": 0.0022675304823078276,
"f1": 0.17888317953020105,
"f1_stderr": 0.0028882183973903902,
"acc": 0.39147173871286817,
"acc_stderr": 0.008785918503769254
},
"harness|drop|3": {
"em": 0.05169882550335571,
"em_stderr": 0.0022675304823078276,
"f1": 0.17888317953020105,
"f1_stderr": 0.0028882183973903902
},
"harness|gsm8k|5": {
"acc": 0.03866565579984837,
"acc_stderr": 0.005310583162098035
},
"harness|winogrande|5": {
"acc": 0.744277821625888,
"acc_stderr": 0.012261253845440473
}
}
```
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
xixixi/images | ---
license: other
---
|
jiangyige/5_types_paraphrased_sentence_pairs | ---
license: openrail
---
|
NickKolok/regs-sunshinemix | ---
license: agpl-3.0
---
|
tyzhu/find_first_sent_train_30_eval_10_baseline | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
dataset_info:
features:
- name: inputs
dtype: string
- name: targets
dtype: string
- name: title
dtype: string
- name: context
dtype: string
splits:
- name: train
num_bytes: 49979
num_examples: 30
- name: validation
num_bytes: 18259
num_examples: 10
download_size: 0
dataset_size: 68238
---
# Dataset Card for "find_first_sent_train_30_eval_10_baseline"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
autoevaluate/autoeval-eval-squad-plain_text-f76498-1781661804 | ---
type: predictions
tags:
- autotrain
- evaluation
datasets:
- squad
eval_info:
task: extractive_question_answering
model: csarron/bert-base-uncased-squad-v1
metrics: []
dataset_name: squad
dataset_config: plain_text
dataset_split: validation
col_mapping:
context: context
question: question
answers-text: answers.text
answers-answer_start: answers.answer_start
---
# Dataset Card for AutoTrain Evaluator
This repository contains model predictions generated by [AutoTrain](https://huggingface.co/autotrain) for the following task and dataset:
* Task: Question Answering
* Model: csarron/bert-base-uncased-squad-v1
* Dataset: squad
* Config: plain_text
* Split: validation
To run new evaluation jobs, visit Hugging Face's [automatic model evaluator](https://huggingface.co/spaces/autoevaluate/model-evaluator).
## Contributions
Thanks to [@nbroad](https://huggingface.co/nbroad) for evaluating this model. |
autoevaluate/autoeval-staging-eval-project-cd279959-d310-4487-bd83-52389ad5ed20-107105 | ---
type: predictions
tags:
- autotrain
- evaluation
datasets:
- glue
eval_info:
task: binary_classification
model: autoevaluate/binary-classification
metrics: ['matthews_correlation']
dataset_name: glue
dataset_config: sst2
dataset_split: validation
col_mapping:
text: sentence
target: label
---
# Dataset Card for AutoTrain Evaluator
This repository contains model predictions generated by [AutoTrain](https://huggingface.co/autotrain) for the following task and dataset:
* Task: Binary Text Classification
* Model: autoevaluate/binary-classification
* Dataset: glue
* Config: sst2
* Split: validation
To run new evaluation jobs, visit Hugging Face's [automatic model evaluator](https://huggingface.co/spaces/autoevaluate/model-evaluator).
## Contributions
Thanks to [@lewtun](https://huggingface.co/lewtun) for evaluating this model. |
Tverous/mnli-amr | ---
dataset_info:
features:
- name: promptID
dtype: int32
- name: pairID
dtype: string
- name: premise
dtype: string
- name: premise_binary_parse
dtype: string
- name: premise_parse
dtype: string
- name: hypothesis
dtype: string
- name: hypothesis_binary_parse
dtype: string
- name: hypothesis_parse
dtype: string
- name: genre
dtype: string
- name: label
dtype:
class_label:
names:
'0': entailment
'1': neutral
'2': contradiction
- name: claim_cleaned_amr
dtype: string
- name: amr_penman
dtype: string
- name: amr_tokens
sequence: string
- name: amr_nodes
dtype: string
- name: amr_alignments
dtype: string
- name: amr_edges
sequence:
sequence: string
splits:
- name: train
num_bytes: 805968455
num_examples: 392702
- name: dev
num_bytes: 19916906
num_examples: 9815
download_size: 353391877
dataset_size: 825885361
---
# Dataset Card for "mnli-amr"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
samarthshrivas/lofi_dataset_2048_256 | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
dataset_info:
features:
- name: image
dtype: image
- name: audio_file
dtype: string
- name: slice
dtype: int16
splits:
- name: train
num_bytes: 732154478.75
num_examples: 2426
download_size: 732059770
dataset_size: 732154478.75
---
# Dataset Card for "lofi_dataset_2048_256"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
nc33/keep_context_cross_encoder | ---
license: mit
---
|
open-llm-leaderboard/details_grimjim__Mistral-Starling-merge-trial3-7B | ---
pretty_name: Evaluation run of grimjim/Mistral-Starling-merge-trial3-7B
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [grimjim/Mistral-Starling-merge-trial3-7B](https://huggingface.co/grimjim/Mistral-Starling-merge-trial3-7B)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 2 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_grimjim__Mistral-Starling-merge-trial3-7B\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-03-30T02:07:52.084167](https://huggingface.co/datasets/open-llm-leaderboard/details_grimjim__Mistral-Starling-merge-trial3-7B/blob/main/results_2024-03-30T02-07-52.084167.json) (note\
\ that there might be results for other tasks in the repo if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.642314601113494,\n\
\ \"acc_stderr\": 0.03236473847693005,\n \"acc_norm\": 0.6456984682050072,\n\
\ \"acc_norm_stderr\": 0.033011642369703616,\n \"mc1\": 0.3659730722154223,\n\
\ \"mc1_stderr\": 0.016862941684088383,\n \"mc2\": 0.5284795774255147,\n\
\ \"mc2_stderr\": 0.015199789892745523\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.6203071672354948,\n \"acc_stderr\": 0.014182119866974872,\n\
\ \"acc_norm\": 0.6655290102389079,\n \"acc_norm_stderr\": 0.013787460322441372\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6480780720971918,\n\
\ \"acc_stderr\": 0.004765937515197188,\n \"acc_norm\": 0.8481378211511651,\n\
\ \"acc_norm_stderr\": 0.0035815378475818026\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.32,\n \"acc_stderr\": 0.04688261722621505,\n \
\ \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.04688261722621505\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6,\n \
\ \"acc_stderr\": 0.04232073695151589,\n \"acc_norm\": 0.6,\n \"\
acc_norm_stderr\": 0.04232073695151589\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.7171052631578947,\n \"acc_stderr\": 0.03665349695640767,\n\
\ \"acc_norm\": 0.7171052631578947,\n \"acc_norm_stderr\": 0.03665349695640767\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.6,\n\
\ \"acc_stderr\": 0.04923659639173309,\n \"acc_norm\": 0.6,\n \
\ \"acc_norm_stderr\": 0.04923659639173309\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.6981132075471698,\n \"acc_stderr\": 0.02825420034443866,\n\
\ \"acc_norm\": 0.6981132075471698,\n \"acc_norm_stderr\": 0.02825420034443866\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7361111111111112,\n\
\ \"acc_stderr\": 0.03685651095897532,\n \"acc_norm\": 0.7361111111111112,\n\
\ \"acc_norm_stderr\": 0.03685651095897532\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.5,\n \"acc_stderr\": 0.050251890762960605,\n \
\ \"acc_norm\": 0.5,\n \"acc_norm_stderr\": 0.050251890762960605\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.51,\n \"acc_stderr\": 0.05024183937956912,\n \"acc_norm\": 0.51,\n\
\ \"acc_norm_stderr\": 0.05024183937956912\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.36,\n \"acc_stderr\": 0.048241815132442176,\n \
\ \"acc_norm\": 0.36,\n \"acc_norm_stderr\": 0.048241815132442176\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6127167630057804,\n\
\ \"acc_stderr\": 0.037143259063020656,\n \"acc_norm\": 0.6127167630057804,\n\
\ \"acc_norm_stderr\": 0.037143259063020656\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.4019607843137255,\n \"acc_stderr\": 0.04878608714466996,\n\
\ \"acc_norm\": 0.4019607843137255,\n \"acc_norm_stderr\": 0.04878608714466996\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.75,\n \"acc_stderr\": 0.04351941398892446,\n \"acc_norm\": 0.75,\n\
\ \"acc_norm_stderr\": 0.04351941398892446\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.5191489361702127,\n \"acc_stderr\": 0.032662042990646796,\n\
\ \"acc_norm\": 0.5191489361702127,\n \"acc_norm_stderr\": 0.032662042990646796\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.4649122807017544,\n\
\ \"acc_stderr\": 0.046920083813689104,\n \"acc_norm\": 0.4649122807017544,\n\
\ \"acc_norm_stderr\": 0.046920083813689104\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.593103448275862,\n \"acc_stderr\": 0.04093793981266237,\n\
\ \"acc_norm\": 0.593103448275862,\n \"acc_norm_stderr\": 0.04093793981266237\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.43386243386243384,\n \"acc_stderr\": 0.025525034382474887,\n \"\
acc_norm\": 0.43386243386243384,\n \"acc_norm_stderr\": 0.025525034382474887\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.46825396825396826,\n\
\ \"acc_stderr\": 0.04463112720677172,\n \"acc_norm\": 0.46825396825396826,\n\
\ \"acc_norm_stderr\": 0.04463112720677172\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.36,\n \"acc_stderr\": 0.04824181513244218,\n \
\ \"acc_norm\": 0.36,\n \"acc_norm_stderr\": 0.04824181513244218\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7677419354838709,\n\
\ \"acc_stderr\": 0.024022256130308235,\n \"acc_norm\": 0.7677419354838709,\n\
\ \"acc_norm_stderr\": 0.024022256130308235\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.4975369458128079,\n \"acc_stderr\": 0.03517945038691063,\n\
\ \"acc_norm\": 0.4975369458128079,\n \"acc_norm_stderr\": 0.03517945038691063\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.66,\n \"acc_stderr\": 0.04760952285695237,\n \"acc_norm\"\
: 0.66,\n \"acc_norm_stderr\": 0.04760952285695237\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.793939393939394,\n \"acc_stderr\": 0.031584153240477114,\n\
\ \"acc_norm\": 0.793939393939394,\n \"acc_norm_stderr\": 0.031584153240477114\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.7777777777777778,\n \"acc_stderr\": 0.02962022787479049,\n \"\
acc_norm\": 0.7777777777777778,\n \"acc_norm_stderr\": 0.02962022787479049\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.8756476683937824,\n \"acc_stderr\": 0.02381447708659355,\n\
\ \"acc_norm\": 0.8756476683937824,\n \"acc_norm_stderr\": 0.02381447708659355\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.676923076923077,\n \"acc_stderr\": 0.023710888501970565,\n \
\ \"acc_norm\": 0.676923076923077,\n \"acc_norm_stderr\": 0.023710888501970565\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.34814814814814815,\n \"acc_stderr\": 0.029045600290616258,\n \
\ \"acc_norm\": 0.34814814814814815,\n \"acc_norm_stderr\": 0.029045600290616258\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.6764705882352942,\n \"acc_stderr\": 0.030388353551886786,\n\
\ \"acc_norm\": 0.6764705882352942,\n \"acc_norm_stderr\": 0.030388353551886786\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.37748344370860926,\n \"acc_stderr\": 0.03958027231121569,\n \"\
acc_norm\": 0.37748344370860926,\n \"acc_norm_stderr\": 0.03958027231121569\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8348623853211009,\n \"acc_stderr\": 0.015919557829976044,\n \"\
acc_norm\": 0.8348623853211009,\n \"acc_norm_stderr\": 0.015919557829976044\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.5277777777777778,\n \"acc_stderr\": 0.0340470532865388,\n \"acc_norm\"\
: 0.5277777777777778,\n \"acc_norm_stderr\": 0.0340470532865388\n },\n\
\ \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.8284313725490197,\n\
\ \"acc_stderr\": 0.02646056956124063,\n \"acc_norm\": 0.8284313725490197,\n\
\ \"acc_norm_stderr\": 0.02646056956124063\n },\n \"harness|hendrycksTest-high_school_world_history|5\"\
: {\n \"acc\": 0.810126582278481,\n \"acc_stderr\": 0.025530100460233483,\n\
\ \"acc_norm\": 0.810126582278481,\n \"acc_norm_stderr\": 0.025530100460233483\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6860986547085202,\n\
\ \"acc_stderr\": 0.031146796482972465,\n \"acc_norm\": 0.6860986547085202,\n\
\ \"acc_norm_stderr\": 0.031146796482972465\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.7862595419847328,\n \"acc_stderr\": 0.0359546161177469,\n\
\ \"acc_norm\": 0.7862595419847328,\n \"acc_norm_stderr\": 0.0359546161177469\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.7933884297520661,\n \"acc_stderr\": 0.03695980128098825,\n \"\
acc_norm\": 0.7933884297520661,\n \"acc_norm_stderr\": 0.03695980128098825\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.75,\n\
\ \"acc_stderr\": 0.04186091791394607,\n \"acc_norm\": 0.75,\n \
\ \"acc_norm_stderr\": 0.04186091791394607\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7668711656441718,\n \"acc_stderr\": 0.0332201579577674,\n\
\ \"acc_norm\": 0.7668711656441718,\n \"acc_norm_stderr\": 0.0332201579577674\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.5,\n\
\ \"acc_stderr\": 0.04745789978762494,\n \"acc_norm\": 0.5,\n \
\ \"acc_norm_stderr\": 0.04745789978762494\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.8058252427184466,\n \"acc_stderr\": 0.03916667762822584,\n\
\ \"acc_norm\": 0.8058252427184466,\n \"acc_norm_stderr\": 0.03916667762822584\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8888888888888888,\n\
\ \"acc_stderr\": 0.020588491316092365,\n \"acc_norm\": 0.8888888888888888,\n\
\ \"acc_norm_stderr\": 0.020588491316092365\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.71,\n \"acc_stderr\": 0.045604802157206845,\n \
\ \"acc_norm\": 0.71,\n \"acc_norm_stderr\": 0.045604802157206845\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8288633461047255,\n\
\ \"acc_stderr\": 0.013468201614066302,\n \"acc_norm\": 0.8288633461047255,\n\
\ \"acc_norm_stderr\": 0.013468201614066302\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.7283236994219653,\n \"acc_stderr\": 0.023948512905468355,\n\
\ \"acc_norm\": 0.7283236994219653,\n \"acc_norm_stderr\": 0.023948512905468355\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.3407821229050279,\n\
\ \"acc_stderr\": 0.015852002449862106,\n \"acc_norm\": 0.3407821229050279,\n\
\ \"acc_norm_stderr\": 0.015852002449862106\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.7549019607843137,\n \"acc_stderr\": 0.024630048979824782,\n\
\ \"acc_norm\": 0.7549019607843137,\n \"acc_norm_stderr\": 0.024630048979824782\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7041800643086816,\n\
\ \"acc_stderr\": 0.025922371788818767,\n \"acc_norm\": 0.7041800643086816,\n\
\ \"acc_norm_stderr\": 0.025922371788818767\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.7376543209876543,\n \"acc_stderr\": 0.024477222856135118,\n\
\ \"acc_norm\": 0.7376543209876543,\n \"acc_norm_stderr\": 0.024477222856135118\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.48226950354609927,\n \"acc_stderr\": 0.02980873964223777,\n \
\ \"acc_norm\": 0.48226950354609927,\n \"acc_norm_stderr\": 0.02980873964223777\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.46479791395045633,\n\
\ \"acc_stderr\": 0.012738547371303957,\n \"acc_norm\": 0.46479791395045633,\n\
\ \"acc_norm_stderr\": 0.012738547371303957\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.6875,\n \"acc_stderr\": 0.02815637344037142,\n \
\ \"acc_norm\": 0.6875,\n \"acc_norm_stderr\": 0.02815637344037142\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6748366013071896,\n \"acc_stderr\": 0.018950886770806315,\n \
\ \"acc_norm\": 0.6748366013071896,\n \"acc_norm_stderr\": 0.018950886770806315\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6636363636363637,\n\
\ \"acc_stderr\": 0.04525393596302505,\n \"acc_norm\": 0.6636363636363637,\n\
\ \"acc_norm_stderr\": 0.04525393596302505\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.7306122448979592,\n \"acc_stderr\": 0.02840125202902294,\n\
\ \"acc_norm\": 0.7306122448979592,\n \"acc_norm_stderr\": 0.02840125202902294\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.7761194029850746,\n\
\ \"acc_stderr\": 0.029475250236017204,\n \"acc_norm\": 0.7761194029850746,\n\
\ \"acc_norm_stderr\": 0.029475250236017204\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.85,\n \"acc_stderr\": 0.03588702812826368,\n \
\ \"acc_norm\": 0.85,\n \"acc_norm_stderr\": 0.03588702812826368\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5421686746987951,\n\
\ \"acc_stderr\": 0.0387862677100236,\n \"acc_norm\": 0.5421686746987951,\n\
\ \"acc_norm_stderr\": 0.0387862677100236\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8304093567251462,\n \"acc_stderr\": 0.02878210810540171,\n\
\ \"acc_norm\": 0.8304093567251462,\n \"acc_norm_stderr\": 0.02878210810540171\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.3659730722154223,\n\
\ \"mc1_stderr\": 0.016862941684088383,\n \"mc2\": 0.5284795774255147,\n\
\ \"mc2_stderr\": 0.015199789892745523\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.8003157063930545,\n \"acc_stderr\": 0.011235328382625849\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.5299469294920395,\n \
\ \"acc_stderr\": 0.013747759685444704\n }\n}\n```"
repo_url: https://huggingface.co/grimjim/Mistral-Starling-merge-trial3-7B
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_03_30T00_18_35.660444
path:
- '**/details_harness|arc:challenge|25_2024-03-30T00-18-35.660444.parquet'
- split: 2024_03_30T02_07_52.084167
path:
- '**/details_harness|arc:challenge|25_2024-03-30T02-07-52.084167.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-03-30T02-07-52.084167.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_03_30T00_18_35.660444
path:
- '**/details_harness|gsm8k|5_2024-03-30T00-18-35.660444.parquet'
- split: 2024_03_30T02_07_52.084167
path:
- '**/details_harness|gsm8k|5_2024-03-30T02-07-52.084167.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-03-30T02-07-52.084167.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_03_30T00_18_35.660444
path:
- '**/details_harness|hellaswag|10_2024-03-30T00-18-35.660444.parquet'
- split: 2024_03_30T02_07_52.084167
path:
- '**/details_harness|hellaswag|10_2024-03-30T02-07-52.084167.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-03-30T02-07-52.084167.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_03_30T00_18_35.660444
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-30T00-18-35.660444.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-30T00-18-35.660444.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-30T00-18-35.660444.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-30T00-18-35.660444.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-30T00-18-35.660444.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-30T00-18-35.660444.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-30T00-18-35.660444.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-30T00-18-35.660444.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-30T00-18-35.660444.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-30T00-18-35.660444.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-30T00-18-35.660444.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-30T00-18-35.660444.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-30T00-18-35.660444.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-30T00-18-35.660444.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-30T00-18-35.660444.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-30T00-18-35.660444.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-30T00-18-35.660444.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-30T00-18-35.660444.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-30T00-18-35.660444.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-30T00-18-35.660444.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-30T00-18-35.660444.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-30T00-18-35.660444.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-30T00-18-35.660444.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-30T00-18-35.660444.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-30T00-18-35.660444.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-30T00-18-35.660444.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-30T00-18-35.660444.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-30T00-18-35.660444.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-30T00-18-35.660444.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-30T00-18-35.660444.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-30T00-18-35.660444.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-30T00-18-35.660444.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-30T00-18-35.660444.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-30T00-18-35.660444.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-30T00-18-35.660444.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-30T00-18-35.660444.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-30T00-18-35.660444.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-30T00-18-35.660444.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-30T00-18-35.660444.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-30T00-18-35.660444.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-30T00-18-35.660444.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-30T00-18-35.660444.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-30T00-18-35.660444.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-30T00-18-35.660444.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-30T00-18-35.660444.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-30T00-18-35.660444.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-30T00-18-35.660444.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-30T00-18-35.660444.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-30T00-18-35.660444.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-30T00-18-35.660444.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-30T00-18-35.660444.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-30T00-18-35.660444.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-30T00-18-35.660444.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-30T00-18-35.660444.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-30T00-18-35.660444.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-30T00-18-35.660444.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-30T00-18-35.660444.parquet'
- split: 2024_03_30T02_07_52.084167
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-30T02-07-52.084167.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-30T02-07-52.084167.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-30T02-07-52.084167.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-30T02-07-52.084167.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-30T02-07-52.084167.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-30T02-07-52.084167.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-30T02-07-52.084167.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-30T02-07-52.084167.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-30T02-07-52.084167.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-30T02-07-52.084167.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-30T02-07-52.084167.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-30T02-07-52.084167.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-30T02-07-52.084167.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-30T02-07-52.084167.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-30T02-07-52.084167.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-30T02-07-52.084167.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-30T02-07-52.084167.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-30T02-07-52.084167.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-30T02-07-52.084167.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-30T02-07-52.084167.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-30T02-07-52.084167.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-30T02-07-52.084167.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-30T02-07-52.084167.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-30T02-07-52.084167.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-30T02-07-52.084167.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-30T02-07-52.084167.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-30T02-07-52.084167.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-30T02-07-52.084167.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-30T02-07-52.084167.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-30T02-07-52.084167.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-30T02-07-52.084167.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-30T02-07-52.084167.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-30T02-07-52.084167.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-30T02-07-52.084167.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-30T02-07-52.084167.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-30T02-07-52.084167.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-30T02-07-52.084167.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-30T02-07-52.084167.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-30T02-07-52.084167.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-30T02-07-52.084167.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-30T02-07-52.084167.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-30T02-07-52.084167.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-30T02-07-52.084167.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-30T02-07-52.084167.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-30T02-07-52.084167.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-30T02-07-52.084167.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-30T02-07-52.084167.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-30T02-07-52.084167.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-30T02-07-52.084167.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-30T02-07-52.084167.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-30T02-07-52.084167.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-30T02-07-52.084167.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-30T02-07-52.084167.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-30T02-07-52.084167.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-30T02-07-52.084167.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-30T02-07-52.084167.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-30T02-07-52.084167.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-30T02-07-52.084167.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-30T02-07-52.084167.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-30T02-07-52.084167.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-30T02-07-52.084167.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-30T02-07-52.084167.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-30T02-07-52.084167.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-30T02-07-52.084167.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-30T02-07-52.084167.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-30T02-07-52.084167.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-30T02-07-52.084167.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-30T02-07-52.084167.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-30T02-07-52.084167.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-30T02-07-52.084167.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-30T02-07-52.084167.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-30T02-07-52.084167.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-30T02-07-52.084167.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-30T02-07-52.084167.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-30T02-07-52.084167.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-30T02-07-52.084167.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-30T02-07-52.084167.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-30T02-07-52.084167.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-30T02-07-52.084167.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-30T02-07-52.084167.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-30T02-07-52.084167.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-30T02-07-52.084167.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-30T02-07-52.084167.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-30T02-07-52.084167.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-30T02-07-52.084167.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-30T02-07-52.084167.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-30T02-07-52.084167.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-30T02-07-52.084167.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-30T02-07-52.084167.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-30T02-07-52.084167.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-30T02-07-52.084167.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-30T02-07-52.084167.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-30T02-07-52.084167.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-30T02-07-52.084167.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-30T02-07-52.084167.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-30T02-07-52.084167.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-30T02-07-52.084167.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-30T02-07-52.084167.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-30T02-07-52.084167.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-30T02-07-52.084167.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-30T02-07-52.084167.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-30T02-07-52.084167.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-30T02-07-52.084167.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-30T02-07-52.084167.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-30T02-07-52.084167.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-30T02-07-52.084167.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-30T02-07-52.084167.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-30T02-07-52.084167.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-30T02-07-52.084167.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-30T02-07-52.084167.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-30T02-07-52.084167.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-30T02-07-52.084167.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-30T02-07-52.084167.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-30T02-07-52.084167.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_03_30T00_18_35.660444
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-30T00-18-35.660444.parquet'
- split: 2024_03_30T02_07_52.084167
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-30T02-07-52.084167.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-30T02-07-52.084167.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_03_30T00_18_35.660444
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-30T00-18-35.660444.parquet'
- split: 2024_03_30T02_07_52.084167
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-30T02-07-52.084167.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-30T02-07-52.084167.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_03_30T00_18_35.660444
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-30T00-18-35.660444.parquet'
- split: 2024_03_30T02_07_52.084167
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-30T02-07-52.084167.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-30T02-07-52.084167.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_03_30T00_18_35.660444
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-30T00-18-35.660444.parquet'
- split: 2024_03_30T02_07_52.084167
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-30T02-07-52.084167.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-30T02-07-52.084167.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_03_30T00_18_35.660444
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-30T00-18-35.660444.parquet'
- split: 2024_03_30T02_07_52.084167
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-30T02-07-52.084167.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-30T02-07-52.084167.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_03_30T00_18_35.660444
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-30T00-18-35.660444.parquet'
- split: 2024_03_30T02_07_52.084167
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-30T02-07-52.084167.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-30T02-07-52.084167.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_03_30T00_18_35.660444
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-30T00-18-35.660444.parquet'
- split: 2024_03_30T02_07_52.084167
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-30T02-07-52.084167.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-30T02-07-52.084167.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_03_30T00_18_35.660444
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-30T00-18-35.660444.parquet'
- split: 2024_03_30T02_07_52.084167
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-30T02-07-52.084167.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-30T02-07-52.084167.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_03_30T00_18_35.660444
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-30T00-18-35.660444.parquet'
- split: 2024_03_30T02_07_52.084167
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-30T02-07-52.084167.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-30T02-07-52.084167.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_03_30T00_18_35.660444
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-30T00-18-35.660444.parquet'
- split: 2024_03_30T02_07_52.084167
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-30T02-07-52.084167.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-30T02-07-52.084167.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_03_30T00_18_35.660444
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-30T00-18-35.660444.parquet'
- split: 2024_03_30T02_07_52.084167
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-30T02-07-52.084167.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-30T02-07-52.084167.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_03_30T00_18_35.660444
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-30T00-18-35.660444.parquet'
- split: 2024_03_30T02_07_52.084167
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-30T02-07-52.084167.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-30T02-07-52.084167.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_03_30T00_18_35.660444
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-30T00-18-35.660444.parquet'
- split: 2024_03_30T02_07_52.084167
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-30T02-07-52.084167.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-30T02-07-52.084167.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_03_30T00_18_35.660444
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-30T00-18-35.660444.parquet'
- split: 2024_03_30T02_07_52.084167
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-30T02-07-52.084167.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-30T02-07-52.084167.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_03_30T00_18_35.660444
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-30T00-18-35.660444.parquet'
- split: 2024_03_30T02_07_52.084167
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-30T02-07-52.084167.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-30T02-07-52.084167.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_03_30T00_18_35.660444
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-30T00-18-35.660444.parquet'
- split: 2024_03_30T02_07_52.084167
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-30T02-07-52.084167.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-30T02-07-52.084167.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_03_30T00_18_35.660444
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-30T00-18-35.660444.parquet'
- split: 2024_03_30T02_07_52.084167
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-30T02-07-52.084167.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-30T02-07-52.084167.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_03_30T00_18_35.660444
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-30T00-18-35.660444.parquet'
- split: 2024_03_30T02_07_52.084167
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-30T02-07-52.084167.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-30T02-07-52.084167.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_03_30T00_18_35.660444
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-30T00-18-35.660444.parquet'
- split: 2024_03_30T02_07_52.084167
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-30T02-07-52.084167.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-30T02-07-52.084167.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_03_30T00_18_35.660444
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-30T00-18-35.660444.parquet'
- split: 2024_03_30T02_07_52.084167
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-30T02-07-52.084167.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-30T02-07-52.084167.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_03_30T00_18_35.660444
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-30T00-18-35.660444.parquet'
- split: 2024_03_30T02_07_52.084167
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-30T02-07-52.084167.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-30T02-07-52.084167.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_03_30T00_18_35.660444
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-30T00-18-35.660444.parquet'
- split: 2024_03_30T02_07_52.084167
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-30T02-07-52.084167.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-30T02-07-52.084167.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_03_30T00_18_35.660444
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-30T00-18-35.660444.parquet'
- split: 2024_03_30T02_07_52.084167
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-30T02-07-52.084167.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-30T02-07-52.084167.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_03_30T00_18_35.660444
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-30T00-18-35.660444.parquet'
- split: 2024_03_30T02_07_52.084167
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-30T02-07-52.084167.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-30T02-07-52.084167.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_03_30T00_18_35.660444
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-30T00-18-35.660444.parquet'
- split: 2024_03_30T02_07_52.084167
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-30T02-07-52.084167.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-30T02-07-52.084167.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_03_30T00_18_35.660444
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-30T00-18-35.660444.parquet'
- split: 2024_03_30T02_07_52.084167
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-30T02-07-52.084167.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-30T02-07-52.084167.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_03_30T00_18_35.660444
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-30T00-18-35.660444.parquet'
- split: 2024_03_30T02_07_52.084167
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-30T02-07-52.084167.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-30T02-07-52.084167.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_03_30T00_18_35.660444
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-30T00-18-35.660444.parquet'
- split: 2024_03_30T02_07_52.084167
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-30T02-07-52.084167.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-30T02-07-52.084167.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_03_30T00_18_35.660444
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-30T00-18-35.660444.parquet'
- split: 2024_03_30T02_07_52.084167
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-30T02-07-52.084167.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-30T02-07-52.084167.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_03_30T00_18_35.660444
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-30T00-18-35.660444.parquet'
- split: 2024_03_30T02_07_52.084167
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-30T02-07-52.084167.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-30T02-07-52.084167.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_03_30T00_18_35.660444
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-30T00-18-35.660444.parquet'
- split: 2024_03_30T02_07_52.084167
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-30T02-07-52.084167.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-30T02-07-52.084167.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_03_30T00_18_35.660444
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-30T00-18-35.660444.parquet'
- split: 2024_03_30T02_07_52.084167
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-30T02-07-52.084167.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-30T02-07-52.084167.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_03_30T00_18_35.660444
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-30T00-18-35.660444.parquet'
- split: 2024_03_30T02_07_52.084167
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-30T02-07-52.084167.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-30T02-07-52.084167.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_03_30T00_18_35.660444
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-30T00-18-35.660444.parquet'
- split: 2024_03_30T02_07_52.084167
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-30T02-07-52.084167.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-30T02-07-52.084167.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_03_30T00_18_35.660444
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-30T00-18-35.660444.parquet'
- split: 2024_03_30T02_07_52.084167
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-30T02-07-52.084167.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-30T02-07-52.084167.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_03_30T00_18_35.660444
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-30T00-18-35.660444.parquet'
- split: 2024_03_30T02_07_52.084167
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-30T02-07-52.084167.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-30T02-07-52.084167.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_03_30T00_18_35.660444
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-30T00-18-35.660444.parquet'
- split: 2024_03_30T02_07_52.084167
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-30T02-07-52.084167.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-30T02-07-52.084167.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_03_30T00_18_35.660444
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-30T00-18-35.660444.parquet'
- split: 2024_03_30T02_07_52.084167
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-30T02-07-52.084167.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-30T02-07-52.084167.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_03_30T00_18_35.660444
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-30T00-18-35.660444.parquet'
- split: 2024_03_30T02_07_52.084167
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-30T02-07-52.084167.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-30T02-07-52.084167.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_03_30T00_18_35.660444
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-30T00-18-35.660444.parquet'
- split: 2024_03_30T02_07_52.084167
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-30T02-07-52.084167.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-30T02-07-52.084167.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_03_30T00_18_35.660444
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-30T00-18-35.660444.parquet'
- split: 2024_03_30T02_07_52.084167
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-30T02-07-52.084167.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-30T02-07-52.084167.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_03_30T00_18_35.660444
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-30T00-18-35.660444.parquet'
- split: 2024_03_30T02_07_52.084167
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-30T02-07-52.084167.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-30T02-07-52.084167.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_03_30T00_18_35.660444
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-30T00-18-35.660444.parquet'
- split: 2024_03_30T02_07_52.084167
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-30T02-07-52.084167.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-30T02-07-52.084167.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_03_30T00_18_35.660444
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-30T00-18-35.660444.parquet'
- split: 2024_03_30T02_07_52.084167
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-30T02-07-52.084167.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-30T02-07-52.084167.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_03_30T00_18_35.660444
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-30T00-18-35.660444.parquet'
- split: 2024_03_30T02_07_52.084167
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-30T02-07-52.084167.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-30T02-07-52.084167.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_03_30T00_18_35.660444
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-30T00-18-35.660444.parquet'
- split: 2024_03_30T02_07_52.084167
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-30T02-07-52.084167.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-30T02-07-52.084167.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_03_30T00_18_35.660444
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-30T00-18-35.660444.parquet'
- split: 2024_03_30T02_07_52.084167
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-30T02-07-52.084167.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-30T02-07-52.084167.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_03_30T00_18_35.660444
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-30T00-18-35.660444.parquet'
- split: 2024_03_30T02_07_52.084167
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-30T02-07-52.084167.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-30T02-07-52.084167.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_03_30T00_18_35.660444
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-30T00-18-35.660444.parquet'
- split: 2024_03_30T02_07_52.084167
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-30T02-07-52.084167.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-30T02-07-52.084167.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_03_30T00_18_35.660444
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-30T00-18-35.660444.parquet'
- split: 2024_03_30T02_07_52.084167
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-30T02-07-52.084167.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-30T02-07-52.084167.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_03_30T00_18_35.660444
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-30T00-18-35.660444.parquet'
- split: 2024_03_30T02_07_52.084167
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-30T02-07-52.084167.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-30T02-07-52.084167.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_03_30T00_18_35.660444
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-30T00-18-35.660444.parquet'
- split: 2024_03_30T02_07_52.084167
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-30T02-07-52.084167.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-30T02-07-52.084167.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_03_30T00_18_35.660444
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-30T00-18-35.660444.parquet'
- split: 2024_03_30T02_07_52.084167
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-30T02-07-52.084167.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-30T02-07-52.084167.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_03_30T00_18_35.660444
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-30T00-18-35.660444.parquet'
- split: 2024_03_30T02_07_52.084167
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-30T02-07-52.084167.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-30T02-07-52.084167.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_03_30T00_18_35.660444
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-30T00-18-35.660444.parquet'
- split: 2024_03_30T02_07_52.084167
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-30T02-07-52.084167.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-30T02-07-52.084167.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_03_30T00_18_35.660444
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-30T00-18-35.660444.parquet'
- split: 2024_03_30T02_07_52.084167
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-30T02-07-52.084167.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-30T02-07-52.084167.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_03_30T00_18_35.660444
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-30T00-18-35.660444.parquet'
- split: 2024_03_30T02_07_52.084167
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-30T02-07-52.084167.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-30T02-07-52.084167.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_03_30T00_18_35.660444
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-30T00-18-35.660444.parquet'
- split: 2024_03_30T02_07_52.084167
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-30T02-07-52.084167.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-30T02-07-52.084167.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_03_30T00_18_35.660444
path:
- '**/details_harness|winogrande|5_2024-03-30T00-18-35.660444.parquet'
- split: 2024_03_30T02_07_52.084167
path:
- '**/details_harness|winogrande|5_2024-03-30T02-07-52.084167.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-03-30T02-07-52.084167.parquet'
- config_name: results
data_files:
- split: 2024_03_30T00_18_35.660444
path:
- results_2024-03-30T00-18-35.660444.parquet
- split: 2024_03_30T02_07_52.084167
path:
- results_2024-03-30T02-07-52.084167.parquet
- split: latest
path:
- results_2024-03-30T02-07-52.084167.parquet
---
# Dataset Card for Evaluation run of grimjim/Mistral-Starling-merge-trial3-7B
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [grimjim/Mistral-Starling-merge-trial3-7B](https://huggingface.co/grimjim/Mistral-Starling-merge-trial3-7B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 runs. Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can, for instance, do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_grimjim__Mistral-Starling-merge-trial3-7B",
"harness_winogrande_5",
split="train")
```
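Each task configuration can also be read at a specific point in time: the splits are named after the run timestamps listed in the `data_files` section above, and a `latest` split always points at the most recent run. A minimal sketch, using the config and split names from this card:
```python
from datasets import load_dataset

# Pin one task's details to a specific evaluation run via its
# timestamp-named split (names as listed in this card's data_files):
pinned = load_dataset(
    "open-llm-leaderboard/details_grimjim__Mistral-Starling-merge-trial3-7B",
    "harness_hendrycksTest_abstract_algebra_5",
    split="2024_03_30T02_07_52.084167",
)

# Or follow whatever run is most recent:
latest = load_dataset(
    "open-llm-leaderboard/details_grimjim__Mistral-Starling-merge-trial3-7B",
    "harness_hendrycksTest_abstract_algebra_5",
    split="latest",
)
```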
## Latest results
These are the [latest results from run 2024-03-30T02:07:52.084167](https://huggingface.co/datasets/open-llm-leaderboard/details_grimjim__Mistral-Starling-merge-trial3-7B/blob/main/results_2024-03-30T02-07-52.084167.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.642314601113494,
"acc_stderr": 0.03236473847693005,
"acc_norm": 0.6456984682050072,
"acc_norm_stderr": 0.033011642369703616,
"mc1": 0.3659730722154223,
"mc1_stderr": 0.016862941684088383,
"mc2": 0.5284795774255147,
"mc2_stderr": 0.015199789892745523
},
"harness|arc:challenge|25": {
"acc": 0.6203071672354948,
"acc_stderr": 0.014182119866974872,
"acc_norm": 0.6655290102389079,
"acc_norm_stderr": 0.013787460322441372
},
"harness|hellaswag|10": {
"acc": 0.6480780720971918,
"acc_stderr": 0.004765937515197188,
"acc_norm": 0.8481378211511651,
"acc_norm_stderr": 0.0035815378475818026
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.32,
"acc_stderr": 0.04688261722621505,
"acc_norm": 0.32,
"acc_norm_stderr": 0.04688261722621505
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6,
"acc_stderr": 0.04232073695151589,
"acc_norm": 0.6,
"acc_norm_stderr": 0.04232073695151589
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.7171052631578947,
"acc_stderr": 0.03665349695640767,
"acc_norm": 0.7171052631578947,
"acc_norm_stderr": 0.03665349695640767
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.6,
"acc_stderr": 0.04923659639173309,
"acc_norm": 0.6,
"acc_norm_stderr": 0.04923659639173309
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6981132075471698,
"acc_stderr": 0.02825420034443866,
"acc_norm": 0.6981132075471698,
"acc_norm_stderr": 0.02825420034443866
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7361111111111112,
"acc_stderr": 0.03685651095897532,
"acc_norm": 0.7361111111111112,
"acc_norm_stderr": 0.03685651095897532
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.5,
"acc_stderr": 0.050251890762960605,
"acc_norm": 0.5,
"acc_norm_stderr": 0.050251890762960605
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.51,
"acc_stderr": 0.05024183937956912,
"acc_norm": 0.51,
"acc_norm_stderr": 0.05024183937956912
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.36,
"acc_stderr": 0.048241815132442176,
"acc_norm": 0.36,
"acc_norm_stderr": 0.048241815132442176
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6127167630057804,
"acc_stderr": 0.037143259063020656,
"acc_norm": 0.6127167630057804,
"acc_norm_stderr": 0.037143259063020656
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.4019607843137255,
"acc_stderr": 0.04878608714466996,
"acc_norm": 0.4019607843137255,
"acc_norm_stderr": 0.04878608714466996
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.75,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.75,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5191489361702127,
"acc_stderr": 0.032662042990646796,
"acc_norm": 0.5191489361702127,
"acc_norm_stderr": 0.032662042990646796
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.4649122807017544,
"acc_stderr": 0.046920083813689104,
"acc_norm": 0.4649122807017544,
"acc_norm_stderr": 0.046920083813689104
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.593103448275862,
"acc_stderr": 0.04093793981266237,
"acc_norm": 0.593103448275862,
"acc_norm_stderr": 0.04093793981266237
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.43386243386243384,
"acc_stderr": 0.025525034382474887,
"acc_norm": 0.43386243386243384,
"acc_norm_stderr": 0.025525034382474887
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.46825396825396826,
"acc_stderr": 0.04463112720677172,
"acc_norm": 0.46825396825396826,
"acc_norm_stderr": 0.04463112720677172
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.36,
"acc_stderr": 0.04824181513244218,
"acc_norm": 0.36,
"acc_norm_stderr": 0.04824181513244218
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7677419354838709,
"acc_stderr": 0.024022256130308235,
"acc_norm": 0.7677419354838709,
"acc_norm_stderr": 0.024022256130308235
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.4975369458128079,
"acc_stderr": 0.03517945038691063,
"acc_norm": 0.4975369458128079,
"acc_norm_stderr": 0.03517945038691063
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.66,
"acc_stderr": 0.04760952285695237,
"acc_norm": 0.66,
"acc_norm_stderr": 0.04760952285695237
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.793939393939394,
"acc_stderr": 0.031584153240477114,
"acc_norm": 0.793939393939394,
"acc_norm_stderr": 0.031584153240477114
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7777777777777778,
"acc_stderr": 0.02962022787479049,
"acc_norm": 0.7777777777777778,
"acc_norm_stderr": 0.02962022787479049
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8756476683937824,
"acc_stderr": 0.02381447708659355,
"acc_norm": 0.8756476683937824,
"acc_norm_stderr": 0.02381447708659355
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.676923076923077,
"acc_stderr": 0.023710888501970565,
"acc_norm": 0.676923076923077,
"acc_norm_stderr": 0.023710888501970565
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.34814814814814815,
"acc_stderr": 0.029045600290616258,
"acc_norm": 0.34814814814814815,
"acc_norm_stderr": 0.029045600290616258
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6764705882352942,
"acc_stderr": 0.030388353551886786,
"acc_norm": 0.6764705882352942,
"acc_norm_stderr": 0.030388353551886786
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.37748344370860926,
"acc_stderr": 0.03958027231121569,
"acc_norm": 0.37748344370860926,
"acc_norm_stderr": 0.03958027231121569
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8348623853211009,
"acc_stderr": 0.015919557829976044,
"acc_norm": 0.8348623853211009,
"acc_norm_stderr": 0.015919557829976044
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5277777777777778,
"acc_stderr": 0.0340470532865388,
"acc_norm": 0.5277777777777778,
"acc_norm_stderr": 0.0340470532865388
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8284313725490197,
"acc_stderr": 0.02646056956124063,
"acc_norm": 0.8284313725490197,
"acc_norm_stderr": 0.02646056956124063
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.810126582278481,
"acc_stderr": 0.025530100460233483,
"acc_norm": 0.810126582278481,
"acc_norm_stderr": 0.025530100460233483
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6860986547085202,
"acc_stderr": 0.031146796482972465,
"acc_norm": 0.6860986547085202,
"acc_norm_stderr": 0.031146796482972465
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7862595419847328,
"acc_stderr": 0.0359546161177469,
"acc_norm": 0.7862595419847328,
"acc_norm_stderr": 0.0359546161177469
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7933884297520661,
"acc_stderr": 0.03695980128098825,
"acc_norm": 0.7933884297520661,
"acc_norm_stderr": 0.03695980128098825
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.75,
"acc_stderr": 0.04186091791394607,
"acc_norm": 0.75,
"acc_norm_stderr": 0.04186091791394607
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7668711656441718,
"acc_stderr": 0.0332201579577674,
"acc_norm": 0.7668711656441718,
"acc_norm_stderr": 0.0332201579577674
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.5,
"acc_stderr": 0.04745789978762494,
"acc_norm": 0.5,
"acc_norm_stderr": 0.04745789978762494
},
"harness|hendrycksTest-management|5": {
"acc": 0.8058252427184466,
"acc_stderr": 0.03916667762822584,
"acc_norm": 0.8058252427184466,
"acc_norm_stderr": 0.03916667762822584
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8888888888888888,
"acc_stderr": 0.020588491316092365,
"acc_norm": 0.8888888888888888,
"acc_norm_stderr": 0.020588491316092365
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.71,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.71,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8288633461047255,
"acc_stderr": 0.013468201614066302,
"acc_norm": 0.8288633461047255,
"acc_norm_stderr": 0.013468201614066302
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7283236994219653,
"acc_stderr": 0.023948512905468355,
"acc_norm": 0.7283236994219653,
"acc_norm_stderr": 0.023948512905468355
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.3407821229050279,
"acc_stderr": 0.015852002449862106,
"acc_norm": 0.3407821229050279,
"acc_norm_stderr": 0.015852002449862106
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7549019607843137,
"acc_stderr": 0.024630048979824782,
"acc_norm": 0.7549019607843137,
"acc_norm_stderr": 0.024630048979824782
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7041800643086816,
"acc_stderr": 0.025922371788818767,
"acc_norm": 0.7041800643086816,
"acc_norm_stderr": 0.025922371788818767
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7376543209876543,
"acc_stderr": 0.024477222856135118,
"acc_norm": 0.7376543209876543,
"acc_norm_stderr": 0.024477222856135118
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.48226950354609927,
"acc_stderr": 0.02980873964223777,
"acc_norm": 0.48226950354609927,
"acc_norm_stderr": 0.02980873964223777
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.46479791395045633,
"acc_stderr": 0.012738547371303957,
"acc_norm": 0.46479791395045633,
"acc_norm_stderr": 0.012738547371303957
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6875,
"acc_stderr": 0.02815637344037142,
"acc_norm": 0.6875,
"acc_norm_stderr": 0.02815637344037142
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6748366013071896,
"acc_stderr": 0.018950886770806315,
"acc_norm": 0.6748366013071896,
"acc_norm_stderr": 0.018950886770806315
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6636363636363637,
"acc_stderr": 0.04525393596302505,
"acc_norm": 0.6636363636363637,
"acc_norm_stderr": 0.04525393596302505
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7306122448979592,
"acc_stderr": 0.02840125202902294,
"acc_norm": 0.7306122448979592,
"acc_norm_stderr": 0.02840125202902294
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.7761194029850746,
"acc_stderr": 0.029475250236017204,
"acc_norm": 0.7761194029850746,
"acc_norm_stderr": 0.029475250236017204
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.85,
"acc_stderr": 0.03588702812826368,
"acc_norm": 0.85,
"acc_norm_stderr": 0.03588702812826368
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5421686746987951,
"acc_stderr": 0.0387862677100236,
"acc_norm": 0.5421686746987951,
"acc_norm_stderr": 0.0387862677100236
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8304093567251462,
"acc_stderr": 0.02878210810540171,
"acc_norm": 0.8304093567251462,
"acc_norm_stderr": 0.02878210810540171
},
"harness|truthfulqa:mc|0": {
"mc1": 0.3659730722154223,
"mc1_stderr": 0.016862941684088383,
"mc2": 0.5284795774255147,
"mc2_stderr": 0.015199789892745523
},
"harness|winogrande|5": {
"acc": 0.8003157063930545,
"acc_stderr": 0.011235328382625849
},
"harness|gsm8k|5": {
"acc": 0.5299469294920395,
"acc_stderr": 0.013747759685444704
}
}
```
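These aggregated numbers are also available programmatically: they live in the `results` configuration, whose `latest` split points at the most recent run. A minimal sketch, assuming the parquet rows carry the same fields as the JSON results file shown above:
```python
from datasets import load_dataset

# Load the aggregated metrics for the most recent run; the rows
# are assumed to mirror the JSON results file displayed above.
results = load_dataset(
    "open-llm-leaderboard/details_grimjim__Mistral-Starling-merge-trial3-7B",
    "results",
    split="latest",
)
print(results[0])
```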
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
open-llm-leaderboard/details_nasiruddin15__Mistral-grok-instract-2-7B-slerp | ---
pretty_name: Evaluation run of nasiruddin15/Mistral-grok-instract-2-7B-slerp
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [nasiruddin15/Mistral-grok-instract-2-7B-slerp](https://huggingface.co/nasiruddin15/Mistral-grok-instract-2-7B-slerp)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_nasiruddin15__Mistral-grok-instract-2-7B-slerp\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-03-29T21:02:01.281494](https://huggingface.co/datasets/open-llm-leaderboard/details_nasiruddin15__Mistral-grok-instract-2-7B-slerp/blob/main/results_2024-03-29T21-02-01.281494.json) (note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6093138686193876,\n\
\ \"acc_stderr\": 0.03301888332986811,\n \"acc_norm\": 0.6143980622557884,\n\
\ \"acc_norm_stderr\": 0.03368555834758194,\n \"mc1\": 0.37576499388004897,\n\
\ \"mc1_stderr\": 0.01695458406021429,\n \"mc2\": 0.5350694411001227,\n\
\ \"mc2_stderr\": 0.01564827007130759\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.5767918088737202,\n \"acc_stderr\": 0.01443803622084803,\n\
\ \"acc_norm\": 0.6279863481228669,\n \"acc_norm_stderr\": 0.014124597881844461\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6318462457677754,\n\
\ \"acc_stderr\": 0.004813177057496269,\n \"acc_norm\": 0.8303126867157936,\n\
\ \"acc_norm_stderr\": 0.0037459074237766957\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.27,\n \"acc_stderr\": 0.044619604333847394,\n \
\ \"acc_norm\": 0.27,\n \"acc_norm_stderr\": 0.044619604333847394\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.5703703703703704,\n\
\ \"acc_stderr\": 0.042763494943765995,\n \"acc_norm\": 0.5703703703703704,\n\
\ \"acc_norm_stderr\": 0.042763494943765995\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.6644736842105263,\n \"acc_stderr\": 0.03842498559395269,\n\
\ \"acc_norm\": 0.6644736842105263,\n \"acc_norm_stderr\": 0.03842498559395269\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.58,\n\
\ \"acc_stderr\": 0.049604496374885836,\n \"acc_norm\": 0.58,\n \
\ \"acc_norm_stderr\": 0.049604496374885836\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.6641509433962264,\n \"acc_stderr\": 0.02906722014664483,\n\
\ \"acc_norm\": 0.6641509433962264,\n \"acc_norm_stderr\": 0.02906722014664483\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.6597222222222222,\n\
\ \"acc_stderr\": 0.039621355734862175,\n \"acc_norm\": 0.6597222222222222,\n\
\ \"acc_norm_stderr\": 0.039621355734862175\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.42,\n \"acc_stderr\": 0.049604496374885836,\n \
\ \"acc_norm\": 0.42,\n \"acc_norm_stderr\": 0.049604496374885836\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"\
acc\": 0.52,\n \"acc_stderr\": 0.050211673156867795,\n \"acc_norm\"\
: 0.52,\n \"acc_norm_stderr\": 0.050211673156867795\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.29,\n \"acc_stderr\": 0.045604802157206845,\n \
\ \"acc_norm\": 0.29,\n \"acc_norm_stderr\": 0.045604802157206845\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6184971098265896,\n\
\ \"acc_stderr\": 0.03703851193099521,\n \"acc_norm\": 0.6184971098265896,\n\
\ \"acc_norm_stderr\": 0.03703851193099521\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.37254901960784315,\n \"acc_stderr\": 0.04810840148082636,\n\
\ \"acc_norm\": 0.37254901960784315,\n \"acc_norm_stderr\": 0.04810840148082636\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.76,\n \"acc_stderr\": 0.04292346959909283,\n \"acc_norm\": 0.76,\n\
\ \"acc_norm_stderr\": 0.04292346959909283\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.5191489361702127,\n \"acc_stderr\": 0.03266204299064678,\n\
\ \"acc_norm\": 0.5191489361702127,\n \"acc_norm_stderr\": 0.03266204299064678\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.42105263157894735,\n\
\ \"acc_stderr\": 0.046446020912223177,\n \"acc_norm\": 0.42105263157894735,\n\
\ \"acc_norm_stderr\": 0.046446020912223177\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.6413793103448275,\n \"acc_stderr\": 0.039966295748767186,\n\
\ \"acc_norm\": 0.6413793103448275,\n \"acc_norm_stderr\": 0.039966295748767186\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.41005291005291006,\n \"acc_stderr\": 0.02533120243894443,\n \"\
acc_norm\": 0.41005291005291006,\n \"acc_norm_stderr\": 0.02533120243894443\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.40476190476190477,\n\
\ \"acc_stderr\": 0.04390259265377563,\n \"acc_norm\": 0.40476190476190477,\n\
\ \"acc_norm_stderr\": 0.04390259265377563\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695235,\n \
\ \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695235\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.6451612903225806,\n\
\ \"acc_stderr\": 0.027218889773308757,\n \"acc_norm\": 0.6451612903225806,\n\
\ \"acc_norm_stderr\": 0.027218889773308757\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.4975369458128079,\n \"acc_stderr\": 0.03517945038691063,\n\
\ \"acc_norm\": 0.4975369458128079,\n \"acc_norm_stderr\": 0.03517945038691063\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.62,\n \"acc_stderr\": 0.048783173121456316,\n \"acc_norm\"\
: 0.62,\n \"acc_norm_stderr\": 0.048783173121456316\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7272727272727273,\n \"acc_stderr\": 0.03477691162163659,\n\
\ \"acc_norm\": 0.7272727272727273,\n \"acc_norm_stderr\": 0.03477691162163659\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.7828282828282829,\n \"acc_stderr\": 0.02937661648494562,\n \"\
acc_norm\": 0.7828282828282829,\n \"acc_norm_stderr\": 0.02937661648494562\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.8186528497409327,\n \"acc_stderr\": 0.02780703236068609,\n\
\ \"acc_norm\": 0.8186528497409327,\n \"acc_norm_stderr\": 0.02780703236068609\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.5717948717948718,\n \"acc_stderr\": 0.025088301454694834,\n\
\ \"acc_norm\": 0.5717948717948718,\n \"acc_norm_stderr\": 0.025088301454694834\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.337037037037037,\n \"acc_stderr\": 0.028820884666253255,\n \
\ \"acc_norm\": 0.337037037037037,\n \"acc_norm_stderr\": 0.028820884666253255\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.6470588235294118,\n \"acc_stderr\": 0.031041941304059285,\n\
\ \"acc_norm\": 0.6470588235294118,\n \"acc_norm_stderr\": 0.031041941304059285\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.3841059602649007,\n \"acc_stderr\": 0.03971301814719197,\n \"\
acc_norm\": 0.3841059602649007,\n \"acc_norm_stderr\": 0.03971301814719197\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.7963302752293578,\n \"acc_stderr\": 0.01726674208763079,\n \"\
acc_norm\": 0.7963302752293578,\n \"acc_norm_stderr\": 0.01726674208763079\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.4305555555555556,\n \"acc_stderr\": 0.03376922151252336,\n \"\
acc_norm\": 0.4305555555555556,\n \"acc_norm_stderr\": 0.03376922151252336\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.7892156862745098,\n \"acc_stderr\": 0.0286265479124374,\n \"acc_norm\"\
: 0.7892156862745098,\n \"acc_norm_stderr\": 0.0286265479124374\n },\n\
\ \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\":\
\ 0.759493670886076,\n \"acc_stderr\": 0.027820781981149685,\n \"\
acc_norm\": 0.759493670886076,\n \"acc_norm_stderr\": 0.027820781981149685\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6322869955156951,\n\
\ \"acc_stderr\": 0.03236198350928275,\n \"acc_norm\": 0.6322869955156951,\n\
\ \"acc_norm_stderr\": 0.03236198350928275\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.7633587786259542,\n \"acc_stderr\": 0.03727673575596913,\n\
\ \"acc_norm\": 0.7633587786259542,\n \"acc_norm_stderr\": 0.03727673575596913\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.8264462809917356,\n \"acc_stderr\": 0.03457272836917669,\n \"\
acc_norm\": 0.8264462809917356,\n \"acc_norm_stderr\": 0.03457272836917669\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7129629629629629,\n\
\ \"acc_stderr\": 0.043733130409147614,\n \"acc_norm\": 0.7129629629629629,\n\
\ \"acc_norm_stderr\": 0.043733130409147614\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.6993865030674846,\n \"acc_stderr\": 0.03602511318806771,\n\
\ \"acc_norm\": 0.6993865030674846,\n \"acc_norm_stderr\": 0.03602511318806771\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.49107142857142855,\n\
\ \"acc_stderr\": 0.04745033255489123,\n \"acc_norm\": 0.49107142857142855,\n\
\ \"acc_norm_stderr\": 0.04745033255489123\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7281553398058253,\n \"acc_stderr\": 0.044052680241409216,\n\
\ \"acc_norm\": 0.7281553398058253,\n \"acc_norm_stderr\": 0.044052680241409216\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8632478632478633,\n\
\ \"acc_stderr\": 0.02250903393707779,\n \"acc_norm\": 0.8632478632478633,\n\
\ \"acc_norm_stderr\": 0.02250903393707779\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.73,\n \"acc_stderr\": 0.0446196043338474,\n \
\ \"acc_norm\": 0.73,\n \"acc_norm_stderr\": 0.0446196043338474\n },\n\
\ \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.7969348659003831,\n\
\ \"acc_stderr\": 0.014385525076611578,\n \"acc_norm\": 0.7969348659003831,\n\
\ \"acc_norm_stderr\": 0.014385525076611578\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.6878612716763006,\n \"acc_stderr\": 0.024946792225272314,\n\
\ \"acc_norm\": 0.6878612716763006,\n \"acc_norm_stderr\": 0.024946792225272314\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.4212290502793296,\n\
\ \"acc_stderr\": 0.016513676031179595,\n \"acc_norm\": 0.4212290502793296,\n\
\ \"acc_norm_stderr\": 0.016513676031179595\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.7124183006535948,\n \"acc_stderr\": 0.02591780611714716,\n\
\ \"acc_norm\": 0.7124183006535948,\n \"acc_norm_stderr\": 0.02591780611714716\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.684887459807074,\n\
\ \"acc_stderr\": 0.02638527370346449,\n \"acc_norm\": 0.684887459807074,\n\
\ \"acc_norm_stderr\": 0.02638527370346449\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.691358024691358,\n \"acc_stderr\": 0.02570264026060374,\n\
\ \"acc_norm\": 0.691358024691358,\n \"acc_norm_stderr\": 0.02570264026060374\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.4858156028368794,\n \"acc_stderr\": 0.02981549448368206,\n \
\ \"acc_norm\": 0.4858156028368794,\n \"acc_norm_stderr\": 0.02981549448368206\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4380704041720991,\n\
\ \"acc_stderr\": 0.012671902782567652,\n \"acc_norm\": 0.4380704041720991,\n\
\ \"acc_norm_stderr\": 0.012671902782567652\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.625,\n \"acc_stderr\": 0.029408372932278746,\n \
\ \"acc_norm\": 0.625,\n \"acc_norm_stderr\": 0.029408372932278746\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6486928104575164,\n \"acc_stderr\": 0.01931267606578655,\n \
\ \"acc_norm\": 0.6486928104575164,\n \"acc_norm_stderr\": 0.01931267606578655\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6363636363636364,\n\
\ \"acc_stderr\": 0.04607582090719976,\n \"acc_norm\": 0.6363636363636364,\n\
\ \"acc_norm_stderr\": 0.04607582090719976\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.7020408163265306,\n \"acc_stderr\": 0.029279567411065674,\n\
\ \"acc_norm\": 0.7020408163265306,\n \"acc_norm_stderr\": 0.029279567411065674\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.7164179104477612,\n\
\ \"acc_stderr\": 0.03187187537919798,\n \"acc_norm\": 0.7164179104477612,\n\
\ \"acc_norm_stderr\": 0.03187187537919798\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.81,\n \"acc_stderr\": 0.03942772444036625,\n \
\ \"acc_norm\": 0.81,\n \"acc_norm_stderr\": 0.03942772444036625\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5060240963855421,\n\
\ \"acc_stderr\": 0.03892212195333045,\n \"acc_norm\": 0.5060240963855421,\n\
\ \"acc_norm_stderr\": 0.03892212195333045\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.847953216374269,\n \"acc_stderr\": 0.027539122889061456,\n\
\ \"acc_norm\": 0.847953216374269,\n \"acc_norm_stderr\": 0.027539122889061456\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.37576499388004897,\n\
\ \"mc1_stderr\": 0.01695458406021429,\n \"mc2\": 0.5350694411001227,\n\
\ \"mc2_stderr\": 0.01564827007130759\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7695343330702447,\n \"acc_stderr\": 0.011835872164836673\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.39878695981804396,\n \
\ \"acc_stderr\": 0.013487360477060839\n }\n}\n```"
repo_url: https://huggingface.co/nasiruddin15/Mistral-grok-instract-2-7B-slerp
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_03_29T21_02_01.281494
path:
- '**/details_harness|arc:challenge|25_2024-03-29T21-02-01.281494.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-03-29T21-02-01.281494.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_03_29T21_02_01.281494
path:
- '**/details_harness|gsm8k|5_2024-03-29T21-02-01.281494.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-03-29T21-02-01.281494.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_03_29T21_02_01.281494
path:
- '**/details_harness|hellaswag|10_2024-03-29T21-02-01.281494.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-03-29T21-02-01.281494.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_03_29T21_02_01.281494
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-29T21-02-01.281494.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-29T21-02-01.281494.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-29T21-02-01.281494.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-29T21-02-01.281494.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-29T21-02-01.281494.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-29T21-02-01.281494.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-29T21-02-01.281494.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-29T21-02-01.281494.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-29T21-02-01.281494.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-29T21-02-01.281494.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-29T21-02-01.281494.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-29T21-02-01.281494.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-29T21-02-01.281494.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-29T21-02-01.281494.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-29T21-02-01.281494.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-29T21-02-01.281494.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-29T21-02-01.281494.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-29T21-02-01.281494.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-29T21-02-01.281494.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-29T21-02-01.281494.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-29T21-02-01.281494.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-29T21-02-01.281494.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-29T21-02-01.281494.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-29T21-02-01.281494.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-29T21-02-01.281494.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-29T21-02-01.281494.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-29T21-02-01.281494.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-29T21-02-01.281494.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-29T21-02-01.281494.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-29T21-02-01.281494.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-29T21-02-01.281494.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-29T21-02-01.281494.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-29T21-02-01.281494.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-29T21-02-01.281494.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-29T21-02-01.281494.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-29T21-02-01.281494.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-29T21-02-01.281494.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-29T21-02-01.281494.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-29T21-02-01.281494.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-29T21-02-01.281494.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-29T21-02-01.281494.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-29T21-02-01.281494.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-29T21-02-01.281494.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-29T21-02-01.281494.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-29T21-02-01.281494.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-29T21-02-01.281494.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-29T21-02-01.281494.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-29T21-02-01.281494.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-29T21-02-01.281494.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-29T21-02-01.281494.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-29T21-02-01.281494.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-29T21-02-01.281494.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-29T21-02-01.281494.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-29T21-02-01.281494.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-29T21-02-01.281494.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-29T21-02-01.281494.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-29T21-02-01.281494.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-29T21-02-01.281494.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-29T21-02-01.281494.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-29T21-02-01.281494.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-29T21-02-01.281494.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-29T21-02-01.281494.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-29T21-02-01.281494.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-29T21-02-01.281494.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-29T21-02-01.281494.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-29T21-02-01.281494.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-29T21-02-01.281494.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-29T21-02-01.281494.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-29T21-02-01.281494.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-29T21-02-01.281494.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-29T21-02-01.281494.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-29T21-02-01.281494.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-29T21-02-01.281494.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-29T21-02-01.281494.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-29T21-02-01.281494.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-29T21-02-01.281494.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-29T21-02-01.281494.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-29T21-02-01.281494.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-29T21-02-01.281494.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-29T21-02-01.281494.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-29T21-02-01.281494.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-29T21-02-01.281494.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-29T21-02-01.281494.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-29T21-02-01.281494.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-29T21-02-01.281494.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-29T21-02-01.281494.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-29T21-02-01.281494.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-29T21-02-01.281494.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-29T21-02-01.281494.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-29T21-02-01.281494.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-29T21-02-01.281494.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-29T21-02-01.281494.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-29T21-02-01.281494.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-29T21-02-01.281494.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-29T21-02-01.281494.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-29T21-02-01.281494.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-29T21-02-01.281494.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-29T21-02-01.281494.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-29T21-02-01.281494.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-29T21-02-01.281494.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-29T21-02-01.281494.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-29T21-02-01.281494.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-29T21-02-01.281494.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-29T21-02-01.281494.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-29T21-02-01.281494.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-29T21-02-01.281494.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-29T21-02-01.281494.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-29T21-02-01.281494.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-29T21-02-01.281494.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-29T21-02-01.281494.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-29T21-02-01.281494.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-29T21-02-01.281494.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-29T21-02-01.281494.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-29T21-02-01.281494.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_03_29T21_02_01.281494
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-29T21-02-01.281494.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-29T21-02-01.281494.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_03_29T21_02_01.281494
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-29T21-02-01.281494.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-29T21-02-01.281494.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_03_29T21_02_01.281494
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-29T21-02-01.281494.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-29T21-02-01.281494.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_03_29T21_02_01.281494
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-29T21-02-01.281494.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-29T21-02-01.281494.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_03_29T21_02_01.281494
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-29T21-02-01.281494.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-29T21-02-01.281494.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_03_29T21_02_01.281494
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-29T21-02-01.281494.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-29T21-02-01.281494.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_03_29T21_02_01.281494
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-29T21-02-01.281494.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-29T21-02-01.281494.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_03_29T21_02_01.281494
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-29T21-02-01.281494.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-29T21-02-01.281494.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_03_29T21_02_01.281494
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-29T21-02-01.281494.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-29T21-02-01.281494.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_03_29T21_02_01.281494
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-29T21-02-01.281494.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-29T21-02-01.281494.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_03_29T21_02_01.281494
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-29T21-02-01.281494.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-29T21-02-01.281494.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_03_29T21_02_01.281494
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-29T21-02-01.281494.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-29T21-02-01.281494.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_03_29T21_02_01.281494
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-29T21-02-01.281494.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-29T21-02-01.281494.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_03_29T21_02_01.281494
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-29T21-02-01.281494.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-29T21-02-01.281494.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_03_29T21_02_01.281494
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-29T21-02-01.281494.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-29T21-02-01.281494.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_03_29T21_02_01.281494
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-29T21-02-01.281494.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-29T21-02-01.281494.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_03_29T21_02_01.281494
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-29T21-02-01.281494.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-29T21-02-01.281494.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_03_29T21_02_01.281494
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-29T21-02-01.281494.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-29T21-02-01.281494.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_03_29T21_02_01.281494
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-29T21-02-01.281494.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-29T21-02-01.281494.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_03_29T21_02_01.281494
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-29T21-02-01.281494.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-29T21-02-01.281494.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_03_29T21_02_01.281494
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-29T21-02-01.281494.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-29T21-02-01.281494.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_03_29T21_02_01.281494
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-29T21-02-01.281494.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-29T21-02-01.281494.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_03_29T21_02_01.281494
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-29T21-02-01.281494.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-29T21-02-01.281494.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_03_29T21_02_01.281494
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-29T21-02-01.281494.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-29T21-02-01.281494.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_03_29T21_02_01.281494
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-29T21-02-01.281494.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-29T21-02-01.281494.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_03_29T21_02_01.281494
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-29T21-02-01.281494.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-29T21-02-01.281494.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_03_29T21_02_01.281494
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-29T21-02-01.281494.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-29T21-02-01.281494.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_03_29T21_02_01.281494
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-29T21-02-01.281494.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-29T21-02-01.281494.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_03_29T21_02_01.281494
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-29T21-02-01.281494.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-29T21-02-01.281494.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_03_29T21_02_01.281494
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-29T21-02-01.281494.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-29T21-02-01.281494.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_03_29T21_02_01.281494
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-29T21-02-01.281494.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-29T21-02-01.281494.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_03_29T21_02_01.281494
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-29T21-02-01.281494.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-29T21-02-01.281494.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_03_29T21_02_01.281494
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-29T21-02-01.281494.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-29T21-02-01.281494.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_03_29T21_02_01.281494
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-29T21-02-01.281494.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-29T21-02-01.281494.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_03_29T21_02_01.281494
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-29T21-02-01.281494.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-29T21-02-01.281494.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_03_29T21_02_01.281494
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-29T21-02-01.281494.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-29T21-02-01.281494.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_03_29T21_02_01.281494
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-29T21-02-01.281494.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-29T21-02-01.281494.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_03_29T21_02_01.281494
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-29T21-02-01.281494.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-29T21-02-01.281494.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_03_29T21_02_01.281494
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-29T21-02-01.281494.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-29T21-02-01.281494.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_03_29T21_02_01.281494
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-29T21-02-01.281494.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-29T21-02-01.281494.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_03_29T21_02_01.281494
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-29T21-02-01.281494.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-29T21-02-01.281494.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_03_29T21_02_01.281494
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-29T21-02-01.281494.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-29T21-02-01.281494.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_03_29T21_02_01.281494
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-29T21-02-01.281494.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-29T21-02-01.281494.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_03_29T21_02_01.281494
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-29T21-02-01.281494.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-29T21-02-01.281494.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_03_29T21_02_01.281494
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-29T21-02-01.281494.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-29T21-02-01.281494.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_03_29T21_02_01.281494
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-29T21-02-01.281494.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-29T21-02-01.281494.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_03_29T21_02_01.281494
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-29T21-02-01.281494.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-29T21-02-01.281494.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_03_29T21_02_01.281494
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-29T21-02-01.281494.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-29T21-02-01.281494.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_03_29T21_02_01.281494
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-29T21-02-01.281494.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-29T21-02-01.281494.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_03_29T21_02_01.281494
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-29T21-02-01.281494.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-29T21-02-01.281494.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_03_29T21_02_01.281494
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-29T21-02-01.281494.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-29T21-02-01.281494.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_03_29T21_02_01.281494
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-29T21-02-01.281494.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-29T21-02-01.281494.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_03_29T21_02_01.281494
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-29T21-02-01.281494.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-29T21-02-01.281494.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_03_29T21_02_01.281494
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-29T21-02-01.281494.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-29T21-02-01.281494.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_03_29T21_02_01.281494
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-29T21-02-01.281494.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-29T21-02-01.281494.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_03_29T21_02_01.281494
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-29T21-02-01.281494.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-29T21-02-01.281494.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_03_29T21_02_01.281494
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-29T21-02-01.281494.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-29T21-02-01.281494.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_03_29T21_02_01.281494
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-29T21-02-01.281494.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-29T21-02-01.281494.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_03_29T21_02_01.281494
path:
- '**/details_harness|winogrande|5_2024-03-29T21-02-01.281494.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-03-29T21-02-01.281494.parquet'
- config_name: results
data_files:
- split: 2024_03_29T21_02_01.281494
path:
- results_2024-03-29T21-02-01.281494.parquet
- split: latest
path:
- results_2024-03-29T21-02-01.281494.parquet
---
# Dataset Card for Evaluation run of nasiruddin15/Mistral-grok-instract-2-7B-slerp
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [nasiruddin15/Mistral-grok-instract-2-7B-slerp](https://huggingface.co/nasiruddin15/Mistral-grok-instract-2-7B-slerp) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_nasiruddin15__Mistral-grok-instract-2-7B-slerp",
"harness_winogrande_5",
split="train")
```
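You can load the aggregated metrics the same way. A minimal sketch, using the "results" configuration and the "latest" split listed in the configs above:
```python
from datasets import load_dataset

# "results" stores the aggregated scores of the run;
# the "latest" split points to the most recent results file.
results = load_dataset(
    "open-llm-leaderboard/details_nasiruddin15__Mistral-grok-instract-2-7B-slerp",
    "results",
    split="latest",
)
```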
## Latest results
These are the [latest results from run 2024-03-29T21:02:01.281494](https://huggingface.co/datasets/open-llm-leaderboard/details_nasiruddin15__Mistral-grok-instract-2-7B-slerp/blob/main/results_2024-03-29T21-02-01.281494.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each one in the results and the "latest" split of each eval):
```python
{
"all": {
"acc": 0.6093138686193876,
"acc_stderr": 0.03301888332986811,
"acc_norm": 0.6143980622557884,
"acc_norm_stderr": 0.03368555834758194,
"mc1": 0.37576499388004897,
"mc1_stderr": 0.01695458406021429,
"mc2": 0.5350694411001227,
"mc2_stderr": 0.01564827007130759
},
"harness|arc:challenge|25": {
"acc": 0.5767918088737202,
"acc_stderr": 0.01443803622084803,
"acc_norm": 0.6279863481228669,
"acc_norm_stderr": 0.014124597881844461
},
"harness|hellaswag|10": {
"acc": 0.6318462457677754,
"acc_stderr": 0.004813177057496269,
"acc_norm": 0.8303126867157936,
"acc_norm_stderr": 0.0037459074237766957
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.27,
"acc_stderr": 0.044619604333847394,
"acc_norm": 0.27,
"acc_norm_stderr": 0.044619604333847394
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.5703703703703704,
"acc_stderr": 0.042763494943765995,
"acc_norm": 0.5703703703703704,
"acc_norm_stderr": 0.042763494943765995
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6644736842105263,
"acc_stderr": 0.03842498559395269,
"acc_norm": 0.6644736842105263,
"acc_norm_stderr": 0.03842498559395269
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.58,
"acc_stderr": 0.049604496374885836,
"acc_norm": 0.58,
"acc_norm_stderr": 0.049604496374885836
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6641509433962264,
"acc_stderr": 0.02906722014664483,
"acc_norm": 0.6641509433962264,
"acc_norm_stderr": 0.02906722014664483
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.6597222222222222,
"acc_stderr": 0.039621355734862175,
"acc_norm": 0.6597222222222222,
"acc_norm_stderr": 0.039621355734862175
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.42,
"acc_stderr": 0.049604496374885836,
"acc_norm": 0.42,
"acc_norm_stderr": 0.049604496374885836
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.52,
"acc_stderr": 0.050211673156867795,
"acc_norm": 0.52,
"acc_norm_stderr": 0.050211673156867795
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.29,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.29,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6184971098265896,
"acc_stderr": 0.03703851193099521,
"acc_norm": 0.6184971098265896,
"acc_norm_stderr": 0.03703851193099521
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.37254901960784315,
"acc_stderr": 0.04810840148082636,
"acc_norm": 0.37254901960784315,
"acc_norm_stderr": 0.04810840148082636
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.76,
"acc_stderr": 0.04292346959909283,
"acc_norm": 0.76,
"acc_norm_stderr": 0.04292346959909283
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5191489361702127,
"acc_stderr": 0.03266204299064678,
"acc_norm": 0.5191489361702127,
"acc_norm_stderr": 0.03266204299064678
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.42105263157894735,
"acc_stderr": 0.046446020912223177,
"acc_norm": 0.42105263157894735,
"acc_norm_stderr": 0.046446020912223177
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.6413793103448275,
"acc_stderr": 0.039966295748767186,
"acc_norm": 0.6413793103448275,
"acc_norm_stderr": 0.039966295748767186
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.41005291005291006,
"acc_stderr": 0.02533120243894443,
"acc_norm": 0.41005291005291006,
"acc_norm_stderr": 0.02533120243894443
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.40476190476190477,
"acc_stderr": 0.04390259265377563,
"acc_norm": 0.40476190476190477,
"acc_norm_stderr": 0.04390259265377563
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695235,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695235
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.6451612903225806,
"acc_stderr": 0.027218889773308757,
"acc_norm": 0.6451612903225806,
"acc_norm_stderr": 0.027218889773308757
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.4975369458128079,
"acc_stderr": 0.03517945038691063,
"acc_norm": 0.4975369458128079,
"acc_norm_stderr": 0.03517945038691063
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.62,
"acc_stderr": 0.048783173121456316,
"acc_norm": 0.62,
"acc_norm_stderr": 0.048783173121456316
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7272727272727273,
"acc_stderr": 0.03477691162163659,
"acc_norm": 0.7272727272727273,
"acc_norm_stderr": 0.03477691162163659
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7828282828282829,
"acc_stderr": 0.02937661648494562,
"acc_norm": 0.7828282828282829,
"acc_norm_stderr": 0.02937661648494562
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8186528497409327,
"acc_stderr": 0.02780703236068609,
"acc_norm": 0.8186528497409327,
"acc_norm_stderr": 0.02780703236068609
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.5717948717948718,
"acc_stderr": 0.025088301454694834,
"acc_norm": 0.5717948717948718,
"acc_norm_stderr": 0.025088301454694834
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.337037037037037,
"acc_stderr": 0.028820884666253255,
"acc_norm": 0.337037037037037,
"acc_norm_stderr": 0.028820884666253255
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6470588235294118,
"acc_stderr": 0.031041941304059285,
"acc_norm": 0.6470588235294118,
"acc_norm_stderr": 0.031041941304059285
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.3841059602649007,
"acc_stderr": 0.03971301814719197,
"acc_norm": 0.3841059602649007,
"acc_norm_stderr": 0.03971301814719197
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.7963302752293578,
"acc_stderr": 0.01726674208763079,
"acc_norm": 0.7963302752293578,
"acc_norm_stderr": 0.01726674208763079
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.4305555555555556,
"acc_stderr": 0.03376922151252336,
"acc_norm": 0.4305555555555556,
"acc_norm_stderr": 0.03376922151252336
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7892156862745098,
"acc_stderr": 0.0286265479124374,
"acc_norm": 0.7892156862745098,
"acc_norm_stderr": 0.0286265479124374
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.759493670886076,
"acc_stderr": 0.027820781981149685,
"acc_norm": 0.759493670886076,
"acc_norm_stderr": 0.027820781981149685
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6322869955156951,
"acc_stderr": 0.03236198350928275,
"acc_norm": 0.6322869955156951,
"acc_norm_stderr": 0.03236198350928275
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7633587786259542,
"acc_stderr": 0.03727673575596913,
"acc_norm": 0.7633587786259542,
"acc_norm_stderr": 0.03727673575596913
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.8264462809917356,
"acc_stderr": 0.03457272836917669,
"acc_norm": 0.8264462809917356,
"acc_norm_stderr": 0.03457272836917669
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7129629629629629,
"acc_stderr": 0.043733130409147614,
"acc_norm": 0.7129629629629629,
"acc_norm_stderr": 0.043733130409147614
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.6993865030674846,
"acc_stderr": 0.03602511318806771,
"acc_norm": 0.6993865030674846,
"acc_norm_stderr": 0.03602511318806771
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.49107142857142855,
"acc_stderr": 0.04745033255489123,
"acc_norm": 0.49107142857142855,
"acc_norm_stderr": 0.04745033255489123
},
"harness|hendrycksTest-management|5": {
"acc": 0.7281553398058253,
"acc_stderr": 0.044052680241409216,
"acc_norm": 0.7281553398058253,
"acc_norm_stderr": 0.044052680241409216
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8632478632478633,
"acc_stderr": 0.02250903393707779,
"acc_norm": 0.8632478632478633,
"acc_norm_stderr": 0.02250903393707779
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.73,
"acc_stderr": 0.0446196043338474,
"acc_norm": 0.73,
"acc_norm_stderr": 0.0446196043338474
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.7969348659003831,
"acc_stderr": 0.014385525076611578,
"acc_norm": 0.7969348659003831,
"acc_norm_stderr": 0.014385525076611578
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.6878612716763006,
"acc_stderr": 0.024946792225272314,
"acc_norm": 0.6878612716763006,
"acc_norm_stderr": 0.024946792225272314
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.4212290502793296,
"acc_stderr": 0.016513676031179595,
"acc_norm": 0.4212290502793296,
"acc_norm_stderr": 0.016513676031179595
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7124183006535948,
"acc_stderr": 0.02591780611714716,
"acc_norm": 0.7124183006535948,
"acc_norm_stderr": 0.02591780611714716
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.684887459807074,
"acc_stderr": 0.02638527370346449,
"acc_norm": 0.684887459807074,
"acc_norm_stderr": 0.02638527370346449
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.691358024691358,
"acc_stderr": 0.02570264026060374,
"acc_norm": 0.691358024691358,
"acc_norm_stderr": 0.02570264026060374
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.4858156028368794,
"acc_stderr": 0.02981549448368206,
"acc_norm": 0.4858156028368794,
"acc_norm_stderr": 0.02981549448368206
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4380704041720991,
"acc_stderr": 0.012671902782567652,
"acc_norm": 0.4380704041720991,
"acc_norm_stderr": 0.012671902782567652
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.625,
"acc_stderr": 0.029408372932278746,
"acc_norm": 0.625,
"acc_norm_stderr": 0.029408372932278746
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6486928104575164,
"acc_stderr": 0.01931267606578655,
"acc_norm": 0.6486928104575164,
"acc_norm_stderr": 0.01931267606578655
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6363636363636364,
"acc_stderr": 0.04607582090719976,
"acc_norm": 0.6363636363636364,
"acc_norm_stderr": 0.04607582090719976
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7020408163265306,
"acc_stderr": 0.029279567411065674,
"acc_norm": 0.7020408163265306,
"acc_norm_stderr": 0.029279567411065674
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.7164179104477612,
"acc_stderr": 0.03187187537919798,
"acc_norm": 0.7164179104477612,
"acc_norm_stderr": 0.03187187537919798
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.81,
"acc_stderr": 0.03942772444036625,
"acc_norm": 0.81,
"acc_norm_stderr": 0.03942772444036625
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5060240963855421,
"acc_stderr": 0.03892212195333045,
"acc_norm": 0.5060240963855421,
"acc_norm_stderr": 0.03892212195333045
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.847953216374269,
"acc_stderr": 0.027539122889061456,
"acc_norm": 0.847953216374269,
"acc_norm_stderr": 0.027539122889061456
},
"harness|truthfulqa:mc|0": {
"mc1": 0.37576499388004897,
"mc1_stderr": 0.01695458406021429,
"mc2": 0.5350694411001227,
"mc2_stderr": 0.01564827007130759
},
"harness|winogrande|5": {
"acc": 0.7695343330702447,
"acc_stderr": 0.011835872164836673
},
"harness|gsm8k|5": {
"acc": 0.39878695981804396,
"acc_stderr": 0.013487360477060839
}
}
```
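As a quick illustration, the per-task MMLU (hendrycksTest) accuracies can be summarized from a dictionary shaped like the JSON above. A sketch, assuming that dictionary has been parsed into a variable `results`:
```python
# `results` is assumed to hold the parsed dict shown above
mmlu_accs = {
    task: metrics["acc"]
    for task, metrics in results.items()
    if task.startswith("harness|hendrycksTest")
}
mean_acc = sum(mmlu_accs.values()) / len(mmlu_accs)
print(f"{len(mmlu_accs)} MMLU tasks, mean acc = {mean_acc:.4f}")
```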
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
mask-distilled-one-sec-cv12/chunk_251 | ---
dataset_info:
features:
- name: logits
sequence: float32
- name: mfcc
sequence:
sequence: float64
splits:
- name: train
num_bytes: 815478708
num_examples: 160149
download_size: 832005371
dataset_size: 815478708
---
# Dataset Card for "chunk_251"
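The `dataset_info` above describes the schema. A minimal loading sketch (assuming the standard `datasets` API; the feature names come from the YAML header):
```python
from datasets import load_dataset

# per the dataset_info: "logits" is a float32 sequence and
# "mfcc" is a nested float64 sequence, with a single "train" split
ds = load_dataset("mask-distilled-one-sec-cv12/chunk_251", split="train")
example = ds[0]
print(len(example["logits"]), len(example["mfcc"]))
```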
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
astroy/WHU-Urban-3D | ---
license: cc-by-nc-sa-4.0
---
<a href="https://hydra.cc/"><img alt="Config: Hydra" src="https://img.shields.io/badge/dataset-whu3d-green"></a> <a href="https://pytorch.org/get-started/locally/"><img alt="PyTorch" src="https://img.shields.io/badge/PyTorch-ee4c2c?logo=pytorch&logoColor=white"></a>
# Installation
In order to use the pywhu3d tool, you need to install the pywhu3d library for your interpreter. We recommend you use Python 3.7 to follow this tutorial.
```zsh
# this will install the latest version of pywhu3d
pip install pywhu3d
```
# Usage
## Initialization
Create a WHU3D object:
```python
from pywhu3d.tool import WHU3D
data_root = '/data/datasets/whu'
scenes = ['0404', '0940']
# whu3d = WHU3D(data_root=data_root, data_type='mls', format='txt')
whu3d = WHU3D(data_root=data_root, data_type='mls', format='h5', scenes=scenes)
```
Parameters:
- **data_root**: [data root folder]
- **data_type**: `als`, `mls`, `pc`, `img`
- **format**: `txt`, `ply`, `npy`, `h5`, `pickle`
- **[optional] scenes**: a list of scenes; if not specified, all files in the data folder will be used
The structure of the data folder should be like this:
```
data_root
├── images
├── als
│ ├── h5
│ │ ├── [scene_1].h5
│ │ ├── [scene_2].h5
│ │ └── [scene_*].h5
│ └── [optional] pkl/npy/pth
└── mls
├── h5
│ ├── [scene_1].h5
│ ├── [scene_2].h5
│ └── [scene_*].h5
└── [optional] pkl/npy/pth
```
It is also recommended to create a whu3d object with the default split scenes, e.g. by using `whu3d.train_split`.
```python
# print(whu3d.split.val)
whu3d = WHU3D(data_root=data_root, data_type='mls', format='txt', scenes=whu3d.val_split)
```
Then some of the attributes could be directly accessed, including `data_root`, `data_type`, `scenes`, and `download_link`.
```python
# e.g., you could print current scenes
print(whu3d.scenes)
```
## Attributes
The attributes of whu3d may differ depending on your operations (e.g., after applying the `compute_normals` function, the attributes may include `normals`, which did not exist before). Nonetheless, you could always use the `list_attributes` function to see the attributes you can currently access.
```python
# this command will show you a table with all the attributes
# that you could currently use.
whu3d.list_attributes()
```
You could simply get a specific attribute of all scenes by using the `get_attribute` function.
```python
# this function will return a list of the attributes
attr = whu3d.get_attribute('coords')
```
### Data
You could access the data of a specific scene by using `whu3d.data[scene][attribute]`.
```python
xyz = whu3d.data['0414']['coords']
```
### Labels
Labels could also be directly accessed.
```python
semantics = whu3d.labels['0414']['semantics']
instances = whu3d.labels['0414']['instances']
```
If you have interpreted the labels by using the `interprete_labels` function, you could also get the interpreted labels.
```python
semantics = whu3d.interpreted_labels['0414']['semantics']
instances = whu3d.interpreted_labels['0414']['instances']
```
## Visualization
### Point cloud
You can visualize a specific scene or a list of scenes using the `vis` function. By default, this function shows both the point cloud and the image frames, and the points are randomly sampled with `sample_ratio = 0.01` for faster visualization. Points are colored according to their height if `color` is not specified, or you could choose a specific coloring, including intensity, normals, semantics, instances, and other features (some features should be computed first via whu3d functions if they do not exist yet; you could use `whu3d.list_attributes()` to check the current attributes first).
```python
# This will show sampled points and images
whu3d.vis(scene='0414', type='pc', color='intensity')
# Show all the points
whu3d.vis(scene='0414', sample_ratio=1.0, type='pc', color='intensity')
# if you want to show normals, please set 'show_normals' to True
whu3d.vis(scene='0414', type='pc', color='normals', show_normals=True)
```
or you can use a remote visualization function that allows you to visualize the scene on your local machine if the script is run on a remote server.
```python
# This function should be used if you want to visualize points
# and the script is run on a remote machine.
whu3d.remote_vis(scene='0424', type='pc', color='intensity')
```
Before running the `remote_vis` function on your remote machine, you should start another SSH connection to your remote machine, and launch open3d on your local machine.
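If you prefer not to manage the SSH tunnel and local viewer yourself, a rough alternative is Open3D's own WebRTC visualizer, which serves the viewer over HTTP. The following is a minimal sketch using plain Open3D (an assumption, not pywhu3d's `remote_vis`), requiring Open3D >= 0.13 and an SSH forward of port 8888:
```python
import numpy as np
import open3d as o3d

# Assumption: plain Open3D WebRTC visualization, not pywhu3d's remote_vis.
# Run this on the remote machine, forward the port with
# `ssh -L 8888:localhost:8888 user@remote`, then open http://localhost:8888.
o3d.visualization.webrtc_server.enable_webrtc()

# Hypothetical point cloud; with pywhu3d you would build it from
# whu3d.data[scene]['coords'] instead.
pcd = o3d.geometry.PointCloud()
pcd.points = o3d.utility.Vector3dVector(np.random.rand(10000, 3))
o3d.visualization.draw(pcd)
```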
### Images
Similarly, you could use the `vis` function to see a series of images of a specific scene.
```python
whu3d.vis(scene='0414', type='img')
```
### BEV
[Will be available soon.]
### Renderings
[Will be available soon.]
### Labels
If you want to visualize the labels of semantics or instances, you must run the `interprete_labels` function first (please refer to the 'labels interpretation' section).
```python
# you should run this function first to interpret the labels
info, labels = whu3d.interprete_labels()
# you could visualize semantics with specified colors
whu3d.vis(scene='0414', type='pc', color='semantics')
# or you could visualize instances with random colors
whu3d.vis(scene='0414', type='pc', color='instances')
```
## Export
Note that all the `export` functions export data to `self.data_path` by default; you had better not change this if you want to load the data later via pywhu3d.
### Export data
You could export whu3d to other formats, including las, ply, numpy, pickle, h5py, images, etc., just by using the corresponding `export_[type]` function.
```python
scenes = ['0404', '0940']
whu3d.export_h5(output='.')
whu3d.export_images(output='.', scenes=scenes)
# this will export las to the '[self.data_path]/las' folder if
# output is not specified, you can also specify 'scenes'
whu3d.export_las()
```
If `scenes` is not specified, it will export all the scenes by default.
### Export labels
The `export_labels` function can export raw labels or interpreted labels.
```python
# this will export '[scene].labels' files to your 'output' folder
whu3d.export_labels(output='./labels', scenes=scenes)
# whu3d.export_labels()
```
### Export statistics
You could also export detailed statistics of the data and labels to Excel by using the `export_statistics` function.
```python
whu3d.export_statistics(output='./whu3d_statistics.xlsx')
```
For the export of metrics, you could refer to the 'Evaluation' part.
### Custom export
You could use the `export` function to export a specified type of data.
```python
whu3d.export(output='', attribute='interpreted_labels')
```
## Labels interpretation
You could use the `interprete_labels` function to merge similar categories and remap the labels to consecutive numbers like 0, 1, 2, ...
```python
# this will interpret the labels and create the 'gt' attribute
whu3d.interprete_labels()
```
After applying this function, you could access the interpreted labels by using `whu3d.gt`. For more information, you could use the `get_label_map` function to see the interpretation table.
```python
# this will output a table showing the detailed information
# this only shows you the information of semantics
whu3d.get_label_map()
```
### Block division
If you want to divide the whole scene into rectangular blocks along the XY plane, you could use the `save_divided_blocks` function. This function will directly save the divided blocks into `.h5` files.
```python
# this will divide the scene into 10m * 10m blocks with 5m overlap
whu3d.save_divided_blocks(out_dir='', num_points=4096, size=(10, 10), stride=5, threshold=100, show_points=False)
```
### Custom interpretation
If you want to use your own file to interpret the labels, follow these steps (a combined sketch follows the steps):
Step 1: Create `label_interpretion.json`. This file should include:
```json
{
"sem_no_list_ins": "2, 3, 7",
"sem_label_mapping": [
{"175": "2"},
{"18": "5"}
]
}
```
`sem_no_list_ins` excludes the categories which should not be interpreted as instances;
`sem_label_mapping` specifies the mapping rules of semantic labels.
Step 2: Put the JSON file into the data root folder.
Step 3: Perform the `interprete_labels` function.
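Putting the three steps together, a minimal sketch could look like this (the mapping values are placeholders rather than real whu3d categories, and a `whu3d` object created as above is assumed):
```python
import json
import os

# Step 1: create the interpretation file (placeholder mapping values)
interpretation = {
    "sem_no_list_ins": "2, 3, 7",
    "sem_label_mapping": [{"175": "2"}, {"18": "5"}],
}

# Step 2: put the JSON file into the data root folder
data_root = '/data/datasets/whu'
with open(os.path.join(data_root, 'label_interpretion.json'), 'w') as f:
    json.dump(interpretation, f, indent=2)

# Step 3: perform the interprete_labels function
whu3d.interprete_labels()
```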
## Evaluation
The interpretation of predicted results should be consistent with that of the interpreted labels.
### Semantic segmentation evaluation
You could use the evaluation tool as in the 'instance segmentation evaluation' section, just by replacing the instance results with semantics.
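As a sketch of one possible reading (an assumption, since the exact layout expected for semantic-only evaluation is not documented here), you could keep the `(num_points, 2)` array layout from the instance case and fill both columns with the semantic predictions:
```python
import numpy as np

# Assumption: sem_preds is a list of 1D per-point semantic prediction
# arrays, one per scene (toy data here); we reuse the instance-style
# (num_points, 2) layout with semantics standing in for instances.
sem_preds = [np.random.randint(0, 5, size=1000)]  # toy placeholder
preds = [np.stack([sem, sem], axis=1) for sem in sem_preds]
evaluator = whu3d.create_evaluator(preds)
evaluator.compute_metrics()
```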
### Instance segmentation evaluation
For instance segmentation evaluation, you should use our `evaluation.Evaluator` tool.
```python
# define an evaluator for evaluation
# preds is a list with num_scenes items:
# [scene_1_pred_arr, ..., scene_k_pred_arr]. Each item is a 2D
# array with shape (num_points, 2), of which the first column
# is the semantic prediction and the second the instance prediction
# there are two ways to create an evaluator
# first way
evaluator = whu3d.create_evaluator(preds)
# second way
from pywhu3d.evluation import Evaluator
evaluator = Evaluator(whu3d, preds)
# then you could use evaluator functions
evaluator.compute_metrics()
```
You could get metrics, including:
- instance metrics: MUCov, MWCov, Pre, Rec, F1-score
- semantic metrics: oAcc, mAcc, mIoU
```python
print(evaluator.info)
print(evaluator.eval_list)
print(evaluator.eval_table)
```
You could also export evaluation results.
```python
# this will export an Excel file with detailed metrics
evaluator.export(output_dir='./')
```
### Custom evaluation
If you want to define a different list of ground truth labels instead of using the default labels, you could use the `set_gt` function to set the ground truth labels:
```python
from pywhu3d.evluation import Evaluator
evaluator = Evaluator(whu3d, preds)
# use this script to define your custom labels
# truths: a list of scenes [scene_1_gt_arr, ..., scene_k_gt_arr]
# gt_arr is a numpy array with shape (num_points, 2)
evaluator.set_gt(truths)
# then you could use evaluator functions
evaluator.compute_metrics()
```
# Custom dataset
You can also use the whu3d tool with your own dataset and get all pywhu3d features, simply by using the `format` function.
```python
data_root = '/data/datasets/you_custom_dataset'
scenes = ['scene1', 'scene2']
whu3d = WHU3D(data_root=data_root, data_type='mls', format='txt', scenes=scenes)
# this will format your data as whu3d format
# 'attributes' should be consistent with your input data
in_attributes = ['coords', 'semantics', 'instances', 'intensities']
whu3d.format(attributes=in_attributes)
```
After applying the `format` function, you could use all the features the whu3d tool provides, just as with the whu3d dataset.
## Demo
This is a demo for preprocessing the MLS dataset.
```python
from pywhu3d.tool import WHU3D
data_root = 'data/whu-dataset'
mls_scenes = ['0404', '0940']
# als_scenes = ['5033', '3922']
# whu3d = WHU3D(data_root=data_root, data_type='mls', format='txt')
whu3d = WHU3D(data_root=data_root, data_type='mls', format='h5', scenes=mls_scenes)
whu3d.norm_coords()
# self.compute_normals()
whu3d.interprete_labels()
whu3d.compute_normals(radius=0.8)
whu3d.save_divided_blocks(out_dir='', num_points=60000, size=(20, 20), stride=10, threshold=100, show_points=False)
```
# More
`pywhu3d` is a tool to manage the whu3d dataset, with limited ability to process the dataset (e.g., segmentation). If you need more features for processing outdoor scene datasets, you could refer to [will soon be available]. For more details about our dataset, please refer to our website.
|
davidberenstein1957/ultra_feedback_dutch_cleaned_helm_instruct | ---
dataset_info:
features:
- name: GEITje-7B-ultra
dtype: string
- name: TowerInstruct-13B-v0.1
dtype: string
- name: TowerInstruct-7B-v0.2
dtype: string
- name: geitje-7b-chat
dtype: string
- name: gpt-4-turbo
dtype: string
- name: llama-2-13b-chat-dutch
dtype: string
- name: prompt_english
dtype: string
- name: prompt
dtype: string
- name: labelling_model
dtype: string
- name: labelling_prompt
list:
- name: content
dtype: string
- name: role
dtype: string
- name: raw_labelling_response
dtype: string
- name: rating_Helpfulness_GEITje-7B-ultra
dtype: float64
- name: rationale_Helpfulness_GEITje-7B-ultra
dtype: string
- name: generations_Helpfulness_GEITje-7B-ultra
dtype: 'null'
- name: rating_Helpfulness_TowerInstruct-13B-v0.1
dtype: float64
- name: rationale_Helpfulness_TowerInstruct-13B-v0.1
dtype: string
- name: generations_Helpfulness_TowerInstruct-13B-v0.1
dtype: 'null'
- name: rating_Helpfulness_TowerInstruct-7B-v0.2
dtype: float64
- name: rationale_Helpfulness_TowerInstruct-7B-v0.2
dtype: string
- name: generations_Helpfulness_TowerInstruct-7B-v0.2
dtype: 'null'
- name: rating_Helpfulness_geitje-7b-chat
dtype: float64
- name: rationale_Helpfulness_geitje-7b-chat
dtype: string
- name: generations_Helpfulness_geitje-7b-chat
dtype: 'null'
- name: rating_Helpfulness_gpt-4-turbo
dtype: float64
- name: rationale_Helpfulness_gpt-4-turbo
dtype: string
- name: generations_Helpfulness_gpt-4-turbo
dtype: 'null'
- name: rating_Helpfulness_llama-2-13b-chat-dutch
dtype: float64
- name: rationale_Helpfulness_llama-2-13b-chat-dutch
dtype: string
- name: generations_Helpfulness_llama-2-13b-chat-dutch
dtype: 'null'
splits:
- name: train
num_bytes: 2389654
num_examples: 100
download_size: 1380173
dataset_size: 2389654
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
ArasAyen/pc9Cap | ---
dataset_info:
features:
- name: image
dtype: image
- name: text
dtype: string
splits:
- name: train
num_bytes: 265997109.0
num_examples: 302
download_size: 262523050
dataset_size: 265997109.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "pc9Cap"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
griffin/cnn-diverse-gpt-3.5-summaries | ---
dataset_info:
features:
- name: id
dtype: string
- name: source
dtype: string
- name: source_edu_annotated
dtype: string
- name: reference
dtype: string
- name: candidates
list:
- name: method
dtype: string
- name: method_beam
dtype: int64
- name: prediction
dtype: string
- name: rouge1
dtype: float64
- name: rouge2
dtype: float64
- name: rougeL
dtype: float64
- name: rougeLsum
dtype: float64
- name: vanilla_prompt
dtype: string
- name: pga_prompts
sequence: string
- name: pga_edu_extract_idxs
sequence:
sequence: int64
splits:
- name: train
num_bytes: 226053728
num_examples: 1000
download_size: 91791746
dataset_size: 226053728
---
# Dataset Card for "cnn-diverse-gpt-3.5-summaries"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Francesco/aerial-cows | ---
dataset_info:
features:
- name: image_id
dtype: int64
- name: image
dtype: image
- name: width
dtype: int32
- name: height
dtype: int32
- name: objects
sequence:
- name: id
dtype: int64
- name: area
dtype: int64
- name: bbox
sequence: float32
length: 4
- name: category
dtype:
class_label:
names:
'0': aerial-cows
'1': cow
annotations_creators:
- crowdsourced
language_creators:
- found
language:
- en
license:
- cc
multilinguality:
- monolingual
size_categories:
- 1K<n<10K
source_datasets:
- original
task_categories:
- object-detection
task_ids: []
pretty_name: aerial-cows
tags:
- rf100
---
# Dataset Card for aerial-cows
**The original COCO dataset is stored at `dataset.tar.gz`**
## Dataset Description
- **Homepage:** https://universe.roboflow.com/object-detection/aerial-cows
- **Point of Contact:** francesco.zuppichini@gmail.com
### Dataset Summary
aerial-cows
### Supported Tasks and Leaderboards
- `object-detection`: The dataset can be used to train a model for Object Detection.
### Languages
English
## Dataset Structure
### Data Instances
A data point comprises an image and its object annotations.
```
{
'image_id': 15,
'image': <PIL.JpegImagePlugin.JpegImageFile image mode=RGB size=640x640 at 0x2373B065C18>,
'width': 964043,
'height': 640,
'objects': {
'id': [114, 115, 116, 117],
'area': [3796, 1596, 152768, 81002],
'bbox': [
[302.0, 109.0, 73.0, 52.0],
[810.0, 100.0, 57.0, 28.0],
[160.0, 31.0, 248.0, 616.0],
[741.0, 68.0, 202.0, 401.0]
],
'category': [4, 4, 0, 0]
}
}
```
### Data Fields
- `image_id`: the image id
- `image`: `PIL.Image.Image` object containing the image. Note that when accessing the image column: `dataset[0]["image"]` the image file is automatically decoded. Decoding of a large number of image files might take a significant amount of time. Thus it is important to first query the sample index before the `"image"` column, *i.e.* `dataset[0]["image"]` should **always** be preferred over `dataset["image"][0]`
- `width`: the image width
- `height`: the image height
- `objects`: a dictionary containing bounding box metadata for the objects present on the image
- `id`: the annotation id
- `area`: the area of the bounding box
- `bbox`: the object's bounding box (in the [coco](https://albumentations.ai/docs/getting_started/bounding_boxes_augmentation/#coco) format); a conversion sketch follows after this list
- `category`: the object's category.
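Since `bbox` is in COCO `[x, y, width, height]` format, converting to corner coordinates (e.g. for plotting) is a one-liner. In the sketch below, `coco_to_corners` is a hypothetical helper, not part of the dataset, and a `train` split is assumed:
```python
from datasets import load_dataset

def coco_to_corners(bbox):
    """Convert a COCO [x, y, width, height] box to [x0, y0, x1, y1]."""
    x, y, w, h = bbox
    return [x, y, x + w, y + h]

ds = load_dataset("Francesco/aerial-cows", split="train")  # assumes a train split
example = ds[0]
print([coco_to_corners(b) for b in example["objects"]["bbox"]])
```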
#### Who are the annotators?
Annotators are Roboflow users
## Additional Information
### Licensing Information
See original homepage https://universe.roboflow.com/object-detection/aerial-cows
### Citation Information
```
@misc{ aerial-cows,
title = { aerial cows Dataset },
type = { Open Source Dataset },
author = { Roboflow 100 },
howpublished = { \url{ https://universe.roboflow.com/object-detection/aerial-cows } },
url = { https://universe.roboflow.com/object-detection/aerial-cows },
journal = { Roboflow Universe },
publisher = { Roboflow },
year = { 2022 },
month = { nov },
note = { visited on 2023-03-29 },
}"
```
### Contributions
Thanks to [@mariosasko](https://github.com/mariosasko) for adding this dataset. |
NYTK/HuWNLI | ---
annotations_creators:
- found
language_creators:
- found
- expert-generated
language:
- hu
license:
- cc-by-sa-4.0
multilinguality:
- monolingual
size_categories:
- unknown
source_datasets:
- extended|other
task_categories:
- other
task_ids:
- coreference-resolution
pretty_name: HuWNLI
tags:
- structure-prediction
---
# Dataset Card for HuWNLI
## Table of Contents
- [Table of Contents](#table-of-contents)
- [Dataset Description](#dataset-description)
- [Dataset Summary](#dataset-summary)
- [Supported Tasks and Leaderboards](#supported-tasks-and-leaderboards)
- [Languages](#languages)
- [Dataset Structure](#dataset-structure)
- [Data Instances](#data-instances)
- [Data Fields](#data-fields)
- [Data Splits](#data-splits)
- [Dataset Creation](#dataset-creation)
- [Curation Rationale](#curation-rationale)
- [Source Data](#source-data)
- [Annotations](#annotations)
- [Personal and Sensitive Information](#personal-and-sensitive-information)
- [Considerations for Using the Data](#considerations-for-using-the-data)
- [Social Impact of Dataset](#social-impact-of-dataset)
- [Discussion of Biases](#discussion-of-biases)
- [Other Known Limitations](#other-known-limitations)
- [Additional Information](#additional-information)
- [Dataset Curators](#dataset-curators)
- [Licensing Information](#licensing-information)
- [Citation Information](#citation-information)
- [Contributions](#contributions)
## Dataset Description
- **Homepage:**
- **Repository:**
[HuWNLI dataset](https://github.com/nytud/HuWNLI)
- **Paper:**
- **Leaderboard:**
- **Point of Contact:**
[lnnoemi](mailto:ligeti-nagy.noemi@nytud.hu)
### Dataset Summary
This is the dataset card for the Hungarian translation of the Winograd schemata formatted as an inference task. A Winograd schema is a pair of sentences that differ in only one or two words and that contain an ambiguity that is resolved in opposite ways in the two sentences and requires the use of world knowledge and reasoning for its resolution (Levesque et al. 2012). This dataset is also part of the Hungarian Language Understanding Evaluation Benchmark Kit [HuLU](https://hulu.nytud.hu). The corpus was created by translating and manually curating the original English Winograd schemata. The NLI format was created by replacing the ambiguous pronoun with each possible referent (the method is described in GLUE's paper, Wang et al. 2019). We extended the set of sentence pairs derived from the schemata with the translation of the sentence pairs that - together with the Winograd schema sentences - build up the WNLI dataset of GLUE.
### Languages
The BCP-47 code for Hungarian, the only represented language in this dataset, is hu-HU.
## Dataset Structure
### Data Instances
For each instance, there is an orig_id, an id, two sentences and a label.
An example:
```
{"orig_id": "4",
"id": "4",
"sentence1": "A férfi nem tudta felemelni a fiát, mert olyan nehéz volt.",
"sentence2": "A fia nehéz volt.",
"Label": "1"
}
```
### Data Fields
- orig_id: the original id of this sentence pair (more precisely, its English counterpart's) in GLUE's WNLI dataset;
- id: unique id of the instances;
- sentence1: the premise;
- sentence2: the hypothesis;
- label: "1" if sentence2 is entailed by sentence1, and "0" otherwise.
### Data Splits
The data is distributed in three splits: training set (562), development set (59) and test set (134). The splits follow GLUE's WNLI splits but contain fewer instances, as many sentence pairs had to be discarded for being untranslatable to Hungarian. The train and development sets have been extended with NLI sentence pairs formed from the Hungarian translation of 6 Winograd schemata left out of the original WNLI dataset.
The test set's sentence pairs are translated from GLUE's WNLI test set. That set was distributed without labels, so 3 annotators annotated the Hungarian sentence pairs.
The test set of HuWNLI is also distributed without labels. To evaluate your model, please [contact us](mailto:ligeti-nagy.noemi@nytud.hu), or check [HuLU's website](https://hulu.nytud.hu) for an automatic evaluation (this feature is under construction at the moment).
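For reference, a minimal loading sketch with the `datasets` library (the exact split names are an assumption; the card describes train/development/test splits):
```python
from datasets import load_dataset

# Assumption: standard HF split names; the dev split may be named "validation".
huwnli = load_dataset("NYTK/HuWNLI")
print(huwnli["train"][0])
```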
## Dataset Creation
### Source Data
#### Initial Data Collection and Normalization
The data is a translation of the English Winograd schemata and the additional sentence pairs of GLUE's WNLI. Each schema and sentence pair was translated by a human translator. Each schema was manually curated by a linguistic expert. The schemata were transformed into nli format by a linguistic expert.
During the adaption method, we found two erroneous labels in GLUE's WNLI's train set (id 347 and id 464). We corrected them in our dataset.
## Additional Information
Average human performance on the test set is 92.78% (accuracy).
### Licensing Information
HuWNLI is released under the Creative Commons Attribution-ShareAlike 4.0 International License.
### Citation Information
If you use this resource or any part of its documentation, please refer to:
Ligeti-Nagy, N., Héja, E., Laki, L. J., Takács, D., Yang, Z. Gy. and Váradi, T. (2023) Hát te mekkorát nőttél! - A HuLU első életéve új adatbázisokkal és webszolgáltatással \[Look at how much you have grown! - The first year of HuLU with new databases and with webservice\]. In: Berend, G., Gosztolya, G. and Vincze, V. (eds), XIX. Magyar Számítógépes Nyelvészeti Konferencia. Szeged, Szegedi Tudományegyetem, Informatikai Intézet. 217-230.
```
@inproceedings{ligetinagy2023hulu,
title={Hát te mekkorát nőttél! - A HuLU első életéve új adatbázisokkal és webszolgáltatással},
author={Ligeti-Nagy, N. and Héja, E. and Laki, L. J. and Takács, D. and Yang, Z. Gy. and Váradi, T.},
booktitle={XIX. Magyar Számítógépes Nyelvészeti Konferencia},
year={2023},
editors = {Berend, Gábor and Gosztolya, Gábor and Vincze, Veronika},
address = {Szeged},
publisher = {JATEPress},
pages = {217–230}
}
```
Ligeti-Nagy, N., Ferenczi, G., Héja, E., Jelencsik-Mátyus, K., Laki, L. J., Vadász, N., Yang, Z. Gy. and Váradi, T. (2022) HuLU: magyar nyelvű benchmark adatbázis kiépítése a neurális nyelvmodellek kiértékelése céljából \[HuLU: Hungarian benchmark dataset to evaluate neural language models\]. In: Berend, Gábor and Gosztolya, Gábor and Vincze, Veronika (eds), XVIII. Magyar Számítógépes Nyelvészeti Konferencia. JATEPress, Szeged. 431–446.
```
@inproceedings{ligetinagy2022hulu,
title={HuLU: magyar nyelvű benchmark adatbázis kiépítése a neurális nyelvmodellek kiértékelése céljából},
author={Ligeti-Nagy, N. and Ferenczi, G. and Héja, E. and Jelencsik-Mátyus, K. and Laki, L. J. and Vadász, N. and Yang, Z. Gy. and Váradi, T.},
booktitle={XVIII. Magyar Számítógépes Nyelvészeti Konferencia},
year={2022},
editors = {Berend, Gábor and Gosztolya, Gábor and Vincze, Veronika},
address = {Szeged},
publisher = {JATEPress},
pages = {431–446}
}
```
and to:
Levesque, Hector, Davis, Ernest, Morgenstern, Leora (2012) The Winograd Schema Challenge. In: Thirteenth International Conference on the Principles of Knowledge Representation and Reasoning.
```
@inproceedings{levesque2012winograd,
title={The Winograd Schema Challenge},
author={Levesque, Hector and Davis, Ernest and Morgenstern, Leora},
booktitle={Thirteenth International Conference on the Principles of Knowledge Representation and Reasoning},
year={2012},
organization={Citeseer}
}
```
### Contributions
Thanks to [lnnoemi](https://github.com/lnnoemi) for adding this dataset. |
tianyang/repobench-c | ---
language_creators:
- found
license:
- cc-by-nc-nd-4.0
multilinguality:
- multilingual
pretty_name: RepoBench-Completion
source_datasets:
- original
task_categories:
- text-generation
task_ids:
- document-retrieval
tags:
- code
size_categories:
- 100K<n<1M
---
# Dataset Card for RepoBench-C
## Dataset Description
- **Homepage:** https://github.com/Leolty/repobench
- **Paper:** https://arxiv.org/abs/2306.03091
## Dataset Summary
**RepoBench-C (Completion)** is a subtask of **RepoBench**([GitHub](https://github.com/Leolty/repobench), [arXiv](https://arxiv.org/abs/2306.03091)), focusing on the prediction of the next line of code, given in-file context (including several preceding lines and import statements), and cross-file context.
## Settings
- `cff`: short for cross_file_first, indicating that the cross-file module in the next line is used for the first time in the current file.
- `cfr`: short for cross_file_random, indicating that the cross-file module in the next line has already been used earlier in the current file.
- `if`: short for in_file, indicating that the next line does not contain any cross-file module.
## Supported Tasks
- `python_cff`: python code prediction with cross-file-first setting.
- `python_cfr`: python code prediction with cross-file-random setting.
- `python_if`: python code prediction with in-file setting.
- `java_cff`: java code prediction with cross-file-first setting.
- `java_cfr`: java code prediction with cross-file-random setting.
- `java_if`: java code prediction with in-file setting.
## Loading Data
For example, if you want to load the `test` set to test your model on `Python` code prediction with `cff` setting, you can do the following:
```python
from datasets import load_dataset
dataset = load_dataset("tianyang/repobench-c", "python_cff", split="test")
```
> Note: The `split` argument is optional. If not provided, the entire dataset will be loaded.
## Dataset Structure
```json
{
"repo_name": "repository name of the data point",
"file_path": "path/to/file",
"context": "commented and concatenated cross-file context",
"import_statement": "all import statements in the file",
"code": "the code for next-line prediction",
"prompt": "cross-file context + import statements + in-file code",
"next_line": "the next line of the code"
}
```
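Given this structure, a rough exact-match evaluation sketch might look as follows; `generate_next_line` is a placeholder for your own model, not part of RepoBench:
```python
from datasets import load_dataset

def generate_next_line(prompt: str) -> str:
    # Placeholder: replace with your model; this dummy predicts nothing.
    return ""

dataset = load_dataset("tianyang/repobench-c", "python_cff", split="test")
correct = sum(
    generate_next_line(ex["prompt"]).strip() == ex["next_line"].strip()
    for ex in dataset
)
print(f"Exact match: {correct / len(dataset):.2%}")
```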
## Licensing Information
CC BY-NC-ND 4.0
## Citation Information
```bibtex
@misc{liu2023repobench,
title={RepoBench: Benchmarking Repository-Level Code Auto-Completion Systems},
author={Tianyang Liu and Canwen Xu and Julian McAuley},
year={2023},
eprint={2306.03091},
archivePrefix={arXiv},
primaryClass={cs.CL}
}
```
## Contributions
Thanks to [@Leolty](https://github.com/Leolty) for adding this dataset. |
vwxyzjn/ultrachat_200k_filtered_1708035667 | ---
dataset_info:
features:
- name: prompt
dtype: string
- name: prompt_id
dtype: string
- name: messages
list:
- name: content
dtype: string
- name: role
dtype: string
- name: query
list:
- name: content
dtype: string
- name: role
dtype: string
- name: query_token
sequence: int64
- name: query_reference_response
list:
- name: content
dtype: string
- name: role
dtype: string
- name: query_reference_response_token
sequence: int64
- name: query_reference_response_token_len
dtype: int64
- name: query_token_len
dtype: int64
- name: reference_response
struct:
- name: content
dtype: string
- name: role
dtype: string
- name: reference_response_token
sequence: int64
- name: reference_response_token_len
dtype: int64
splits:
- name: test_sft
num_bytes: 1982888370.9168758
num_examples: 22991
- name: train_sft
num_bytes: 17846869528.524822
num_examples: 206698
download_size: 3301659997
dataset_size: 19829757899.441696
---
# Args
```python
{'base_model': 'mistralai/Mistral-7B-v0.1',
'check_length_correctness': True,
'debug': False,
'hf_entity': 'vwxyzjn',
'params': TaskQueryHParams(length=3000,
format_str='SUBREDDIT: r/{subreddit}\n'
'\n'
'TITLE: {title}\n'
'\n'
'POST: {post}\n'
'\n'
'TL;DR:',
truncate_field='post',
truncate_text='\n',
padding='pad_token',
pad_token=[32000],
pad_side='left',
max_query_length=3000,
max_sft_query_response_length=4000,
max_sft_response_length=1500,
max_rm_query_response_length=4500,
max_rm_response_length=1500),
'push_to_hub': True}
```
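These hyperparameters describe a query-building recipe: fill the format string, drop trailing segments of the `post` field at `truncate_text` boundaries until the query fits within `length` tokens, then left-pad with `pad_token`. Below is a rough sketch of that logic, with a toy whitespace tokenizer standing in for the real one; this illustrates the parameters above and is not the actual preprocessing code:
```python
def build_query(fields, tokenize, length=3000, truncate_field="post",
                truncate_text="\n", pad_token=32000):
    """Sketch of the TaskQueryHParams recipe above (not the actual code)."""
    format_str = "SUBREDDIT: r/{subreddit}\n\nTITLE: {title}\n\nPOST: {post}\n\nTL;DR:"
    tokens = tokenize(format_str.format(**fields))
    # Truncate the post at truncate_text boundaries until the query fits.
    while len(tokens) > length and truncate_text in fields[truncate_field]:
        fields[truncate_field] = fields[truncate_field].rsplit(truncate_text, 1)[0]
        tokens = tokenize(format_str.format(**fields))
    # Left-pad to the fixed query length.
    return [pad_token] * max(0, length - len(tokens)) + tokens

toy = {"subreddit": "test", "title": "Hello", "post": "line one\nline two"}
print(len(build_query(dict(toy), str.split, length=16)))  # -> 16
```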
|
maxmyn/wholesome_greentext_110k | ---
language:
- en
license: other
size_categories:
- 100K<n<1M
task_categories:
- text-generation
pretty_name: 'Short Wholesome 4chan-style Greentext '
dataset_info:
features:
- name: greentexts
dtype: string
- name: __index_level_0__
dtype: int64
splits:
- name: train
num_bytes: 15193164
num_examples: 111320
download_size: 9449169
dataset_size: 15193164
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
tags:
- casual
- internet-culture
---
License: from my side, you can do whatever you want. However, parts of this data were generated via OpenAI's ChatGPT (using GPT-4 and GPT-3.5 Instruct) as well as GPT-3.5 via their API.
Their terms prohibit the development of competing models. I did not bother to read the terms further. Use at your own risk. Have fun :) |
Seanxh/twitter_dataset_1713193752 | ---
dataset_info:
features:
- name: id
dtype: string
- name: tweet_content
dtype: string
- name: user_name
dtype: string
- name: user_id
dtype: string
- name: created_at
dtype: string
- name: url
dtype: string
- name: favourite_count
dtype: int64
- name: scraped_at
dtype: string
- name: image_urls
dtype: string
splits:
- name: train
num_bytes: 62566
num_examples: 146
download_size: 26873
dataset_size: 62566
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
nlplabtdtu/ppl_100k_with_embeddings | ---
dataset_info:
features:
- name: text
dtype: string
- name: embedding
sequence: float64
splits:
- name: train
num_bytes: 2238113067
num_examples: 100000
download_size: 1411158379
dataset_size: 2238113067
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
jed351/rthk_news | ---
language:
- zh
---
### RTHK News Dataset
[RTHK](https://www.rthk.hk/) is a public broadcasting service under the Hong Kong Government, according to [Wikipedia](https://en.wikipedia.org/wiki/RTHK).
At the moment this dataset is obtained by exporting messages from their [Telegram channel](https://t.me/rthk_new_c),
which contains news since April 2018.
I will update this dataset with more data in the future. |
We-Want-GPU/yi-ko-DPO-noprompt-dataset | ---
dataset_info:
features:
- name: id
dtype: string
- name: prompt
dtype: string
- name: chosen
dtype: string
- name: rejected
dtype: string
splits:
- name: train
num_bytes: 7250081
num_examples: 3826
download_size: 3600412
dataset_size: 7250081
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
seonglae/wikipedia_token | ---
dataset_info:
config_name: gpt-4
features:
- name: id
dtype: string
- name: title
dtype: string
- name: text
dtype: string
- name: token_length
dtype: int64
- name: text_length
dtype: int64
splits:
- name: train
num_bytes: 19998333901
num_examples: 6458670
download_size: 11604627673
dataset_size: 19998333901
configs:
- config_name: gpt-4
data_files:
- split: train
path: gpt-4/train-*
---
# Dataset Card for "wikipedia_token"
```ts
Token count {
'~1024': 5320881,
'1024~2048': 693911,
'2048~4096': 300935,
'4096~8192': 106221,
'8192~16384': 30611,
'16384~32768': 4812,
'32768~65536': 1253,
'65536~128000': 46,
'128000~': 0
}
Text count {
'0~1024': 2751539,
'1024~2048': 1310778,
'2048~4096': 1179150,
'4096~8192': 722101,
'8192~16384': 329062,
'16384~32768': 121237,
'32768~65536': 36894,
'65536~': 7909
}
Token percent {
'~1024': '82.38%',
'1024~2048': '10.74%',
'2048~4096': '4.66%',
'4096~8192': '1.64%',
'8192~16384': '0.47%',
'16384~32768': '0.07%',
'32768~65536': '0.02%',
'65536~128000': '0.00%',
'128000~': '0.00%'
}
Text percent {
'0~1024': '42.60%',
'1024~2048': '20.29%',
'2048~4096': '18.26%',
'4096~8192': '11.18%',
'8192~16384': '5.09%',
'16384~32768': '1.88%',
'32768~65536': '0.57%',
'65536~': '0.12%'
}
``` |
unum-cloud/ann-arxiv-2m | ---
license: apache-2.0
---
# 2M Title-Abstract Arxiv Pairs
- `title_abstract.tsv` data from [Cornell University Arxiv Dataset](https://www.kaggle.com/Cornell-University/arxiv), preprocessed and converted to TSV.
- `title.e5-base-v2.fbin` is a binary file with [e5-base-v2](https://huggingface.co/intfloat/e5-base-v2) title embeddings.
- `abstract.e5-base-v2.fbin` is a binary file with [e5-base-v2](https://huggingface.co/intfloat/e5-base-v2) abstract embeddings (a reading sketch follows below).
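A hedged reading sketch for the `.fbin` files, assuming the common layout of two little-endian int32 header values (vector count, dimension) followed by a float32 matrix; the header convention is an assumption, and e5-base-v2 embeddings would be 768-dimensional:
```python
import numpy as np

def read_fbin(path: str) -> np.ndarray:
    """Read an .fbin file, assuming an int32 [count, dim] header."""
    with open(path, "rb") as f:
        count, dim = np.fromfile(f, dtype=np.int32, count=2)
        vectors = np.fromfile(f, dtype=np.float32, count=count * dim)
    return vectors.reshape(count, dim)

titles = read_fbin("title.e5-base-v2.fbin")
print(titles.shape)  # expected (num_titles, 768) for e5-base-v2
```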
|
open-llm-leaderboard/details_DrNicefellow__Mistral-3-from-Mixtral-8x7B-v0.1 | ---
pretty_name: Evaluation run of DrNicefellow/Mistral-3-from-Mixtral-8x7B-v0.1
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [DrNicefellow/Mistral-3-from-Mixtral-8x7B-v0.1](https://huggingface.co/DrNicefellow/Mistral-3-from-Mixtral-8x7B-v0.1)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_DrNicefellow__Mistral-3-from-Mixtral-8x7B-v0.1\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-04-15T14:28:15.409639](https://huggingface.co/datasets/open-llm-leaderboard/details_DrNicefellow__Mistral-3-from-Mixtral-8x7B-v0.1/blob/main/results_2024-04-15T14-28-15.409639.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.25665938109738223,\n\
\ \"acc_stderr\": 0.03078324218540209,\n \"acc_norm\": 0.2580411998975106,\n\
\ \"acc_norm_stderr\": 0.03160481806166394,\n \"mc1\": 0.23745410036719705,\n\
\ \"mc1_stderr\": 0.01489627744104185,\n \"mc2\": 0.4819046042054843,\n\
\ \"mc2_stderr\": 0.016210606003522837\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.23464163822525597,\n \"acc_stderr\": 0.012383873560768673,\n\
\ \"acc_norm\": 0.2935153583617747,\n \"acc_norm_stderr\": 0.013307250444941129\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.2593108942441745,\n\
\ \"acc_stderr\": 0.004373608212561027,\n \"acc_norm\": 0.2658832901812388,\n\
\ \"acc_norm_stderr\": 0.0044089948686501\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.24,\n \"acc_stderr\": 0.04292346959909284,\n \
\ \"acc_norm\": 0.24,\n \"acc_norm_stderr\": 0.04292346959909284\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.34074074074074073,\n\
\ \"acc_stderr\": 0.04094376269996793,\n \"acc_norm\": 0.34074074074074073,\n\
\ \"acc_norm_stderr\": 0.04094376269996793\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.17763157894736842,\n \"acc_stderr\": 0.031103182383123398,\n\
\ \"acc_norm\": 0.17763157894736842,\n \"acc_norm_stderr\": 0.031103182383123398\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.21,\n\
\ \"acc_stderr\": 0.040936018074033256,\n \"acc_norm\": 0.21,\n \
\ \"acc_norm_stderr\": 0.040936018074033256\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.23773584905660378,\n \"acc_stderr\": 0.026199808807561915,\n\
\ \"acc_norm\": 0.23773584905660378,\n \"acc_norm_stderr\": 0.026199808807561915\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.2013888888888889,\n\
\ \"acc_stderr\": 0.0335364746971384,\n \"acc_norm\": 0.2013888888888889,\n\
\ \"acc_norm_stderr\": 0.0335364746971384\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.19,\n \"acc_stderr\": 0.039427724440366234,\n \
\ \"acc_norm\": 0.19,\n \"acc_norm_stderr\": 0.039427724440366234\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"\
acc\": 0.33,\n \"acc_stderr\": 0.04725815626252604,\n \"acc_norm\"\
: 0.33,\n \"acc_norm_stderr\": 0.04725815626252604\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.25,\n \"acc_stderr\": 0.04351941398892446,\n \
\ \"acc_norm\": 0.25,\n \"acc_norm_stderr\": 0.04351941398892446\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.23121387283236994,\n\
\ \"acc_stderr\": 0.032147373020294696,\n \"acc_norm\": 0.23121387283236994,\n\
\ \"acc_norm_stderr\": 0.032147373020294696\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.21568627450980393,\n \"acc_stderr\": 0.04092563958237654,\n\
\ \"acc_norm\": 0.21568627450980393,\n \"acc_norm_stderr\": 0.04092563958237654\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.25,\n \"acc_stderr\": 0.04351941398892446,\n \"acc_norm\": 0.25,\n\
\ \"acc_norm_stderr\": 0.04351941398892446\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.20851063829787234,\n \"acc_stderr\": 0.02655698211783874,\n\
\ \"acc_norm\": 0.20851063829787234,\n \"acc_norm_stderr\": 0.02655698211783874\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.24561403508771928,\n\
\ \"acc_stderr\": 0.04049339297748141,\n \"acc_norm\": 0.24561403508771928,\n\
\ \"acc_norm_stderr\": 0.04049339297748141\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.296551724137931,\n \"acc_stderr\": 0.03806142687309993,\n\
\ \"acc_norm\": 0.296551724137931,\n \"acc_norm_stderr\": 0.03806142687309993\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.25396825396825395,\n \"acc_stderr\": 0.022418042891113942,\n \"\
acc_norm\": 0.25396825396825395,\n \"acc_norm_stderr\": 0.022418042891113942\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.15079365079365079,\n\
\ \"acc_stderr\": 0.03200686497287392,\n \"acc_norm\": 0.15079365079365079,\n\
\ \"acc_norm_stderr\": 0.03200686497287392\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.32,\n \"acc_stderr\": 0.046882617226215034,\n \
\ \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.046882617226215034\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\"\
: 0.3064516129032258,\n \"acc_stderr\": 0.026226485652553873,\n \"\
acc_norm\": 0.3064516129032258,\n \"acc_norm_stderr\": 0.026226485652553873\n\
\ },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\"\
: 0.2857142857142857,\n \"acc_stderr\": 0.031785297106427496,\n \"\
acc_norm\": 0.2857142857142857,\n \"acc_norm_stderr\": 0.031785297106427496\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.33,\n \"acc_stderr\": 0.047258156262526045,\n \"acc_norm\"\
: 0.33,\n \"acc_norm_stderr\": 0.047258156262526045\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.20606060606060606,\n \"acc_stderr\": 0.03158415324047709,\n\
\ \"acc_norm\": 0.20606060606060606,\n \"acc_norm_stderr\": 0.03158415324047709\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.2222222222222222,\n \"acc_stderr\": 0.02962022787479048,\n \"\
acc_norm\": 0.2222222222222222,\n \"acc_norm_stderr\": 0.02962022787479048\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.25906735751295334,\n \"acc_stderr\": 0.03161877917935409,\n\
\ \"acc_norm\": 0.25906735751295334,\n \"acc_norm_stderr\": 0.03161877917935409\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.24615384615384617,\n \"acc_stderr\": 0.021840866990423077,\n\
\ \"acc_norm\": 0.24615384615384617,\n \"acc_norm_stderr\": 0.021840866990423077\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.26296296296296295,\n \"acc_stderr\": 0.02684205787383371,\n \
\ \"acc_norm\": 0.26296296296296295,\n \"acc_norm_stderr\": 0.02684205787383371\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.2857142857142857,\n \"acc_stderr\": 0.029344572500634335,\n\
\ \"acc_norm\": 0.2857142857142857,\n \"acc_norm_stderr\": 0.029344572500634335\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.33112582781456956,\n \"acc_stderr\": 0.038425817186598696,\n \"\
acc_norm\": 0.33112582781456956,\n \"acc_norm_stderr\": 0.038425817186598696\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.21467889908256882,\n \"acc_stderr\": 0.017604304149256483,\n \"\
acc_norm\": 0.21467889908256882,\n \"acc_norm_stderr\": 0.017604304149256483\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.4722222222222222,\n \"acc_stderr\": 0.0340470532865388,\n \"acc_norm\"\
: 0.4722222222222222,\n \"acc_norm_stderr\": 0.0340470532865388\n },\n\
\ \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.2549019607843137,\n\
\ \"acc_stderr\": 0.030587591351604246,\n \"acc_norm\": 0.2549019607843137,\n\
\ \"acc_norm_stderr\": 0.030587591351604246\n },\n \"harness|hendrycksTest-high_school_world_history|5\"\
: {\n \"acc\": 0.26582278481012656,\n \"acc_stderr\": 0.028756799629658342,\n\
\ \"acc_norm\": 0.26582278481012656,\n \"acc_norm_stderr\": 0.028756799629658342\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.2914798206278027,\n\
\ \"acc_stderr\": 0.03050028317654591,\n \"acc_norm\": 0.2914798206278027,\n\
\ \"acc_norm_stderr\": 0.03050028317654591\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.26717557251908397,\n \"acc_stderr\": 0.038808483010823944,\n\
\ \"acc_norm\": 0.26717557251908397,\n \"acc_norm_stderr\": 0.038808483010823944\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.21487603305785125,\n \"acc_stderr\": 0.03749492448709699,\n \"\
acc_norm\": 0.21487603305785125,\n \"acc_norm_stderr\": 0.03749492448709699\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.2777777777777778,\n\
\ \"acc_stderr\": 0.043300437496507437,\n \"acc_norm\": 0.2777777777777778,\n\
\ \"acc_norm_stderr\": 0.043300437496507437\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.3006134969325153,\n \"acc_stderr\": 0.03602511318806771,\n\
\ \"acc_norm\": 0.3006134969325153,\n \"acc_norm_stderr\": 0.03602511318806771\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.22321428571428573,\n\
\ \"acc_stderr\": 0.039523019677025116,\n \"acc_norm\": 0.22321428571428573,\n\
\ \"acc_norm_stderr\": 0.039523019677025116\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.1650485436893204,\n \"acc_stderr\": 0.036756688322331886,\n\
\ \"acc_norm\": 0.1650485436893204,\n \"acc_norm_stderr\": 0.036756688322331886\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.19658119658119658,\n\
\ \"acc_stderr\": 0.026035386098951292,\n \"acc_norm\": 0.19658119658119658,\n\
\ \"acc_norm_stderr\": 0.026035386098951292\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \
\ \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.2707535121328225,\n\
\ \"acc_stderr\": 0.015889888362560486,\n \"acc_norm\": 0.2707535121328225,\n\
\ \"acc_norm_stderr\": 0.015889888362560486\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.2543352601156069,\n \"acc_stderr\": 0.02344582627654555,\n\
\ \"acc_norm\": 0.2543352601156069,\n \"acc_norm_stderr\": 0.02344582627654555\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.24022346368715083,\n\
\ \"acc_stderr\": 0.014288343803925293,\n \"acc_norm\": 0.24022346368715083,\n\
\ \"acc_norm_stderr\": 0.014288343803925293\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.238562091503268,\n \"acc_stderr\": 0.024404394928087873,\n\
\ \"acc_norm\": 0.238562091503268,\n \"acc_norm_stderr\": 0.024404394928087873\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.2797427652733119,\n\
\ \"acc_stderr\": 0.02549425935069489,\n \"acc_norm\": 0.2797427652733119,\n\
\ \"acc_norm_stderr\": 0.02549425935069489\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.21604938271604937,\n \"acc_stderr\": 0.022899162918445806,\n\
\ \"acc_norm\": 0.21604938271604937,\n \"acc_norm_stderr\": 0.022899162918445806\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.19858156028368795,\n \"acc_stderr\": 0.02379830163794212,\n \
\ \"acc_norm\": 0.19858156028368795,\n \"acc_norm_stderr\": 0.02379830163794212\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.24641460234680573,\n\
\ \"acc_stderr\": 0.011005971399927235,\n \"acc_norm\": 0.24641460234680573,\n\
\ \"acc_norm_stderr\": 0.011005971399927235\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.4485294117647059,\n \"acc_stderr\": 0.030211479609121593,\n\
\ \"acc_norm\": 0.4485294117647059,\n \"acc_norm_stderr\": 0.030211479609121593\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.25163398692810457,\n \"acc_stderr\": 0.01755581809132226,\n \
\ \"acc_norm\": 0.25163398692810457,\n \"acc_norm_stderr\": 0.01755581809132226\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.21818181818181817,\n\
\ \"acc_stderr\": 0.03955932861795833,\n \"acc_norm\": 0.21818181818181817,\n\
\ \"acc_norm_stderr\": 0.03955932861795833\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.22448979591836735,\n \"acc_stderr\": 0.026711430555538408,\n\
\ \"acc_norm\": 0.22448979591836735,\n \"acc_norm_stderr\": 0.026711430555538408\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.22885572139303484,\n\
\ \"acc_stderr\": 0.029705284056772436,\n \"acc_norm\": 0.22885572139303484,\n\
\ \"acc_norm_stderr\": 0.029705284056772436\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.25,\n \"acc_stderr\": 0.04351941398892446,\n \
\ \"acc_norm\": 0.25,\n \"acc_norm_stderr\": 0.04351941398892446\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.25903614457831325,\n\
\ \"acc_stderr\": 0.03410646614071856,\n \"acc_norm\": 0.25903614457831325,\n\
\ \"acc_norm_stderr\": 0.03410646614071856\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.30994152046783624,\n \"acc_stderr\": 0.035469769593931624,\n\
\ \"acc_norm\": 0.30994152046783624,\n \"acc_norm_stderr\": 0.035469769593931624\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.23745410036719705,\n\
\ \"mc1_stderr\": 0.01489627744104185,\n \"mc2\": 0.4819046042054843,\n\
\ \"mc2_stderr\": 0.016210606003522837\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.4972375690607735,\n \"acc_stderr\": 0.014052271211616448\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.0,\n \"acc_stderr\"\
: 0.0\n }\n}\n```"
repo_url: https://huggingface.co/DrNicefellow/Mistral-3-from-Mixtral-8x7B-v0.1
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_04_15T14_28_15.409639
path:
- '**/details_harness|arc:challenge|25_2024-04-15T14-28-15.409639.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-04-15T14-28-15.409639.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_04_15T14_28_15.409639
path:
- '**/details_harness|gsm8k|5_2024-04-15T14-28-15.409639.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-04-15T14-28-15.409639.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_04_15T14_28_15.409639
path:
- '**/details_harness|hellaswag|10_2024-04-15T14-28-15.409639.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-04-15T14-28-15.409639.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_04_15T14_28_15.409639
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-15T14-28-15.409639.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-04-15T14-28-15.409639.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-04-15T14-28-15.409639.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-04-15T14-28-15.409639.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-15T14-28-15.409639.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-04-15T14-28-15.409639.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-15T14-28-15.409639.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-15T14-28-15.409639.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-15T14-28-15.409639.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-04-15T14-28-15.409639.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-04-15T14-28-15.409639.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-04-15T14-28-15.409639.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-15T14-28-15.409639.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-04-15T14-28-15.409639.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-15T14-28-15.409639.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-15T14-28-15.409639.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-04-15T14-28-15.409639.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-04-15T14-28-15.409639.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-15T14-28-15.409639.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-15T14-28-15.409639.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-15T14-28-15.409639.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-15T14-28-15.409639.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-15T14-28-15.409639.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-15T14-28-15.409639.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-15T14-28-15.409639.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-15T14-28-15.409639.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-15T14-28-15.409639.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-15T14-28-15.409639.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-15T14-28-15.409639.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-15T14-28-15.409639.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-15T14-28-15.409639.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-15T14-28-15.409639.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-04-15T14-28-15.409639.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-15T14-28-15.409639.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-04-15T14-28-15.409639.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-15T14-28-15.409639.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-15T14-28-15.409639.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-04-15T14-28-15.409639.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-04-15T14-28-15.409639.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-04-15T14-28-15.409639.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-15T14-28-15.409639.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-15T14-28-15.409639.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-15T14-28-15.409639.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-15T14-28-15.409639.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-04-15T14-28-15.409639.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-04-15T14-28-15.409639.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-04-15T14-28-15.409639.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-15T14-28-15.409639.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-04-15T14-28-15.409639.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-15T14-28-15.409639.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-15T14-28-15.409639.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-04-15T14-28-15.409639.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-04-15T14-28-15.409639.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-04-15T14-28-15.409639.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-15T14-28-15.409639.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-04-15T14-28-15.409639.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-04-15T14-28-15.409639.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-15T14-28-15.409639.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-04-15T14-28-15.409639.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-04-15T14-28-15.409639.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-04-15T14-28-15.409639.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-15T14-28-15.409639.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-04-15T14-28-15.409639.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-15T14-28-15.409639.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-15T14-28-15.409639.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-15T14-28-15.409639.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-04-15T14-28-15.409639.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-04-15T14-28-15.409639.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-04-15T14-28-15.409639.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-15T14-28-15.409639.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-04-15T14-28-15.409639.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-15T14-28-15.409639.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-15T14-28-15.409639.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-04-15T14-28-15.409639.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-04-15T14-28-15.409639.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-15T14-28-15.409639.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-15T14-28-15.409639.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-15T14-28-15.409639.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-15T14-28-15.409639.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-15T14-28-15.409639.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-15T14-28-15.409639.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-15T14-28-15.409639.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-15T14-28-15.409639.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-15T14-28-15.409639.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-15T14-28-15.409639.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-15T14-28-15.409639.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-15T14-28-15.409639.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-15T14-28-15.409639.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-15T14-28-15.409639.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-04-15T14-28-15.409639.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-15T14-28-15.409639.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-04-15T14-28-15.409639.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-15T14-28-15.409639.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-15T14-28-15.409639.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-04-15T14-28-15.409639.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-04-15T14-28-15.409639.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-04-15T14-28-15.409639.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-15T14-28-15.409639.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-15T14-28-15.409639.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-15T14-28-15.409639.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-15T14-28-15.409639.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-04-15T14-28-15.409639.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-04-15T14-28-15.409639.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-04-15T14-28-15.409639.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-15T14-28-15.409639.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-04-15T14-28-15.409639.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-15T14-28-15.409639.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-15T14-28-15.409639.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-04-15T14-28-15.409639.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-04-15T14-28-15.409639.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-04-15T14-28-15.409639.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-15T14-28-15.409639.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-04-15T14-28-15.409639.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-04-15T14-28-15.409639.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_04_15T14_28_15.409639
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-15T14-28-15.409639.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-15T14-28-15.409639.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_04_15T14_28_15.409639
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-04-15T14-28-15.409639.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-04-15T14-28-15.409639.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_04_15T14_28_15.409639
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-04-15T14-28-15.409639.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-04-15T14-28-15.409639.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_04_15T14_28_15.409639
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-04-15T14-28-15.409639.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-04-15T14-28-15.409639.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_04_15T14_28_15.409639
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-15T14-28-15.409639.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-15T14-28-15.409639.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_04_15T14_28_15.409639
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-04-15T14-28-15.409639.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-04-15T14-28-15.409639.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_04_15T14_28_15.409639
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-15T14-28-15.409639.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-15T14-28-15.409639.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_04_15T14_28_15.409639
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-15T14-28-15.409639.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-15T14-28-15.409639.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_04_15T14_28_15.409639
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-15T14-28-15.409639.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-15T14-28-15.409639.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_04_15T14_28_15.409639
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-04-15T14-28-15.409639.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-04-15T14-28-15.409639.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_04_15T14_28_15.409639
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-04-15T14-28-15.409639.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-04-15T14-28-15.409639.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_04_15T14_28_15.409639
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-04-15T14-28-15.409639.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-04-15T14-28-15.409639.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_04_15T14_28_15.409639
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-15T14-28-15.409639.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-15T14-28-15.409639.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_04_15T14_28_15.409639
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-04-15T14-28-15.409639.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-04-15T14-28-15.409639.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_04_15T14_28_15.409639
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-15T14-28-15.409639.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-15T14-28-15.409639.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_04_15T14_28_15.409639
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-15T14-28-15.409639.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-15T14-28-15.409639.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_04_15T14_28_15.409639
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-04-15T14-28-15.409639.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-04-15T14-28-15.409639.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_04_15T14_28_15.409639
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-04-15T14-28-15.409639.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-04-15T14-28-15.409639.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_04_15T14_28_15.409639
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-15T14-28-15.409639.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-15T14-28-15.409639.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_04_15T14_28_15.409639
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-15T14-28-15.409639.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-15T14-28-15.409639.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_04_15T14_28_15.409639
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-15T14-28-15.409639.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-15T14-28-15.409639.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_04_15T14_28_15.409639
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-15T14-28-15.409639.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-15T14-28-15.409639.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_04_15T14_28_15.409639
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-15T14-28-15.409639.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-15T14-28-15.409639.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_04_15T14_28_15.409639
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-15T14-28-15.409639.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-15T14-28-15.409639.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_04_15T14_28_15.409639
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-15T14-28-15.409639.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-15T14-28-15.409639.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_04_15T14_28_15.409639
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-15T14-28-15.409639.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-15T14-28-15.409639.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_04_15T14_28_15.409639
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-15T14-28-15.409639.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-15T14-28-15.409639.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_04_15T14_28_15.409639
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-15T14-28-15.409639.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-15T14-28-15.409639.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_04_15T14_28_15.409639
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-15T14-28-15.409639.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-15T14-28-15.409639.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_04_15T14_28_15.409639
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-15T14-28-15.409639.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-15T14-28-15.409639.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_04_15T14_28_15.409639
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-15T14-28-15.409639.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-15T14-28-15.409639.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_04_15T14_28_15.409639
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-15T14-28-15.409639.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-15T14-28-15.409639.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_04_15T14_28_15.409639
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-04-15T14-28-15.409639.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-04-15T14-28-15.409639.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_04_15T14_28_15.409639
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-15T14-28-15.409639.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-15T14-28-15.409639.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_04_15T14_28_15.409639
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-04-15T14-28-15.409639.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-04-15T14-28-15.409639.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_04_15T14_28_15.409639
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-15T14-28-15.409639.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-15T14-28-15.409639.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_04_15T14_28_15.409639
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-15T14-28-15.409639.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-15T14-28-15.409639.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_04_15T14_28_15.409639
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-04-15T14-28-15.409639.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-04-15T14-28-15.409639.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_04_15T14_28_15.409639
path:
- '**/details_harness|hendrycksTest-management|5_2024-04-15T14-28-15.409639.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-04-15T14-28-15.409639.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_04_15T14_28_15.409639
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-04-15T14-28-15.409639.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-04-15T14-28-15.409639.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_04_15T14_28_15.409639
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-15T14-28-15.409639.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-15T14-28-15.409639.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_04_15T14_28_15.409639
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-15T14-28-15.409639.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-15T14-28-15.409639.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_04_15T14_28_15.409639
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-15T14-28-15.409639.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-15T14-28-15.409639.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_04_15T14_28_15.409639
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-15T14-28-15.409639.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-15T14-28-15.409639.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_04_15T14_28_15.409639
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-04-15T14-28-15.409639.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-04-15T14-28-15.409639.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_04_15T14_28_15.409639
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-04-15T14-28-15.409639.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-04-15T14-28-15.409639.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_04_15T14_28_15.409639
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-04-15T14-28-15.409639.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-04-15T14-28-15.409639.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_04_15T14_28_15.409639
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-15T14-28-15.409639.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-15T14-28-15.409639.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_04_15T14_28_15.409639
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-04-15T14-28-15.409639.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-04-15T14-28-15.409639.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_04_15T14_28_15.409639
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-15T14-28-15.409639.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-15T14-28-15.409639.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_04_15T14_28_15.409639
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-15T14-28-15.409639.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-15T14-28-15.409639.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_04_15T14_28_15.409639
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-04-15T14-28-15.409639.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-04-15T14-28-15.409639.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_04_15T14_28_15.409639
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-04-15T14-28-15.409639.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-04-15T14-28-15.409639.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_04_15T14_28_15.409639
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-04-15T14-28-15.409639.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-04-15T14-28-15.409639.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_04_15T14_28_15.409639
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-15T14-28-15.409639.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-15T14-28-15.409639.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_04_15T14_28_15.409639
path:
- '**/details_harness|hendrycksTest-virology|5_2024-04-15T14-28-15.409639.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-04-15T14-28-15.409639.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_04_15T14_28_15.409639
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-04-15T14-28-15.409639.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-04-15T14-28-15.409639.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_04_15T14_28_15.409639
path:
- '**/details_harness|truthfulqa:mc|0_2024-04-15T14-28-15.409639.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-04-15T14-28-15.409639.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_04_15T14_28_15.409639
path:
- '**/details_harness|winogrande|5_2024-04-15T14-28-15.409639.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-04-15T14-28-15.409639.parquet'
- config_name: results
data_files:
- split: 2024_04_15T14_28_15.409639
path:
- results_2024-04-15T14-28-15.409639.parquet
- split: latest
path:
- results_2024-04-15T14-28-15.409639.parquet
---
# Dataset Card for Evaluation run of DrNicefellow/Mistral-3-from-Mixtral-8x7B-v0.1
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [DrNicefellow/Mistral-3-from-Mixtral-8x7B-v0.1](https://huggingface.co/DrNicefellow/Mistral-3-from-Mixtral-8x7B-v0.1) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_DrNicefellow__Mistral-3-from-Mixtral-8x7B-v0.1",
"harness_winogrande_5",
split="train")
```
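Each per-task configuration also exposes a "latest" split pointing to the most recent evaluation, so the snippet above can equally be written as follows (a minimal variation; the config and split names come from the configuration list above):
```python
from datasets import load_dataset

# Same task as above, but addressed through the "latest" split.
data = load_dataset(
    "open-llm-leaderboard/details_DrNicefellow__Mistral-3-from-Mixtral-8x7B-v0.1",
    "harness_winogrande_5",
    split="latest",
)
```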
## Latest results
These are the [latest results from run 2024-04-15T14:28:15.409639](https://huggingface.co/datasets/open-llm-leaderboard/details_DrNicefellow__Mistral-3-from-Mixtral-8x7B-v0.1/blob/main/results_2024-04-15T14-28-15.409639.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each one in the results and in the "latest" split for each eval):
```python
{
"all": {
"acc": 0.25665938109738223,
"acc_stderr": 0.03078324218540209,
"acc_norm": 0.2580411998975106,
"acc_norm_stderr": 0.03160481806166394,
"mc1": 0.23745410036719705,
"mc1_stderr": 0.01489627744104185,
"mc2": 0.4819046042054843,
"mc2_stderr": 0.016210606003522837
},
"harness|arc:challenge|25": {
"acc": 0.23464163822525597,
"acc_stderr": 0.012383873560768673,
"acc_norm": 0.2935153583617747,
"acc_norm_stderr": 0.013307250444941129
},
"harness|hellaswag|10": {
"acc": 0.2593108942441745,
"acc_stderr": 0.004373608212561027,
"acc_norm": 0.2658832901812388,
"acc_norm_stderr": 0.0044089948686501
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.24,
"acc_stderr": 0.04292346959909284,
"acc_norm": 0.24,
"acc_norm_stderr": 0.04292346959909284
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.34074074074074073,
"acc_stderr": 0.04094376269996793,
"acc_norm": 0.34074074074074073,
"acc_norm_stderr": 0.04094376269996793
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.17763157894736842,
"acc_stderr": 0.031103182383123398,
"acc_norm": 0.17763157894736842,
"acc_norm_stderr": 0.031103182383123398
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.21,
"acc_stderr": 0.040936018074033256,
"acc_norm": 0.21,
"acc_norm_stderr": 0.040936018074033256
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.23773584905660378,
"acc_stderr": 0.026199808807561915,
"acc_norm": 0.23773584905660378,
"acc_norm_stderr": 0.026199808807561915
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.2013888888888889,
"acc_stderr": 0.0335364746971384,
"acc_norm": 0.2013888888888889,
"acc_norm_stderr": 0.0335364746971384
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.19,
"acc_stderr": 0.039427724440366234,
"acc_norm": 0.19,
"acc_norm_stderr": 0.039427724440366234
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.33,
"acc_stderr": 0.04725815626252604,
"acc_norm": 0.33,
"acc_norm_stderr": 0.04725815626252604
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.25,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.25,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.23121387283236994,
"acc_stderr": 0.032147373020294696,
"acc_norm": 0.23121387283236994,
"acc_norm_stderr": 0.032147373020294696
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.21568627450980393,
"acc_stderr": 0.04092563958237654,
"acc_norm": 0.21568627450980393,
"acc_norm_stderr": 0.04092563958237654
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.25,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.25,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.20851063829787234,
"acc_stderr": 0.02655698211783874,
"acc_norm": 0.20851063829787234,
"acc_norm_stderr": 0.02655698211783874
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.24561403508771928,
"acc_stderr": 0.04049339297748141,
"acc_norm": 0.24561403508771928,
"acc_norm_stderr": 0.04049339297748141
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.296551724137931,
"acc_stderr": 0.03806142687309993,
"acc_norm": 0.296551724137931,
"acc_norm_stderr": 0.03806142687309993
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.25396825396825395,
"acc_stderr": 0.022418042891113942,
"acc_norm": 0.25396825396825395,
"acc_norm_stderr": 0.022418042891113942
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.15079365079365079,
"acc_stderr": 0.03200686497287392,
"acc_norm": 0.15079365079365079,
"acc_norm_stderr": 0.03200686497287392
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.32,
"acc_stderr": 0.046882617226215034,
"acc_norm": 0.32,
"acc_norm_stderr": 0.046882617226215034
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.3064516129032258,
"acc_stderr": 0.026226485652553873,
"acc_norm": 0.3064516129032258,
"acc_norm_stderr": 0.026226485652553873
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.2857142857142857,
"acc_stderr": 0.031785297106427496,
"acc_norm": 0.2857142857142857,
"acc_norm_stderr": 0.031785297106427496
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.33,
"acc_stderr": 0.047258156262526045,
"acc_norm": 0.33,
"acc_norm_stderr": 0.047258156262526045
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.20606060606060606,
"acc_stderr": 0.03158415324047709,
"acc_norm": 0.20606060606060606,
"acc_norm_stderr": 0.03158415324047709
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.2222222222222222,
"acc_stderr": 0.02962022787479048,
"acc_norm": 0.2222222222222222,
"acc_norm_stderr": 0.02962022787479048
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.25906735751295334,
"acc_stderr": 0.03161877917935409,
"acc_norm": 0.25906735751295334,
"acc_norm_stderr": 0.03161877917935409
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.24615384615384617,
"acc_stderr": 0.021840866990423077,
"acc_norm": 0.24615384615384617,
"acc_norm_stderr": 0.021840866990423077
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.26296296296296295,
"acc_stderr": 0.02684205787383371,
"acc_norm": 0.26296296296296295,
"acc_norm_stderr": 0.02684205787383371
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.2857142857142857,
"acc_stderr": 0.029344572500634335,
"acc_norm": 0.2857142857142857,
"acc_norm_stderr": 0.029344572500634335
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.33112582781456956,
"acc_stderr": 0.038425817186598696,
"acc_norm": 0.33112582781456956,
"acc_norm_stderr": 0.038425817186598696
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.21467889908256882,
"acc_stderr": 0.017604304149256483,
"acc_norm": 0.21467889908256882,
"acc_norm_stderr": 0.017604304149256483
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.4722222222222222,
"acc_stderr": 0.0340470532865388,
"acc_norm": 0.4722222222222222,
"acc_norm_stderr": 0.0340470532865388
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.2549019607843137,
"acc_stderr": 0.030587591351604246,
"acc_norm": 0.2549019607843137,
"acc_norm_stderr": 0.030587591351604246
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.26582278481012656,
"acc_stderr": 0.028756799629658342,
"acc_norm": 0.26582278481012656,
"acc_norm_stderr": 0.028756799629658342
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.2914798206278027,
"acc_stderr": 0.03050028317654591,
"acc_norm": 0.2914798206278027,
"acc_norm_stderr": 0.03050028317654591
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.26717557251908397,
"acc_stderr": 0.038808483010823944,
"acc_norm": 0.26717557251908397,
"acc_norm_stderr": 0.038808483010823944
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.21487603305785125,
"acc_stderr": 0.03749492448709699,
"acc_norm": 0.21487603305785125,
"acc_norm_stderr": 0.03749492448709699
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.2777777777777778,
"acc_stderr": 0.043300437496507437,
"acc_norm": 0.2777777777777778,
"acc_norm_stderr": 0.043300437496507437
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.3006134969325153,
"acc_stderr": 0.03602511318806771,
"acc_norm": 0.3006134969325153,
"acc_norm_stderr": 0.03602511318806771
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.22321428571428573,
"acc_stderr": 0.039523019677025116,
"acc_norm": 0.22321428571428573,
"acc_norm_stderr": 0.039523019677025116
},
"harness|hendrycksTest-management|5": {
"acc": 0.1650485436893204,
"acc_stderr": 0.036756688322331886,
"acc_norm": 0.1650485436893204,
"acc_norm_stderr": 0.036756688322331886
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.19658119658119658,
"acc_stderr": 0.026035386098951292,
"acc_norm": 0.19658119658119658,
"acc_norm_stderr": 0.026035386098951292
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.2707535121328225,
"acc_stderr": 0.015889888362560486,
"acc_norm": 0.2707535121328225,
"acc_norm_stderr": 0.015889888362560486
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.2543352601156069,
"acc_stderr": 0.02344582627654555,
"acc_norm": 0.2543352601156069,
"acc_norm_stderr": 0.02344582627654555
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.24022346368715083,
"acc_stderr": 0.014288343803925293,
"acc_norm": 0.24022346368715083,
"acc_norm_stderr": 0.014288343803925293
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.238562091503268,
"acc_stderr": 0.024404394928087873,
"acc_norm": 0.238562091503268,
"acc_norm_stderr": 0.024404394928087873
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.2797427652733119,
"acc_stderr": 0.02549425935069489,
"acc_norm": 0.2797427652733119,
"acc_norm_stderr": 0.02549425935069489
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.21604938271604937,
"acc_stderr": 0.022899162918445806,
"acc_norm": 0.21604938271604937,
"acc_norm_stderr": 0.022899162918445806
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.19858156028368795,
"acc_stderr": 0.02379830163794212,
"acc_norm": 0.19858156028368795,
"acc_norm_stderr": 0.02379830163794212
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.24641460234680573,
"acc_stderr": 0.011005971399927235,
"acc_norm": 0.24641460234680573,
"acc_norm_stderr": 0.011005971399927235
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.4485294117647059,
"acc_stderr": 0.030211479609121593,
"acc_norm": 0.4485294117647059,
"acc_norm_stderr": 0.030211479609121593
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.25163398692810457,
"acc_stderr": 0.01755581809132226,
"acc_norm": 0.25163398692810457,
"acc_norm_stderr": 0.01755581809132226
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.21818181818181817,
"acc_stderr": 0.03955932861795833,
"acc_norm": 0.21818181818181817,
"acc_norm_stderr": 0.03955932861795833
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.22448979591836735,
"acc_stderr": 0.026711430555538408,
"acc_norm": 0.22448979591836735,
"acc_norm_stderr": 0.026711430555538408
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.22885572139303484,
"acc_stderr": 0.029705284056772436,
"acc_norm": 0.22885572139303484,
"acc_norm_stderr": 0.029705284056772436
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.25,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.25,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-virology|5": {
"acc": 0.25903614457831325,
"acc_stderr": 0.03410646614071856,
"acc_norm": 0.25903614457831325,
"acc_norm_stderr": 0.03410646614071856
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.30994152046783624,
"acc_stderr": 0.035469769593931624,
"acc_norm": 0.30994152046783624,
"acc_norm_stderr": 0.035469769593931624
},
"harness|truthfulqa:mc|0": {
"mc1": 0.23745410036719705,
"mc1_stderr": 0.01489627744104185,
"mc2": 0.4819046042054843,
"mc2_stderr": 0.016210606003522837
},
"harness|winogrande|5": {
"acc": 0.4972375690607735,
"acc_stderr": 0.014052271211616448
},
"harness|gsm8k|5": {
"acc": 0.0,
"acc_stderr": 0.0
}
}
```
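The aggregated numbers above are also available programmatically through the "results" configuration (a minimal sketch following the same loading pattern; both the config name and its "latest" split appear in the configuration list above):
```python
from datasets import load_dataset

# Load the aggregated metrics for the most recent run.
results = load_dataset(
    "open-llm-leaderboard/details_DrNicefellow__Mistral-3-from-Mixtral-8x7B-v0.1",
    "results",
    split="latest",
)
```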
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
MaryamAlAli/Mixat_all_draft | ---
dataset_info:
features:
- name: audio
dtype:
audio:
sampling_rate: 16000
- name: transcription
dtype: string
- name: language
dtype: string
splits:
- name: train
num_bytes: 7778974960.076
num_examples: 5316
download_size: 8055800680
dataset_size: 7778974960.076
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "Mixat_All"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
sinarashidi/sentiment-analysis | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 44032051
num_examples: 128432
download_size: 19743452
dataset_size: 44032051
---
# Dataset Card for "sentiment-analysis"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
ovior/twitter_dataset_1713112140 | ---
dataset_info:
features:
- name: id
dtype: string
- name: tweet_content
dtype: string
- name: user_name
dtype: string
- name: user_id
dtype: string
- name: created_at
dtype: string
- name: url
dtype: string
- name: favourite_count
dtype: int64
- name: scraped_at
dtype: string
- name: image_urls
dtype: string
splits:
- name: train
num_bytes: 2322347
num_examples: 7151
download_size: 1322995
dataset_size: 2322347
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
flwrlabs/shakespeare | ---
license: bsd-2-clause
task_categories:
- text-generation
language:
- en
size_categories:
- 1M<n<10M
configs:
- config_name: default
data_files:
- split: train
path: "shakespeare.csv"
---
# Dataset Card for Shakespeare
This dataset is a part of the [LEAF](https://leaf.cmu.edu/) benchmark.
The Shakespeare dataset is built from [The Complete Works of William Shakespeare](https://www.gutenberg.org/ebooks/100) with the goal of next-character prediction.
## Dataset Details
### Dataset Description
Each sample comprises a text of 80 characters (x) and the next character (y).
- **Curated by:** [LEAF](https://leaf.cmu.edu/)
- **Language(s) (NLP):** English
- **License:** BSD 2-Clause License
### Dataset Sources
The code from the original repository was adapted to post the dataset here.
- **Repository:** https://github.com/TalwalkarLab/leaf
- **Paper:** https://arxiv.org/abs/1812.01097
## Uses
This dataset is intended to be used in Federated Learning settings.
A pair of a character and a play denotes a unique user in the federation.
### Direct Use
This dataset is designed to be used in FL settings. We recommend using [Flower Dataset](https://flower.ai/docs/datasets/) (flwr-datasets) and [Flower](https://flower.ai/docs/framework/) (flwr).
To partition the dataset, do the following.
1. Install the package.
```bash
pip install flwr-datasets
```
2. Use the HF Dataset under the hood in Flower Datasets.
```python
from flwr_datasets import FederatedDataset
from flwr_datasets.partitioner import NaturalIdPartitioner

# Partition the train split by character_id, so that each character+play pair
# becomes one client in the federation.
fds = FederatedDataset(
    dataset="flwrlabs/shakespeare",
    partitioners={"train": NaturalIdPartitioner(partition_by="character_id")}
)
# Load the data belonging to the first client.
partition = fds.load_partition(node_id=0)
```
## Dataset Structure
The dataset contains only a train split. In the paper, the split happens at each node only (there is no centralized dataset).
The dataset comprises the following columns:
* `character_id`: str - id denoting a pair of character + play (node in federated learning settings)
* `x`: str - text of 80 characters
* `y`: str - single character following the `x`
Please note that the data is temporal. Therefore, caution is needed when dividing it so as not to leak information from the train set.
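A quick way to check these fields (a minimal sketch, assuming the default config loads with the standard `datasets` API):
```python
from datasets import load_dataset

# Load the train split and inspect one sample's fields.
ds = load_dataset("flwrlabs/shakespeare", split="train")
example = ds[0]
print(example["character_id"])  # character+play identifier (the FL client id)
print(repr(example["x"]))       # 80-character context
print(repr(example["y"]))       # the single next character
```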
## Dataset Creation
### Curation Rationale
This dataset was created as a part of the [LEAF](https://leaf.cmu.edu/) benchmark.
### Source Data
[The Complete Works of William Shakespeare](https://www.gutenberg.org/ebooks/100)
#### Data Collection and Processing
For the preprocessing details, please refer to the original paper and the source code.
#### Who are the source data producers?
William Shakespeare
## Citation
When working on the LEAF benchmark, please cite the original paper. If you're using this dataset with Flower Datasets, you can cite Flower.
**BibTeX:**
```
@article{DBLP:journals/corr/abs-1812-01097,
author = {Sebastian Caldas and
Peter Wu and
Tian Li and
Jakub Kone{\v{c}}n{\'y} and
H. Brendan McMahan and
Virginia Smith and
Ameet Talwalkar},
title = {{LEAF:} {A} Benchmark for Federated Settings},
journal = {CoRR},
volume = {abs/1812.01097},
year = {2018},
url = {http://arxiv.org/abs/1812.01097},
eprinttype = {arXiv},
eprint = {1812.01097},
timestamp = {Wed, 23 Dec 2020 09:35:18 +0100},
biburl = {https://dblp.org/rec/journals/corr/abs-1812-01097.bib},
bibsource = {dblp computer science bibliography, https://dblp.org}
}
```
```
@article{DBLP:journals/corr/abs-2007-14390,
author = {Daniel J. Beutel and
Taner Topal and
Akhil Mathur and
Xinchi Qiu and
Titouan Parcollet and
Nicholas D. Lane},
title = {Flower: {A} Friendly Federated Learning Research Framework},
journal = {CoRR},
volume = {abs/2007.14390},
year = {2020},
url = {https://arxiv.org/abs/2007.14390},
eprinttype = {arXiv},
eprint = {2007.14390},
timestamp = {Mon, 03 Aug 2020 14:32:13 +0200},
biburl = {https://dblp.org/rec/journals/corr/abs-2007-14390.bib},
bibsource = {dblp computer science bibliography, https://dblp.org}
}
```
## Dataset Card Contact
In case of any doubts, please contact [Flower Labs](https://flower.ai/). |
kpriyanshu256/MultiTabQA-multitable_pretraining-train-v2-43500 | ---
dataset_info:
features:
- name: tables
sequence: string
- name: table_names
sequence: string
- name: query
dtype: string
- name: answer
dtype: string
- name: source
dtype: string
- name: target
dtype: string
- name: source_latex
dtype: string
- name: target_latex
dtype: string
- name: source_html
dtype: string
- name: target_html
dtype: string
- name: source_markdown
dtype: string
- name: target_markdown
dtype: string
splits:
- name: train
num_bytes: 6412283410
num_examples: 1000
download_size: 1260040862
dataset_size: 6412283410
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
kastan/stormfront | ---
dataset_info:
features:
- name: title
dtype: string
- name: content
dtype: string
splits:
- name: train
num_bytes: 7548306952
num_examples: 10458223
- name: test
num_bytes: 386917
num_examples: 791
download_size: 4688723070
dataset_size: 7548693869
---
# Dataset Card for "stormfront"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
cloneofsimo/GeneratedImageOfCelebs | ---
license: bigscience-openrail-m
---
|
NaNames/VitsPerTrainModel | ---
license: openrail
---
|
tdh87/MixedCOntentV3 | ---
license: apache-2.0
---
|
FreedomIntelligence/2023_Pharmacist_Licensure_Examination-Pharmacy_track | ---
license: apache-2.0
---
The 2023 Chinese National Pharmacist Licensure Examination is divided into two distinct tracks: the Pharmacy track and the Traditional Chinese Medicine (TCM) Pharmacy track. The data provided here pertains to the Pharmacy track examination. It is important to note that this dataset was collected from online sources, and there may be some discrepancies between this data and the actual examination.
- **Repository:** https://github.com/FreedomIntelligence/HuatuoGPT-II |
mayur456/guanaco-llama2-1k | ---
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 1654448
num_examples: 1000
download_size: 966693
dataset_size: 1654448
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "guanaco-llama2-1k"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
ZenMoore/RoleBench | ---
language:
- zh
- en
pretty_name: "RoleBench"
tags:
- Role-Playing
- Instruction
license: "apache-2.0"
---
# RoleBench
- Paper Title: RoleLLM: Benchmarking, Eliciting, and Enhancing Role-Playing Abilities of Large Language Models
- arXiv Link: https://arxiv.org/abs/2310.00746
- Github Repo: https://github.com/InteractiveNLP-Team/RoleLLM-public
Please read our paper for more details about this dataset.
TL;DR: We introduce RoleLLM, a role-playing framework of data construction and evaluation (RoleBench), as well as solutions for both closed-source and open-source models (RoleGPT, RoleLLaMA, RoleGLM). We also propose Context-Instruct for long-text knowledge extraction and role-specific knowledge injection.
---
# List of Roles

Abraham Lincoln, Alvy Singer, Andrew Detmer, Angel, Antonio Salieri, Bai Li (李白,Chinese), Benjamin Button, Blair Waldorf, Bruno Antony, Caden Cotard, Caesar, Coach Eric Taylor, Colonel Hans Landa, Colonel Nathan R. Jessep, Coriolanus, D_Artagnan, David Aames, Doctor Who, Dr. Frank N Furter, Dr. Hannibal Lecter, Emperor (《甄嬛传》皇帝,Chinese), Fei Zhang (张飞,Chinese), Fletcher Reede, Frank T.J. Mackey, Fred Flintstone, Freddy Krueger, Gaston, Gregory House, HAL 9000, Harvey Milk, Imperial Concubine Hua (《甄嬛传》华妃,Chinese), Jack, Jack Sparrow, Jack Torrance, Jackie Moon, James Bond, James Brown, James Carter, Jeff Spicoli, Jigsaw, Jim Morrison, John Coffey, John Dillinger, John Doe, John Keating, Jordan Belfort, Judge Dredd, Judy Hoops, Juno MacGuff, Karl Childers, Klaus Mikaelson, Leonard Shelby, Leroy Jethro Gibbs, Lestat de Lioncourt, Logan, Lucifer Morningstar, Lyn Cassady, Malcolm X, Mark Renton, Mary Sibley, Mater, Michael Scott, Murphy MacManus, Oliver Queen, Pat Solitano, Paul Conroy, Paul Vitti, Peter Parker, Po, Professor G.H. Dorr, Queen Catherine, Queen Elizabeth I, Rachel Lang, Randle McMurphy, Raylan Givens, Robert Angier, Rorschach, Seth, Sheldon Cooper, Sherlock Holmes, Shrek, Sonny, Stanley Ipkiss, Stephen Hawking, Stifler, The Dude, Theodore Twombly, Thor, Tom Ripley, Travis Bickle, Truman Capote, Tugg Speedman, Twilight Sparkle, Tyler Hawkins, Tyrion Lannister, Violet Weston, Wade Wilson, Walt Kowalski, Willie Soke, Wukong Sun (《西游记》孙悟空,Chinese).
---
# Non-Cherry-Picked Demonstrations




---
# Statistics


---
# Download
```bash
git lfs install
git clone https://huggingface.co/datasets/ZenMoore/RoleBench
```
```python
from datasets import load_dataset
dataset = load_dataset("ZenMoore/RoleBench")
```
---
# File Structure
- `instructions-eng`: Contains English instructions (both general and role-specific). `nums.jsonl` indicates the number of role-specific instructions for each role, while `split_info.txt` records how many segments each role's script is divided into during Context-Instruct.
- `instructions-zh`: Similarly for Chinese.
- `profiles-eng`: Contains the description file `desc.json` for all roles, dialogue data files `profiles-eng-{role_name}.jsonl` for each role, and the script names in `scripts.json`.
- `profiles-zh`: Similarly for Chinese.
- `rolebench-eng/instruction-generalization`, `rolebench-eng/role-generalization`, and `rolebench-zh`: All contain two subfolders, `general` and `role_specific`. Each subfolder has training data, testing data, and the RoleGPT baseline results for comparison (see the loading sketch below).
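To work with a single file rather than the whole repository, something like the following should work after cloning (a hedged sketch: the file name `train.jsonl` is an assumption based on the layout above, not confirmed by this card):
```python
from datasets import load_dataset

# Hypothetical path inside the cloned repo; adjust the file name to whatever
# the subfolder actually contains.
general_train = load_dataset(
    "json",
    data_files="RoleBench/rolebench-eng/instruction-generalization/general/train.jsonl",
    split="train",
)
```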
---
# License
Apache 2.0 License.
---
# Citation
Feel free to cite us if you like RoleBench and RoleLLM.
```bibtex
@article{wang2023rolellm,
title = {RoleLLM: Benchmarking, Eliciting, and Enhancing Role-Playing Abilities of Large Language Models},
author = {Zekun Moore Wang and Zhongyuan Peng and Haoran Que and Jiaheng Liu and Wangchunshu Zhou and Yuhan Wu and Hongcheng Guo and Ruitong Gan and Zehao Ni and Man Zhang and Zhaoxiang Zhang and Wanli Ouyang and Ke Xu and Wenhu Chen and Jie Fu and Junran Peng},
year = {2023},
journal = {arXiv preprint arXiv: 2310.00746}
}
```
```bibtex
@article{wang2023interactive,
title={Interactive Natural Language Processing},
author={Wang, Zekun and Zhang, Ge and Yang, Kexin and Shi, Ning and Zhou, Wangchunshu and Hao, Shaochun and Xiong, Guangzheng and Li, Yizhi and Sim, Mong Yuan and Chen, Xiuying and others},
journal={arXiv preprint arXiv:2305.13246},
year={2023}
}
``` |
p1atdev/noz | ---
license: cc0-1.0
---
|
plaguss/the_office_dialogs | ---
license: mit
language:
- en
tags:
- art
pretty_name: the_office_dialogs
size_categories:
- 10K<n<100K
splits:
- name: train
---
*This dataset is under construction*.
It contains the dialogs from [The Office](https://en.wikipedia.org/wiki/The_Office_(American_TV_series)).
Obtained from [this repo](https://github.com/brianbuie/the-office). |
Seanxh/twitter_dataset_1713227313 | ---
dataset_info:
features:
- name: id
dtype: string
- name: tweet_content
dtype: string
- name: user_name
dtype: string
- name: user_id
dtype: string
- name: created_at
dtype: string
- name: url
dtype: string
- name: favourite_count
dtype: int64
- name: scraped_at
dtype: string
- name: image_urls
dtype: string
splits:
- name: train
num_bytes: 252129
num_examples: 580
download_size: 80239
dataset_size: 252129
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
Broomva/instruct-reduced-spa-guc | ---
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 4410827
num_examples: 10000
download_size: 2449083
dataset_size: 4410827
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
EarthnDusk/PorcelainDuskMix | ---
license: creativeml-openrail-m
---
|
DaisyStar004/iCliniq-llama2-7k | ---
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 7229044
num_examples: 7000
download_size: 4177341
dataset_size: 7229044
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "iCliniq-llama2-7k"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
gagan3012/dolphin-retrival-DAWQS-QA-qrels | ---
dataset_info:
features:
- name: query-id
dtype: string
- name: corpus-id
dtype: string
- name: score
dtype: int32
splits:
- name: test
num_bytes: 6895
num_examples: 318
download_size: 3576
dataset_size: 6895
configs:
- config_name: default
data_files:
- split: test
path: data/test-*
---
|
Softage-AI/vqa-tools_dataset | ---
license: mit
language:
- en
---
## VQA Tool-based Dataset
Description:
This dataset offers 12 question-answer pairs for tools like Airbnb, Blender, Excel, and many more. Each prompt links an image of the tool's interface with a user's question and a corresponding answer explaining how to complete the action. This dataset, though limited in its size and scope, serves as an illustration of SoftAge's capabilities in the domain of Visual Question Answering (VQA) for training AI agents.
## Data attributes
- Tool/Software: Name of the tool or software (string)
- Screenshot Url: Link to the image representing the user’s problem (string)
- Prompt: User's question about the tool's functionality (string)
- Response (formal tone & professional tone): Answer to the prompt in the corresponding tone, explaining how to perform the action (string)
- Citations: Multiple reference links used for generating the response to the prompt (a hypothetical record is sketched below).
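For illustration, a single record might look like this (a hypothetical sketch; all field names and values below are invented, not taken from the actual dataset):
```python
# Every value here is invented for illustration only.
record = {
    "tool_software": "Excel",
    "screenshot_url": "https://example.com/excel-freeze-panes.png",
    "prompt": "How do I freeze the top row of my spreadsheet?",
    "response_formal": "Open the View tab, select Freeze Panes, then choose Freeze Top Row.",
    "response_professional": "Head to View > Freeze Panes and pick Freeze Top Row.",
    "citations": ["https://example.com/freeze-panes-guide"],
}
```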
## Dataset Source
This dataset is curated by the delivery team @SoftAge
## Limitations and Biases
- Limited size (12 samples) might not cover the full range of functionalities for each tool or software.
- The chosen tools and questions might reflect specific user interests or focus areas.
- The answer or response might not address all the potential complexities of the task.
## Potential Uses
Training VQA models to understand and answer user questions about different software functionalities based on visuals. |
CMPG313/absalom_voice_dataset | ---
dataset_info:
features:
- name: file_name
dtype: string
- name: audio
dtype: audio
- name: transcription
dtype: string
splits:
- name: train
num_bytes: 29597565.792
num_examples: 3628
download_size: 69397717
dataset_size: 29597565.792
---
# Dataset Card for "absalom_voice_dataset"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
result-kand2-sdxl-wuerst-karlo/c2696aec | ---
dataset_info:
features:
- name: result
dtype: string
- name: id
dtype: int64
splits:
- name: train
num_bytes: 169
num_examples: 10
download_size: 1324
dataset_size: 169
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "c2696aec"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
aengusl/noise0_alpaca_sleeper_agents_toy_train_v4 | ---
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 5443360
num_examples: 15661
download_size: 2524561
dataset_size: 5443360
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
plncmm/wl-family-member | ---
license: cc-by-nc-4.0
---
|
deepghs/quality_rlhf | ---
license: openrail
task_categories:
- reinforcement-learning
tags:
- art
- not-for-all-audiences
--- |
qazisaad/rw_processed_ds | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
dataset_info:
features:
- name: input_ids
sequence: int32
- name: token_type_ids
sequence: int8
- name: attention_mask
sequence: int8
- name: labels
sequence: float64
splits:
- name: train
num_bytes: 79056000
num_examples: 16200
- name: test
num_bytes: 8784000
num_examples: 1800
download_size: 16937368
dataset_size: 87840000
---
# Dataset Card for "rw_processed_ds"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
metinovadilet/copy_of_alpaca_kr | ---
license: apache-2.0
---
|
heliosprime/twitter_dataset_1713063641 | ---
dataset_info:
features:
- name: id
dtype: string
- name: tweet_content
dtype: string
- name: user_name
dtype: string
- name: user_id
dtype: string
- name: created_at
dtype: string
- name: url
dtype: string
- name: favourite_count
dtype: int64
- name: scraped_at
dtype: string
- name: image_urls
dtype: string
splits:
- name: train
num_bytes: 13305
num_examples: 29
download_size: 9763
dataset_size: 13305
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "twitter_dataset_1713063641"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
ywnl/disney_images | ---
dataset_info:
features:
- name: image
dtype: image
- name: label
dtype:
class_label:
names:
'0': Animals:Objects
'1': Characters
'2': Landscapes
splits:
- name: train
num_bytes: 24890929.0
num_examples: 102
download_size: 24892969
dataset_size: 24890929.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
PhilKey/llama2-openrewrite | ---
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 1491132
num_examples: 255
download_size: 373032
dataset_size: 1491132
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
open-llm-leaderboard/details_jebcarter__psyonic-cetacean-20B | ---
pretty_name: Evaluation run of jebcarter/psyonic-cetacean-20B
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [jebcarter/psyonic-cetacean-20B](https://huggingface.co/jebcarter/psyonic-cetacean-20B)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_jebcarter__psyonic-cetacean-20B\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-12-04T20:41:51.584700](https://huggingface.co/datasets/open-llm-leaderboard/details_jebcarter__psyonic-cetacean-20B/blob/main/results_2023-12-04T20-41-51.584700.json) (note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each one in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.5935200283760108,\n\
\ \"acc_stderr\": 0.03289023551450696,\n \"acc_norm\": 0.6017961208576313,\n\
\ \"acc_norm_stderr\": 0.03361696714318325,\n \"mc1\": 0.397796817625459,\n\
\ \"mc1_stderr\": 0.01713393424855964,\n \"mc2\": 0.5754737295645932,\n\
\ \"mc2_stderr\": 0.01561942525764945\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.5895904436860068,\n \"acc_stderr\": 0.014374922192642664,\n\
\ \"acc_norm\": 0.6356655290102389,\n \"acc_norm_stderr\": 0.014063260279882419\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6783509261103365,\n\
\ \"acc_stderr\": 0.0046615449915830345,\n \"acc_norm\": 0.861979685321649,\n\
\ \"acc_norm_stderr\": 0.0034421638433628794\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \
\ \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6148148148148148,\n\
\ \"acc_stderr\": 0.04203921040156279,\n \"acc_norm\": 0.6148148148148148,\n\
\ \"acc_norm_stderr\": 0.04203921040156279\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.6907894736842105,\n \"acc_stderr\": 0.037610708698674805,\n\
\ \"acc_norm\": 0.6907894736842105,\n \"acc_norm_stderr\": 0.037610708698674805\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.69,\n\
\ \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\": 0.69,\n \
\ \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.6528301886792452,\n \"acc_stderr\": 0.029300101705549655,\n\
\ \"acc_norm\": 0.6528301886792452,\n \"acc_norm_stderr\": 0.029300101705549655\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.6736111111111112,\n\
\ \"acc_stderr\": 0.03921067198982266,\n \"acc_norm\": 0.6736111111111112,\n\
\ \"acc_norm_stderr\": 0.03921067198982266\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.41,\n \"acc_stderr\": 0.049431107042371025,\n \
\ \"acc_norm\": 0.41,\n \"acc_norm_stderr\": 0.049431107042371025\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"\
acc\": 0.43,\n \"acc_stderr\": 0.04975698519562428,\n \"acc_norm\"\
: 0.43,\n \"acc_norm_stderr\": 0.04975698519562428\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.38,\n \"acc_stderr\": 0.048783173121456316,\n \
\ \"acc_norm\": 0.38,\n \"acc_norm_stderr\": 0.048783173121456316\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.5433526011560693,\n\
\ \"acc_stderr\": 0.03798106566014498,\n \"acc_norm\": 0.5433526011560693,\n\
\ \"acc_norm_stderr\": 0.03798106566014498\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.3431372549019608,\n \"acc_stderr\": 0.04724007352383888,\n\
\ \"acc_norm\": 0.3431372549019608,\n \"acc_norm_stderr\": 0.04724007352383888\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.7,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.7,\n\
\ \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.5191489361702127,\n \"acc_stderr\": 0.032662042990646796,\n\
\ \"acc_norm\": 0.5191489361702127,\n \"acc_norm_stderr\": 0.032662042990646796\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.34210526315789475,\n\
\ \"acc_stderr\": 0.04462917535336937,\n \"acc_norm\": 0.34210526315789475,\n\
\ \"acc_norm_stderr\": 0.04462917535336937\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5517241379310345,\n \"acc_stderr\": 0.04144311810878151,\n\
\ \"acc_norm\": 0.5517241379310345,\n \"acc_norm_stderr\": 0.04144311810878151\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.36507936507936506,\n \"acc_stderr\": 0.024796060602699968,\n \"\
acc_norm\": 0.36507936507936506,\n \"acc_norm_stderr\": 0.024796060602699968\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.3492063492063492,\n\
\ \"acc_stderr\": 0.04263906892795133,\n \"acc_norm\": 0.3492063492063492,\n\
\ \"acc_norm_stderr\": 0.04263906892795133\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.35,\n \"acc_stderr\": 0.0479372485441102,\n \
\ \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.0479372485441102\n },\n\
\ \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7064516129032258,\n\
\ \"acc_stderr\": 0.025906087021319295,\n \"acc_norm\": 0.7064516129032258,\n\
\ \"acc_norm_stderr\": 0.025906087021319295\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.4729064039408867,\n \"acc_stderr\": 0.03512819077876106,\n\
\ \"acc_norm\": 0.4729064039408867,\n \"acc_norm_stderr\": 0.03512819077876106\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.55,\n \"acc_stderr\": 0.049999999999999996,\n \"acc_norm\"\
: 0.55,\n \"acc_norm_stderr\": 0.049999999999999996\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7333333333333333,\n \"acc_stderr\": 0.03453131801885415,\n\
\ \"acc_norm\": 0.7333333333333333,\n \"acc_norm_stderr\": 0.03453131801885415\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.7222222222222222,\n \"acc_stderr\": 0.03191178226713547,\n \"\
acc_norm\": 0.7222222222222222,\n \"acc_norm_stderr\": 0.03191178226713547\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.8860103626943006,\n \"acc_stderr\": 0.022935144053919436,\n\
\ \"acc_norm\": 0.8860103626943006,\n \"acc_norm_stderr\": 0.022935144053919436\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.6282051282051282,\n \"acc_stderr\": 0.024503472557110946,\n\
\ \"acc_norm\": 0.6282051282051282,\n \"acc_norm_stderr\": 0.024503472557110946\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.31851851851851853,\n \"acc_stderr\": 0.028406533090608463,\n \
\ \"acc_norm\": 0.31851851851851853,\n \"acc_norm_stderr\": 0.028406533090608463\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.6512605042016807,\n \"acc_stderr\": 0.030956636328566545,\n\
\ \"acc_norm\": 0.6512605042016807,\n \"acc_norm_stderr\": 0.030956636328566545\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.31125827814569534,\n \"acc_stderr\": 0.03780445850526732,\n \"\
acc_norm\": 0.31125827814569534,\n \"acc_norm_stderr\": 0.03780445850526732\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.7981651376146789,\n \"acc_stderr\": 0.017208579357787572,\n \"\
acc_norm\": 0.7981651376146789,\n \"acc_norm_stderr\": 0.017208579357787572\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.4861111111111111,\n \"acc_stderr\": 0.03408655867977748,\n \"\
acc_norm\": 0.4861111111111111,\n \"acc_norm_stderr\": 0.03408655867977748\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.7647058823529411,\n \"acc_stderr\": 0.029771775228145628,\n \"\
acc_norm\": 0.7647058823529411,\n \"acc_norm_stderr\": 0.029771775228145628\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.8016877637130801,\n \"acc_stderr\": 0.025955020841621112,\n \
\ \"acc_norm\": 0.8016877637130801,\n \"acc_norm_stderr\": 0.025955020841621112\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6188340807174888,\n\
\ \"acc_stderr\": 0.032596251184168284,\n \"acc_norm\": 0.6188340807174888,\n\
\ \"acc_norm_stderr\": 0.032596251184168284\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.732824427480916,\n \"acc_stderr\": 0.03880848301082396,\n\
\ \"acc_norm\": 0.732824427480916,\n \"acc_norm_stderr\": 0.03880848301082396\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.7603305785123967,\n \"acc_stderr\": 0.03896878985070417,\n \"\
acc_norm\": 0.7603305785123967,\n \"acc_norm_stderr\": 0.03896878985070417\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.75,\n\
\ \"acc_stderr\": 0.04186091791394607,\n \"acc_norm\": 0.75,\n \
\ \"acc_norm_stderr\": 0.04186091791394607\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.6993865030674846,\n \"acc_stderr\": 0.03602511318806771,\n\
\ \"acc_norm\": 0.6993865030674846,\n \"acc_norm_stderr\": 0.03602511318806771\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.35714285714285715,\n\
\ \"acc_stderr\": 0.04547960999764376,\n \"acc_norm\": 0.35714285714285715,\n\
\ \"acc_norm_stderr\": 0.04547960999764376\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7961165048543689,\n \"acc_stderr\": 0.0398913985953177,\n\
\ \"acc_norm\": 0.7961165048543689,\n \"acc_norm_stderr\": 0.0398913985953177\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8632478632478633,\n\
\ \"acc_stderr\": 0.022509033937077812,\n \"acc_norm\": 0.8632478632478633,\n\
\ \"acc_norm_stderr\": 0.022509033937077812\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.59,\n \"acc_stderr\": 0.04943110704237102,\n \
\ \"acc_norm\": 0.59,\n \"acc_norm_stderr\": 0.04943110704237102\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.7879948914431673,\n\
\ \"acc_stderr\": 0.014616099385833688,\n \"acc_norm\": 0.7879948914431673,\n\
\ \"acc_norm_stderr\": 0.014616099385833688\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.6878612716763006,\n \"acc_stderr\": 0.02494679222527231,\n\
\ \"acc_norm\": 0.6878612716763006,\n \"acc_norm_stderr\": 0.02494679222527231\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.22346368715083798,\n\
\ \"acc_stderr\": 0.013932068638579773,\n \"acc_norm\": 0.22346368715083798,\n\
\ \"acc_norm_stderr\": 0.013932068638579773\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.6470588235294118,\n \"acc_stderr\": 0.02736359328468497,\n\
\ \"acc_norm\": 0.6470588235294118,\n \"acc_norm_stderr\": 0.02736359328468497\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6977491961414791,\n\
\ \"acc_stderr\": 0.026082700695399662,\n \"acc_norm\": 0.6977491961414791,\n\
\ \"acc_norm_stderr\": 0.026082700695399662\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.7037037037037037,\n \"acc_stderr\": 0.02540719779889017,\n\
\ \"acc_norm\": 0.7037037037037037,\n \"acc_norm_stderr\": 0.02540719779889017\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.4787234042553192,\n \"acc_stderr\": 0.029800481645628693,\n \
\ \"acc_norm\": 0.4787234042553192,\n \"acc_norm_stderr\": 0.029800481645628693\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.45697522816166886,\n\
\ \"acc_stderr\": 0.012722869501611419,\n \"acc_norm\": 0.45697522816166886,\n\
\ \"acc_norm_stderr\": 0.012722869501611419\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.6066176470588235,\n \"acc_stderr\": 0.029674288281311155,\n\
\ \"acc_norm\": 0.6066176470588235,\n \"acc_norm_stderr\": 0.029674288281311155\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6356209150326797,\n \"acc_stderr\": 0.019469518221573702,\n \
\ \"acc_norm\": 0.6356209150326797,\n \"acc_norm_stderr\": 0.019469518221573702\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6272727272727273,\n\
\ \"acc_stderr\": 0.04631381319425465,\n \"acc_norm\": 0.6272727272727273,\n\
\ \"acc_norm_stderr\": 0.04631381319425465\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.6857142857142857,\n \"acc_stderr\": 0.029719329422417475,\n\
\ \"acc_norm\": 0.6857142857142857,\n \"acc_norm_stderr\": 0.029719329422417475\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.7611940298507462,\n\
\ \"acc_stderr\": 0.03014777593540922,\n \"acc_norm\": 0.7611940298507462,\n\
\ \"acc_norm_stderr\": 0.03014777593540922\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.85,\n \"acc_stderr\": 0.0358870281282637,\n \
\ \"acc_norm\": 0.85,\n \"acc_norm_stderr\": 0.0358870281282637\n },\n\
\ \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.4819277108433735,\n\
\ \"acc_stderr\": 0.03889951252827216,\n \"acc_norm\": 0.4819277108433735,\n\
\ \"acc_norm_stderr\": 0.03889951252827216\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.7719298245614035,\n \"acc_stderr\": 0.032180937956023566,\n\
\ \"acc_norm\": 0.7719298245614035,\n \"acc_norm_stderr\": 0.032180937956023566\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.397796817625459,\n\
\ \"mc1_stderr\": 0.01713393424855964,\n \"mc2\": 0.5754737295645932,\n\
\ \"mc2_stderr\": 0.01561942525764945\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7813733228097869,\n \"acc_stderr\": 0.01161619821577323\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.1470811220621683,\n \
\ \"acc_stderr\": 0.009756063660359868\n }\n}\n```"
repo_url: https://huggingface.co/jebcarter/psyonic-cetacean-20B
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_12_04T20_41_51.584700
path:
- '**/details_harness|arc:challenge|25_2023-12-04T20-41-51.584700.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-12-04T20-41-51.584700.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_12_04T20_41_51.584700
path:
- '**/details_harness|gsm8k|5_2023-12-04T20-41-51.584700.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-12-04T20-41-51.584700.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_12_04T20_41_51.584700
path:
- '**/details_harness|hellaswag|10_2023-12-04T20-41-51.584700.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-12-04T20-41-51.584700.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_12_04T20_41_51.584700
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-04T20-41-51.584700.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-04T20-41-51.584700.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-04T20-41-51.584700.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-04T20-41-51.584700.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-04T20-41-51.584700.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-04T20-41-51.584700.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-04T20-41-51.584700.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-04T20-41-51.584700.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-04T20-41-51.584700.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-04T20-41-51.584700.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-04T20-41-51.584700.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-04T20-41-51.584700.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-04T20-41-51.584700.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-04T20-41-51.584700.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-04T20-41-51.584700.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-04T20-41-51.584700.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-04T20-41-51.584700.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-04T20-41-51.584700.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-04T20-41-51.584700.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-04T20-41-51.584700.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-04T20-41-51.584700.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-04T20-41-51.584700.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-04T20-41-51.584700.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-04T20-41-51.584700.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-04T20-41-51.584700.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-04T20-41-51.584700.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-04T20-41-51.584700.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-04T20-41-51.584700.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-04T20-41-51.584700.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-04T20-41-51.584700.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-04T20-41-51.584700.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-04T20-41-51.584700.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-04T20-41-51.584700.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-04T20-41-51.584700.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-12-04T20-41-51.584700.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-04T20-41-51.584700.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-04T20-41-51.584700.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-04T20-41-51.584700.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-12-04T20-41-51.584700.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-12-04T20-41-51.584700.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-04T20-41-51.584700.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-04T20-41-51.584700.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-04T20-41-51.584700.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-04T20-41-51.584700.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-04T20-41-51.584700.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-04T20-41-51.584700.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-04T20-41-51.584700.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-04T20-41-51.584700.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-04T20-41-51.584700.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-04T20-41-51.584700.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-04T20-41-51.584700.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-04T20-41-51.584700.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-04T20-41-51.584700.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-12-04T20-41-51.584700.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-04T20-41-51.584700.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-12-04T20-41-51.584700.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-04T20-41-51.584700.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-04T20-41-51.584700.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-04T20-41-51.584700.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-04T20-41-51.584700.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-04T20-41-51.584700.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-04T20-41-51.584700.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-04T20-41-51.584700.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-04T20-41-51.584700.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-04T20-41-51.584700.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-04T20-41-51.584700.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-04T20-41-51.584700.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-04T20-41-51.584700.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-04T20-41-51.584700.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-04T20-41-51.584700.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-04T20-41-51.584700.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-04T20-41-51.584700.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-04T20-41-51.584700.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-04T20-41-51.584700.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-04T20-41-51.584700.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-04T20-41-51.584700.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-04T20-41-51.584700.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-04T20-41-51.584700.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-04T20-41-51.584700.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-04T20-41-51.584700.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-04T20-41-51.584700.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-04T20-41-51.584700.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-04T20-41-51.584700.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-04T20-41-51.584700.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-04T20-41-51.584700.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-04T20-41-51.584700.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-04T20-41-51.584700.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-04T20-41-51.584700.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-04T20-41-51.584700.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-04T20-41-51.584700.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-04T20-41-51.584700.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-12-04T20-41-51.584700.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-04T20-41-51.584700.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-04T20-41-51.584700.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-04T20-41-51.584700.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-12-04T20-41-51.584700.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-12-04T20-41-51.584700.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-04T20-41-51.584700.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-04T20-41-51.584700.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-04T20-41-51.584700.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-04T20-41-51.584700.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-04T20-41-51.584700.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-04T20-41-51.584700.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-04T20-41-51.584700.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-04T20-41-51.584700.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-04T20-41-51.584700.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-04T20-41-51.584700.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-04T20-41-51.584700.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-04T20-41-51.584700.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-04T20-41-51.584700.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-12-04T20-41-51.584700.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-04T20-41-51.584700.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-12-04T20-41-51.584700.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-04T20-41-51.584700.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_12_04T20_41_51.584700
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-04T20-41-51.584700.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-04T20-41-51.584700.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_12_04T20_41_51.584700
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-04T20-41-51.584700.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-04T20-41-51.584700.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_12_04T20_41_51.584700
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-04T20-41-51.584700.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-04T20-41-51.584700.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_12_04T20_41_51.584700
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-04T20-41-51.584700.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-04T20-41-51.584700.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_12_04T20_41_51.584700
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-04T20-41-51.584700.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-04T20-41-51.584700.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_12_04T20_41_51.584700
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-04T20-41-51.584700.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-04T20-41-51.584700.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_12_04T20_41_51.584700
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-04T20-41-51.584700.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-04T20-41-51.584700.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_12_04T20_41_51.584700
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-04T20-41-51.584700.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-04T20-41-51.584700.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_12_04T20_41_51.584700
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-04T20-41-51.584700.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-04T20-41-51.584700.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_12_04T20_41_51.584700
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-04T20-41-51.584700.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-04T20-41-51.584700.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_12_04T20_41_51.584700
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-04T20-41-51.584700.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-04T20-41-51.584700.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_12_04T20_41_51.584700
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-04T20-41-51.584700.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-04T20-41-51.584700.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_12_04T20_41_51.584700
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-04T20-41-51.584700.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-04T20-41-51.584700.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_12_04T20_41_51.584700
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-04T20-41-51.584700.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-04T20-41-51.584700.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_12_04T20_41_51.584700
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-04T20-41-51.584700.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-04T20-41-51.584700.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_12_04T20_41_51.584700
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-04T20-41-51.584700.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-04T20-41-51.584700.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_12_04T20_41_51.584700
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-04T20-41-51.584700.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-04T20-41-51.584700.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_12_04T20_41_51.584700
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-04T20-41-51.584700.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-04T20-41-51.584700.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_12_04T20_41_51.584700
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-04T20-41-51.584700.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-04T20-41-51.584700.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_12_04T20_41_51.584700
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-04T20-41-51.584700.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-04T20-41-51.584700.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_12_04T20_41_51.584700
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-04T20-41-51.584700.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-04T20-41-51.584700.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_12_04T20_41_51.584700
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-04T20-41-51.584700.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-04T20-41-51.584700.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_12_04T20_41_51.584700
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-04T20-41-51.584700.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-04T20-41-51.584700.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_12_04T20_41_51.584700
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-04T20-41-51.584700.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-04T20-41-51.584700.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_12_04T20_41_51.584700
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-04T20-41-51.584700.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-04T20-41-51.584700.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_12_04T20_41_51.584700
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-04T20-41-51.584700.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-04T20-41-51.584700.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_12_04T20_41_51.584700
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-04T20-41-51.584700.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-04T20-41-51.584700.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_12_04T20_41_51.584700
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-04T20-41-51.584700.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-04T20-41-51.584700.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_12_04T20_41_51.584700
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-04T20-41-51.584700.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-04T20-41-51.584700.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_12_04T20_41_51.584700
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-04T20-41-51.584700.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-04T20-41-51.584700.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_12_04T20_41_51.584700
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-04T20-41-51.584700.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-04T20-41-51.584700.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_12_04T20_41_51.584700
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-04T20-41-51.584700.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-04T20-41-51.584700.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_12_04T20_41_51.584700
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-04T20-41-51.584700.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-04T20-41-51.584700.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_12_04T20_41_51.584700
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-04T20-41-51.584700.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-04T20-41-51.584700.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_12_04T20_41_51.584700
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-12-04T20-41-51.584700.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-12-04T20-41-51.584700.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_12_04T20_41_51.584700
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-04T20-41-51.584700.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-04T20-41-51.584700.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_12_04T20_41_51.584700
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-04T20-41-51.584700.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-04T20-41-51.584700.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_12_04T20_41_51.584700
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-04T20-41-51.584700.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-04T20-41-51.584700.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_12_04T20_41_51.584700
path:
- '**/details_harness|hendrycksTest-management|5_2023-12-04T20-41-51.584700.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-12-04T20-41-51.584700.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_12_04T20_41_51.584700
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-12-04T20-41-51.584700.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-12-04T20-41-51.584700.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_12_04T20_41_51.584700
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-04T20-41-51.584700.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-04T20-41-51.584700.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_12_04T20_41_51.584700
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-04T20-41-51.584700.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-04T20-41-51.584700.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_12_04T20_41_51.584700
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-04T20-41-51.584700.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-04T20-41-51.584700.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_12_04T20_41_51.584700
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-04T20-41-51.584700.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-04T20-41-51.584700.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_12_04T20_41_51.584700
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-04T20-41-51.584700.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-04T20-41-51.584700.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_12_04T20_41_51.584700
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-04T20-41-51.584700.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-04T20-41-51.584700.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_12_04T20_41_51.584700
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-04T20-41-51.584700.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-04T20-41-51.584700.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_12_04T20_41_51.584700
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-04T20-41-51.584700.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-04T20-41-51.584700.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_12_04T20_41_51.584700
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-04T20-41-51.584700.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-04T20-41-51.584700.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_12_04T20_41_51.584700
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-04T20-41-51.584700.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-04T20-41-51.584700.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_12_04T20_41_51.584700
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-04T20-41-51.584700.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-04T20-41-51.584700.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_12_04T20_41_51.584700
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-04T20-41-51.584700.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-04T20-41-51.584700.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_12_04T20_41_51.584700
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-04T20-41-51.584700.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-04T20-41-51.584700.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_12_04T20_41_51.584700
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-12-04T20-41-51.584700.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-12-04T20-41-51.584700.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_12_04T20_41_51.584700
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-04T20-41-51.584700.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-04T20-41-51.584700.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_12_04T20_41_51.584700
path:
- '**/details_harness|hendrycksTest-virology|5_2023-12-04T20-41-51.584700.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-12-04T20-41-51.584700.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_12_04T20_41_51.584700
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-04T20-41-51.584700.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-04T20-41-51.584700.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_12_04T20_41_51.584700
path:
- '**/details_harness|truthfulqa:mc|0_2023-12-04T20-41-51.584700.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-12-04T20-41-51.584700.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_12_04T20_41_51.584700
path:
- '**/details_harness|winogrande|5_2023-12-04T20-41-51.584700.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-12-04T20-41-51.584700.parquet'
- config_name: results
data_files:
- split: 2023_12_04T20_41_51.584700
path:
- results_2023-12-04T20-41-51.584700.parquet
- split: latest
path:
- results_2023-12-04T20-41-51.584700.parquet
---
# Dataset Card for Evaluation run of jebcarter/psyonic-cetacean-20B
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/jebcarter/psyonic-cetacean-20B
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [jebcarter/psyonic-cetacean-20B](https://huggingface.co/jebcarter/psyonic-cetacean-20B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_jebcarter__psyonic-cetacean-20B",
"harness_winogrande_5",
split="train")
```
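Any other configuration and split listed above can be loaded the same way. For instance, a minimal sketch for pulling the aggregated metrics via the "results" configuration and the "latest" split (both defined in the configs above):
```python
from datasets import load_dataset

# The "results" configuration stores the aggregated metrics of the run;
# the "latest" split always points to the most recent evaluation.
results = load_dataset(
    "open-llm-leaderboard/details_jebcarter__psyonic-cetacean-20B",
    "results",
    split="latest",
)
```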
## Latest results
These are the [latest results from run 2023-12-04T20:41:51.584700](https://huggingface.co/datasets/open-llm-leaderboard/details_jebcarter__psyonic-cetacean-20B/blob/main/results_2023-12-04T20-41-51.584700.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each one in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.5935200283760108,
"acc_stderr": 0.03289023551450696,
"acc_norm": 0.6017961208576313,
"acc_norm_stderr": 0.03361696714318325,
"mc1": 0.397796817625459,
"mc1_stderr": 0.01713393424855964,
"mc2": 0.5754737295645932,
"mc2_stderr": 0.01561942525764945
},
"harness|arc:challenge|25": {
"acc": 0.5895904436860068,
"acc_stderr": 0.014374922192642664,
"acc_norm": 0.6356655290102389,
"acc_norm_stderr": 0.014063260279882419
},
"harness|hellaswag|10": {
"acc": 0.6783509261103365,
"acc_stderr": 0.0046615449915830345,
"acc_norm": 0.861979685321649,
"acc_norm_stderr": 0.0034421638433628794
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6148148148148148,
"acc_stderr": 0.04203921040156279,
"acc_norm": 0.6148148148148148,
"acc_norm_stderr": 0.04203921040156279
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6907894736842105,
"acc_stderr": 0.037610708698674805,
"acc_norm": 0.6907894736842105,
"acc_norm_stderr": 0.037610708698674805
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.69,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.69,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6528301886792452,
"acc_stderr": 0.029300101705549655,
"acc_norm": 0.6528301886792452,
"acc_norm_stderr": 0.029300101705549655
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.6736111111111112,
"acc_stderr": 0.03921067198982266,
"acc_norm": 0.6736111111111112,
"acc_norm_stderr": 0.03921067198982266
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.41,
"acc_stderr": 0.049431107042371025,
"acc_norm": 0.41,
"acc_norm_stderr": 0.049431107042371025
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.43,
"acc_stderr": 0.04975698519562428,
"acc_norm": 0.43,
"acc_norm_stderr": 0.04975698519562428
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.38,
"acc_stderr": 0.048783173121456316,
"acc_norm": 0.38,
"acc_norm_stderr": 0.048783173121456316
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.5433526011560693,
"acc_stderr": 0.03798106566014498,
"acc_norm": 0.5433526011560693,
"acc_norm_stderr": 0.03798106566014498
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.3431372549019608,
"acc_stderr": 0.04724007352383888,
"acc_norm": 0.3431372549019608,
"acc_norm_stderr": 0.04724007352383888
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.7,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.7,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5191489361702127,
"acc_stderr": 0.032662042990646796,
"acc_norm": 0.5191489361702127,
"acc_norm_stderr": 0.032662042990646796
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.34210526315789475,
"acc_stderr": 0.04462917535336937,
"acc_norm": 0.34210526315789475,
"acc_norm_stderr": 0.04462917535336937
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5517241379310345,
"acc_stderr": 0.04144311810878151,
"acc_norm": 0.5517241379310345,
"acc_norm_stderr": 0.04144311810878151
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.36507936507936506,
"acc_stderr": 0.024796060602699968,
"acc_norm": 0.36507936507936506,
"acc_norm_stderr": 0.024796060602699968
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.3492063492063492,
"acc_stderr": 0.04263906892795133,
"acc_norm": 0.3492063492063492,
"acc_norm_stderr": 0.04263906892795133
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.35,
"acc_stderr": 0.0479372485441102,
"acc_norm": 0.35,
"acc_norm_stderr": 0.0479372485441102
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7064516129032258,
"acc_stderr": 0.025906087021319295,
"acc_norm": 0.7064516129032258,
"acc_norm_stderr": 0.025906087021319295
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.4729064039408867,
"acc_stderr": 0.03512819077876106,
"acc_norm": 0.4729064039408867,
"acc_norm_stderr": 0.03512819077876106
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.55,
"acc_stderr": 0.049999999999999996,
"acc_norm": 0.55,
"acc_norm_stderr": 0.049999999999999996
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7333333333333333,
"acc_stderr": 0.03453131801885415,
"acc_norm": 0.7333333333333333,
"acc_norm_stderr": 0.03453131801885415
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7222222222222222,
"acc_stderr": 0.03191178226713547,
"acc_norm": 0.7222222222222222,
"acc_norm_stderr": 0.03191178226713547
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8860103626943006,
"acc_stderr": 0.022935144053919436,
"acc_norm": 0.8860103626943006,
"acc_norm_stderr": 0.022935144053919436
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6282051282051282,
"acc_stderr": 0.024503472557110946,
"acc_norm": 0.6282051282051282,
"acc_norm_stderr": 0.024503472557110946
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.31851851851851853,
"acc_stderr": 0.028406533090608463,
"acc_norm": 0.31851851851851853,
"acc_norm_stderr": 0.028406533090608463
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6512605042016807,
"acc_stderr": 0.030956636328566545,
"acc_norm": 0.6512605042016807,
"acc_norm_stderr": 0.030956636328566545
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.31125827814569534,
"acc_stderr": 0.03780445850526732,
"acc_norm": 0.31125827814569534,
"acc_norm_stderr": 0.03780445850526732
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.7981651376146789,
"acc_stderr": 0.017208579357787572,
"acc_norm": 0.7981651376146789,
"acc_norm_stderr": 0.017208579357787572
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.4861111111111111,
"acc_stderr": 0.03408655867977748,
"acc_norm": 0.4861111111111111,
"acc_norm_stderr": 0.03408655867977748
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7647058823529411,
"acc_stderr": 0.029771775228145628,
"acc_norm": 0.7647058823529411,
"acc_norm_stderr": 0.029771775228145628
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.8016877637130801,
"acc_stderr": 0.025955020841621112,
"acc_norm": 0.8016877637130801,
"acc_norm_stderr": 0.025955020841621112
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6188340807174888,
"acc_stderr": 0.032596251184168284,
"acc_norm": 0.6188340807174888,
"acc_norm_stderr": 0.032596251184168284
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.732824427480916,
"acc_stderr": 0.03880848301082396,
"acc_norm": 0.732824427480916,
"acc_norm_stderr": 0.03880848301082396
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7603305785123967,
"acc_stderr": 0.03896878985070417,
"acc_norm": 0.7603305785123967,
"acc_norm_stderr": 0.03896878985070417
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.75,
"acc_stderr": 0.04186091791394607,
"acc_norm": 0.75,
"acc_norm_stderr": 0.04186091791394607
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.6993865030674846,
"acc_stderr": 0.03602511318806771,
"acc_norm": 0.6993865030674846,
"acc_norm_stderr": 0.03602511318806771
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.35714285714285715,
"acc_stderr": 0.04547960999764376,
"acc_norm": 0.35714285714285715,
"acc_norm_stderr": 0.04547960999764376
},
"harness|hendrycksTest-management|5": {
"acc": 0.7961165048543689,
"acc_stderr": 0.0398913985953177,
"acc_norm": 0.7961165048543689,
"acc_norm_stderr": 0.0398913985953177
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8632478632478633,
"acc_stderr": 0.022509033937077812,
"acc_norm": 0.8632478632478633,
"acc_norm_stderr": 0.022509033937077812
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.59,
"acc_stderr": 0.04943110704237102,
"acc_norm": 0.59,
"acc_norm_stderr": 0.04943110704237102
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.7879948914431673,
"acc_stderr": 0.014616099385833688,
"acc_norm": 0.7879948914431673,
"acc_norm_stderr": 0.014616099385833688
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.6878612716763006,
"acc_stderr": 0.02494679222527231,
"acc_norm": 0.6878612716763006,
"acc_norm_stderr": 0.02494679222527231
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.22346368715083798,
"acc_stderr": 0.013932068638579773,
"acc_norm": 0.22346368715083798,
"acc_norm_stderr": 0.013932068638579773
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.6470588235294118,
"acc_stderr": 0.02736359328468497,
"acc_norm": 0.6470588235294118,
"acc_norm_stderr": 0.02736359328468497
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6977491961414791,
"acc_stderr": 0.026082700695399662,
"acc_norm": 0.6977491961414791,
"acc_norm_stderr": 0.026082700695399662
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7037037037037037,
"acc_stderr": 0.02540719779889017,
"acc_norm": 0.7037037037037037,
"acc_norm_stderr": 0.02540719779889017
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.4787234042553192,
"acc_stderr": 0.029800481645628693,
"acc_norm": 0.4787234042553192,
"acc_norm_stderr": 0.029800481645628693
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.45697522816166886,
"acc_stderr": 0.012722869501611419,
"acc_norm": 0.45697522816166886,
"acc_norm_stderr": 0.012722869501611419
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6066176470588235,
"acc_stderr": 0.029674288281311155,
"acc_norm": 0.6066176470588235,
"acc_norm_stderr": 0.029674288281311155
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6356209150326797,
"acc_stderr": 0.019469518221573702,
"acc_norm": 0.6356209150326797,
"acc_norm_stderr": 0.019469518221573702
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6272727272727273,
"acc_stderr": 0.04631381319425465,
"acc_norm": 0.6272727272727273,
"acc_norm_stderr": 0.04631381319425465
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.6857142857142857,
"acc_stderr": 0.029719329422417475,
"acc_norm": 0.6857142857142857,
"acc_norm_stderr": 0.029719329422417475
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.7611940298507462,
"acc_stderr": 0.03014777593540922,
"acc_norm": 0.7611940298507462,
"acc_norm_stderr": 0.03014777593540922
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.85,
"acc_stderr": 0.0358870281282637,
"acc_norm": 0.85,
"acc_norm_stderr": 0.0358870281282637
},
"harness|hendrycksTest-virology|5": {
"acc": 0.4819277108433735,
"acc_stderr": 0.03889951252827216,
"acc_norm": 0.4819277108433735,
"acc_norm_stderr": 0.03889951252827216
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.7719298245614035,
"acc_stderr": 0.032180937956023566,
"acc_norm": 0.7719298245614035,
"acc_norm_stderr": 0.032180937956023566
},
"harness|truthfulqa:mc|0": {
"mc1": 0.397796817625459,
"mc1_stderr": 0.01713393424855964,
"mc2": 0.5754737295645932,
"mc2_stderr": 0.01561942525764945
},
"harness|winogrande|5": {
"acc": 0.7813733228097869,
"acc_stderr": 0.01161619821577323
},
"harness|gsm8k|5": {
"acc": 0.1470811220621683,
"acc_stderr": 0.009756063660359868
}
}
```
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
Back-up/chung-khoan-v2-3-final | ---
dataset_info:
features:
- name: url
dtype: string
- name: title
dtype: string
- name: date
dtype: string
- name: view
struct:
- name: number_of_response
dtype: string
- name: number_of_view
dtype: string
- name: content
list:
- name: date_comment
dtype: string
- name: res
dtype: string
splits:
- name: train
num_bytes: 250939941
num_examples: 52461
download_size: 88965974
dataset_size: 250939941
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
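The frontmatter above fully specifies the schema. A minimal loading sketch with the `datasets` library (assuming the repo is publicly loadable; field semantics are inferred from the feature names, not documented in this card):
```python
from datasets import load_dataset

# Repo id as given above; "train" is the only declared split.
ds = load_dataset("Back-up/chung-khoan-v2-3-final", split="train")

row = ds[0]
print(row["url"], row["title"], row["date"])
# `view` is a struct; both counters are stored as strings per the schema.
print(row["view"]["number_of_view"], row["view"]["number_of_response"])
# `content` is a list of structs with `date_comment` and `res` fields.
for comment in row["content"][:3]:
    print(comment["date_comment"], comment["res"])
```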
|
liyongsea/ptb-sss | ---
dataset_info:
features:
- name: ecg_id
dtype: int64
- name: age
dtype: int32
- name: sex
dtype: string
- name: ecg_array
dtype:
array2_d:
shape:
- 5000
- 12
dtype: float32
- name: idx
dtype: int64
splits:
- name: train
num_bytes: 2600290
num_examples: 10
download_size: 914715
dataset_size: 2600290
---
# Dataset Card for "ptb-sss"
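The `dataset_info` block above declares a fixed-shape `array2_d` feature. A minimal access sketch (assuming the repo is publicly loadable; the `(5000, 12)` float32 shape and the `train` split come from the schema, and the time-steps-by-leads reading of a 12-lead ECG is an assumption):
```python
import numpy as np
from datasets import load_dataset

ds = load_dataset("liyongsea/ptb-sss", split="train")  # 10 examples per the schema

ex = ds[0]
# Declared shape (5000, 12); presumably 5000 time steps x 12 ECG leads.
ecg = np.asarray(ex["ecg_array"], dtype=np.float32)
print(ex["ecg_id"], ex["age"], ex["sex"], ecg.shape)
```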
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
gosshh/eurosat | ---
dataset_info:
features:
- name: image
dtype: image
- name: label
dtype:
class_label:
names:
'0': AnnualCrop
'1': Forest
'2': HerbaceousVegetation
'3': Highway
'4': Industrial
'5': Pasture
'6': PermanentCrop
'7': Residential
'8': River
'9': SeaLake
splits:
- name: train
num_bytes: 88397609.0
num_examples: 27000
download_size: 91979105
dataset_size: 88397609.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
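A minimal sketch for reading an example and decoding its label (assuming the repo is publicly loadable; the ten class names come from the `class_label` mapping above):
```python
from datasets import load_dataset

ds = load_dataset("gosshh/eurosat", split="train")

ex = ds[0]
label_feature = ds.features["label"]       # ClassLabel with the 10 names above
print(label_feature.int2str(ex["label"]))  # e.g. "Forest"
print(ex["image"].size)                    # decoded as a PIL image
```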
|
ibranze/araproje_hellaswag_en_conf1 | ---
dataset_info:
features:
- name: ind
dtype: int32
- name: activity_label
dtype: string
- name: ctx_a
dtype: string
- name: ctx_b
dtype: string
- name: ctx
dtype: string
- name: endings
sequence: string
- name: source_id
dtype: string
- name: split
dtype: string
- name: split_type
dtype: string
- name: label
dtype: string
splits:
- name: validation
num_bytes: 149738.0
num_examples: 250
- name: dev
num_bytes: 5989.52
num_examples: 10
download_size: 91075
dataset_size: 155727.52
configs:
- config_name: default
data_files:
- split: validation
path: data/validation-*
- split: dev
path: data/dev-*
---
# Dataset Card for "araproje_hellaswag_en_conf1"
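A minimal access sketch based on the HellaSwag-style schema above (assuming the repo is publicly loadable; `validation` and `dev` are the declared splits):
```python
from datasets import load_dataset

ds = load_dataset("ibranze/araproje_hellaswag_en_conf1", split="validation")

ex = ds[0]
print(ex["ctx"])                      # context to be completed
for i, ending in enumerate(ex["endings"]):
    print(i, ending)                  # candidate endings
print("gold:", ex["label"])           # stored as a string per the schema
```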
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
rafacost/kasparov_whites | ---
license: llama2
---
|
FarReelAILab/Machine_Mindset_MBTI_dataset | ---
unknown: null
license: apache-2.0
---
Here are the ***behavior datasets*** used for supervised fine-tuning (SFT). They can also be used for direct preference optimization (DPO).
An exact copy can also be found on [GitHub](https://github.com/PKU-YuanGroup/Machine-Mindset/edit/main/datasets/behaviour).
The prefix ***'en'*** denotes the English version of the datasets.
The prefix ***'zh'*** denotes the Chinese version of the datasets.
## Dataset introduction
There are four dimensions in MBTI, and each dimension has two opposite attributes.
To be specific:
+ Energy (spelled 'energe' in the file names): Extraversion (E) - Introversion (I)
+ Information: Sensing (S) - Intuition (N)
+ Decision: Thinking (T) - Feeling (F)
+ Execution: Judging (J) - Perceiving (P)
Based on the above, you can infer the content of each JSON file from its name.
The datasets follow the Alpaca format: each record consists of an instruction, an input, and an output, as illustrated below.
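A hypothetical record (the keys are fixed by the Alpaca format; the values here are invented for illustration and are not taken from the files):
```python
# Hypothetical Alpaca-format record for the feeling (F) attribute.
record = {
    "instruction": "Answer the following question in the first person.",
    "input": "Do you prefer quiet evenings at home or large parties?",
    "output": "I'd rather spend a quiet evening at home with a close friend.",
}
```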
## How to use these datasets for behavior supervised fine-tuning (SFT)
For example, if you want to make an LLM behave like an ***ISFJ***, you need to select ***the four corresponding files*** (en_energe_introversion.json, en_information_sensing.json, en_decision_feeling.json, en_execution_judging.json).
Then use these four files together for SFT; a loading sketch follows below.
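A minimal sketch of assembling the four ISFJ files into one SFT set (file names exactly as listed above; the assumption that each file is a JSON array of records, and the final shuffle, are not prescribed by the card):
```python
import json
import random

# File names as given in the card (note the "energe" spelling).
isfj_files = [
    "en_energe_introversion.json",
    "en_information_sensing.json",
    "en_decision_feeling.json",
    "en_execution_judging.json",
]

sft_data = []
for path in isfj_files:
    with open(path, encoding="utf-8") as fh:
        sft_data.extend(json.load(fh))  # assumption: each file is a list of Alpaca records

random.shuffle(sft_data)  # mix the four dimensions before training
print(len(sft_data), "SFT examples")
```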
## How to use these datasets for direct preference optimization (DPO)
For example, if you want to make an LLM ***more feeling (F) than thinking (T)*** via DPO, you need to select ***the two corresponding files*** (en_decision_feeling.json, en_decision_thinking.json).
Then compile the two into the correct format for DPO, as sketched below. For the exact format, please refer to [this](https://github.com/PKU-YuanGroup/Machine-Mindset/blob/main/datasets/dpo/README.md).
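A sketch of pairing the two files into (prompt, chosen, rejected) triplets, a common DPO layout. The authoritative format is the linked README; the index-by-index alignment of the two files is an assumption:
```python
import json

def load_records(path):
    # Assumption: each file is a JSON array of Alpaca records.
    with open(path, encoding="utf-8") as fh:
        return json.load(fh)

feeling = load_records("en_decision_feeling.json")
thinking = load_records("en_decision_thinking.json")

dpo_data = [
    {
        "prompt": fe["instruction"] + ("\n" + fe["input"] if fe["input"] else ""),
        "chosen": fe["output"],    # feeling (F) response is preferred
        "rejected": th["output"],  # thinking (T) response is dispreferred
    }
    for fe, th in zip(feeling, thinking)  # assumption: same prompts, same order
]
print(len(dpo_data), "preference pairs")
```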
|
DAMO-NLP-SG/MultiJail | ---
license: mit
task_categories:
- conversational
language:
- en
- zh
- it
- vi
- ar
- ko
- th
- bn
- sw
- jv
size_categories:
- n<1K
---
# Multilingual Jailbreak Challenges in Large Language Models
This repo contains the data for our paper ["Multilingual Jailbreak Challenges in Large Language Models"](https://arxiv.org/abs/2310.06474).
[[Github repo]](https://github.com/DAMO-NLP-SG/multilingual-safety-for-LLMs/)
## Annotation Statistics
We collected a total of 315 English unsafe prompts and annotated them into nine non-English languages. The languages were categorized based on resource availability, as shown below:
**High-resource languages:** Chinese (zh), Italian (it), Vietnamese (vi)
**Medium-resource languages:** Arabic (ar), Korean (ko), Thai (th)
**Low-resource languages:** Bengali (bn), Swahili (sw), Javanese (jv)
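A minimal loading sketch (assuming the default configuration; this card does not document split or column names, so inspect them before use):
```python
from datasets import load_dataset

ds = load_dataset("DAMO-NLP-SG/MultiJail")
print(ds)                     # available splits
split = next(iter(ds.values()))
print(split.column_names)     # available columns
print(split[0])               # first example
```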
## Ethics Statement
Our research investigates the safety challenges of LLMs in multilingual settings. We are aware of the potential misuse of our findings and emphasize that our research is solely for academic purposes and ethical use. Misuse or harm resulting from the information in this paper is strongly discouraged. To address the identified risks and vulnerabilities, we commit to open-sourcing the data used in our study. This openness aims to facilitate vulnerability identification, encourage discussions, and foster collaborative efforts to enhance LLM safety in multilingual contexts. Furthermore, we have developed the SELF-DEFENSE framework to address multilingual jailbreak challenges in LLMs. This framework automatically generates multilingual safety training data to mitigate risks associated with unintentional and intentional jailbreak scenarios. Overall, our work not only highlights multilingual jailbreak challenges in LLMs but also paves the way for future research, collaboration, and innovation to enhance their safety.
## Citation
```
@misc{deng2023multilingual,
title={Multilingual Jailbreak Challenges in Large Language Models},
author={Yue Deng and Wenxuan Zhang and Sinno Jialin Pan and Lidong Bing},
year={2023},
eprint={2310.06474},
archivePrefix={arXiv},
primaryClass={cs.CL}
}
``` |
open-llm-leaderboard/details_sail__Sailor-4B | ---
pretty_name: Evaluation run of sail/Sailor-4B
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [sail/Sailor-4B](https://huggingface.co/sail/Sailor-4B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\ \nThe dataset is composed of 63 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 2 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\ \nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_sail__Sailor-4B\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-03-03T06:29:07.816855](https://huggingface.co/datasets/open-llm-leaderboard/details_sail__Sailor-4B/blob/main/results_2024-03-03T06-29-07.816855.json) (note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.37703264298169736,\n\
\ \"acc_stderr\": 0.03416862166048836,\n \"acc_norm\": 0.38101337565531157,\n\
\ \"acc_norm_stderr\": 0.034964297422117964,\n \"mc1\": 0.23133414932680538,\n\
\ \"mc1_stderr\": 0.01476194517486267,\n \"mc2\": 0.37017660840801425,\n\
\ \"mc2_stderr\": 0.013722897185973262\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.4061433447098976,\n \"acc_stderr\": 0.014351656690097858,\n\
\ \"acc_norm\": 0.43856655290102387,\n \"acc_norm_stderr\": 0.014500682618212864\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.5020912168890659,\n\
\ \"acc_stderr\": 0.004989737768749948,\n \"acc_norm\": 0.6950806612228639,\n\
\ \"acc_norm_stderr\": 0.004594323838650353\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.26,\n \"acc_stderr\": 0.04408440022768081,\n \
\ \"acc_norm\": 0.26,\n \"acc_norm_stderr\": 0.04408440022768081\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.4222222222222222,\n\
\ \"acc_stderr\": 0.04266763404099582,\n \"acc_norm\": 0.4222222222222222,\n\
\ \"acc_norm_stderr\": 0.04266763404099582\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.375,\n \"acc_stderr\": 0.039397364351956274,\n \
\ \"acc_norm\": 0.375,\n \"acc_norm_stderr\": 0.039397364351956274\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.36,\n\
\ \"acc_stderr\": 0.048241815132442176,\n \"acc_norm\": 0.36,\n \
\ \"acc_norm_stderr\": 0.048241815132442176\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.3660377358490566,\n \"acc_stderr\": 0.02964781353936525,\n\
\ \"acc_norm\": 0.3660377358490566,\n \"acc_norm_stderr\": 0.02964781353936525\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.3819444444444444,\n\
\ \"acc_stderr\": 0.040629907841466674,\n \"acc_norm\": 0.3819444444444444,\n\
\ \"acc_norm_stderr\": 0.040629907841466674\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \
\ \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.32,\n \"acc_stderr\": 0.046882617226215034,\n \"acc_norm\": 0.32,\n\
\ \"acc_norm_stderr\": 0.046882617226215034\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \
\ \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.2947976878612717,\n\
\ \"acc_stderr\": 0.03476599607516477,\n \"acc_norm\": 0.2947976878612717,\n\
\ \"acc_norm_stderr\": 0.03476599607516477\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.2549019607843137,\n \"acc_stderr\": 0.043364327079931785,\n\
\ \"acc_norm\": 0.2549019607843137,\n \"acc_norm_stderr\": 0.043364327079931785\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.4,\n \"acc_stderr\": 0.04923659639173309,\n \"acc_norm\": 0.4,\n\
\ \"acc_norm_stderr\": 0.04923659639173309\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.4,\n \"acc_stderr\": 0.03202563076101737,\n \
\ \"acc_norm\": 0.4,\n \"acc_norm_stderr\": 0.03202563076101737\n },\n\
\ \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.2631578947368421,\n\
\ \"acc_stderr\": 0.0414243971948936,\n \"acc_norm\": 0.2631578947368421,\n\
\ \"acc_norm_stderr\": 0.0414243971948936\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.3586206896551724,\n \"acc_stderr\": 0.039966295748767186,\n\
\ \"acc_norm\": 0.3586206896551724,\n \"acc_norm_stderr\": 0.039966295748767186\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.3148148148148148,\n \"acc_stderr\": 0.023919984164047736,\n \"\
acc_norm\": 0.3148148148148148,\n \"acc_norm_stderr\": 0.023919984164047736\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.25396825396825395,\n\
\ \"acc_stderr\": 0.03893259610604674,\n \"acc_norm\": 0.25396825396825395,\n\
\ \"acc_norm_stderr\": 0.03893259610604674\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.33,\n \"acc_stderr\": 0.047258156262526045,\n \
\ \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.047258156262526045\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\"\
: 0.38387096774193546,\n \"acc_stderr\": 0.027666182075539645,\n \"\
acc_norm\": 0.38387096774193546,\n \"acc_norm_stderr\": 0.027666182075539645\n\
\ },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\"\
: 0.2413793103448276,\n \"acc_stderr\": 0.030108330718011625,\n \"\
acc_norm\": 0.2413793103448276,\n \"acc_norm_stderr\": 0.030108330718011625\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.38,\n \"acc_stderr\": 0.04878317312145632,\n \"acc_norm\"\
: 0.38,\n \"acc_norm_stderr\": 0.04878317312145632\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.3696969696969697,\n \"acc_stderr\": 0.03769430314512568,\n\
\ \"acc_norm\": 0.3696969696969697,\n \"acc_norm_stderr\": 0.03769430314512568\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.43434343434343436,\n \"acc_stderr\": 0.03531505879359182,\n \"\
acc_norm\": 0.43434343434343436,\n \"acc_norm_stderr\": 0.03531505879359182\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.43005181347150256,\n \"acc_stderr\": 0.035729543331448066,\n\
\ \"acc_norm\": 0.43005181347150256,\n \"acc_norm_stderr\": 0.035729543331448066\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.33589743589743587,\n \"acc_stderr\": 0.023946724741563976,\n\
\ \"acc_norm\": 0.33589743589743587,\n \"acc_norm_stderr\": 0.023946724741563976\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.25925925925925924,\n \"acc_stderr\": 0.026719240783712177,\n \
\ \"acc_norm\": 0.25925925925925924,\n \"acc_norm_stderr\": 0.026719240783712177\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.33613445378151263,\n \"acc_stderr\": 0.03068473711513536,\n\
\ \"acc_norm\": 0.33613445378151263,\n \"acc_norm_stderr\": 0.03068473711513536\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.32450331125827814,\n \"acc_stderr\": 0.038227469376587525,\n \"\
acc_norm\": 0.32450331125827814,\n \"acc_norm_stderr\": 0.038227469376587525\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.43669724770642204,\n \"acc_stderr\": 0.021264820158714212,\n \"\
acc_norm\": 0.43669724770642204,\n \"acc_norm_stderr\": 0.021264820158714212\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.3611111111111111,\n \"acc_stderr\": 0.03275773486100999,\n \"\
acc_norm\": 0.3611111111111111,\n \"acc_norm_stderr\": 0.03275773486100999\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.4068627450980392,\n \"acc_stderr\": 0.03447891136353382,\n \"\
acc_norm\": 0.4068627450980392,\n \"acc_norm_stderr\": 0.03447891136353382\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.4092827004219409,\n \"acc_stderr\": 0.032007041833595914,\n \
\ \"acc_norm\": 0.4092827004219409,\n \"acc_norm_stderr\": 0.032007041833595914\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.47533632286995514,\n\
\ \"acc_stderr\": 0.03351695167652628,\n \"acc_norm\": 0.47533632286995514,\n\
\ \"acc_norm_stderr\": 0.03351695167652628\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.37404580152671757,\n \"acc_stderr\": 0.04243869242230524,\n\
\ \"acc_norm\": 0.37404580152671757,\n \"acc_norm_stderr\": 0.04243869242230524\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.4793388429752066,\n \"acc_stderr\": 0.04560456086387235,\n \"\
acc_norm\": 0.4793388429752066,\n \"acc_norm_stderr\": 0.04560456086387235\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.4722222222222222,\n\
\ \"acc_stderr\": 0.04826217294139894,\n \"acc_norm\": 0.4722222222222222,\n\
\ \"acc_norm_stderr\": 0.04826217294139894\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.3496932515337423,\n \"acc_stderr\": 0.03746668325470021,\n\
\ \"acc_norm\": 0.3496932515337423,\n \"acc_norm_stderr\": 0.03746668325470021\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.35714285714285715,\n\
\ \"acc_stderr\": 0.04547960999764376,\n \"acc_norm\": 0.35714285714285715,\n\
\ \"acc_norm_stderr\": 0.04547960999764376\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.3786407766990291,\n \"acc_stderr\": 0.04802694698258974,\n\
\ \"acc_norm\": 0.3786407766990291,\n \"acc_norm_stderr\": 0.04802694698258974\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.5170940170940171,\n\
\ \"acc_stderr\": 0.032736940493481824,\n \"acc_norm\": 0.5170940170940171,\n\
\ \"acc_norm_stderr\": 0.032736940493481824\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.42,\n \"acc_stderr\": 0.04960449637488584,\n \
\ \"acc_norm\": 0.42,\n \"acc_norm_stderr\": 0.04960449637488584\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.4661558109833972,\n\
\ \"acc_stderr\": 0.017838956009136805,\n \"acc_norm\": 0.4661558109833972,\n\
\ \"acc_norm_stderr\": 0.017838956009136805\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.37283236994219654,\n \"acc_stderr\": 0.026033890613576277,\n\
\ \"acc_norm\": 0.37283236994219654,\n \"acc_norm_stderr\": 0.026033890613576277\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.2737430167597765,\n\
\ \"acc_stderr\": 0.014912413096372428,\n \"acc_norm\": 0.2737430167597765,\n\
\ \"acc_norm_stderr\": 0.014912413096372428\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.4150326797385621,\n \"acc_stderr\": 0.028213504177824093,\n\
\ \"acc_norm\": 0.4150326797385621,\n \"acc_norm_stderr\": 0.028213504177824093\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.4694533762057878,\n\
\ \"acc_stderr\": 0.02834504586484068,\n \"acc_norm\": 0.4694533762057878,\n\
\ \"acc_norm_stderr\": 0.02834504586484068\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.404320987654321,\n \"acc_stderr\": 0.02730662529732768,\n\
\ \"acc_norm\": 0.404320987654321,\n \"acc_norm_stderr\": 0.02730662529732768\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.3723404255319149,\n \"acc_stderr\": 0.028838921471251458,\n \
\ \"acc_norm\": 0.3723404255319149,\n \"acc_norm_stderr\": 0.028838921471251458\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.28748370273794005,\n\
\ \"acc_stderr\": 0.011559337355708512,\n \"acc_norm\": 0.28748370273794005,\n\
\ \"acc_norm_stderr\": 0.011559337355708512\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.3161764705882353,\n \"acc_stderr\": 0.02824568739146292,\n\
\ \"acc_norm\": 0.3161764705882353,\n \"acc_norm_stderr\": 0.02824568739146292\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.35947712418300654,\n \"acc_stderr\": 0.01941253924203216,\n \
\ \"acc_norm\": 0.35947712418300654,\n \"acc_norm_stderr\": 0.01941253924203216\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.41818181818181815,\n\
\ \"acc_stderr\": 0.0472457740573157,\n \"acc_norm\": 0.41818181818181815,\n\
\ \"acc_norm_stderr\": 0.0472457740573157\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.46122448979591835,\n \"acc_stderr\": 0.03191282052669278,\n\
\ \"acc_norm\": 0.46122448979591835,\n \"acc_norm_stderr\": 0.03191282052669278\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.5024875621890548,\n\
\ \"acc_stderr\": 0.03535490150137289,\n \"acc_norm\": 0.5024875621890548,\n\
\ \"acc_norm_stderr\": 0.03535490150137289\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.53,\n \"acc_stderr\": 0.05016135580465919,\n \
\ \"acc_norm\": 0.53,\n \"acc_norm_stderr\": 0.05016135580465919\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.3313253012048193,\n\
\ \"acc_stderr\": 0.03664314777288085,\n \"acc_norm\": 0.3313253012048193,\n\
\ \"acc_norm_stderr\": 0.03664314777288085\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.4678362573099415,\n \"acc_stderr\": 0.038268824176603676,\n\
\ \"acc_norm\": 0.4678362573099415,\n \"acc_norm_stderr\": 0.038268824176603676\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.23133414932680538,\n\
\ \"mc1_stderr\": 0.01476194517486267,\n \"mc2\": 0.37017660840801425,\n\
\ \"mc2_stderr\": 0.013722897185973262\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.6566692975532754,\n \"acc_stderr\": 0.013344823185358004\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.08794541319181198,\n \
\ \"acc_stderr\": 0.007801162197487713\n }\n}\n```"
repo_url: https://huggingface.co/sail/Sailor-4B
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_03_02T22_38_33.484246
path:
- '**/details_harness|arc:challenge|25_2024-03-02T22-38-33.484246.parquet'
- split: 2024_03_03T06_29_07.816855
path:
- '**/details_harness|arc:challenge|25_2024-03-03T06-29-07.816855.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-03-03T06-29-07.816855.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_03_02T22_38_33.484246
path:
- '**/details_harness|gsm8k|5_2024-03-02T22-38-33.484246.parquet'
- split: 2024_03_03T06_29_07.816855
path:
- '**/details_harness|gsm8k|5_2024-03-03T06-29-07.816855.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-03-03T06-29-07.816855.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_03_02T22_38_33.484246
path:
- '**/details_harness|hellaswag|10_2024-03-02T22-38-33.484246.parquet'
- split: 2024_03_03T06_29_07.816855
path:
- '**/details_harness|hellaswag|10_2024-03-03T06-29-07.816855.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-03-03T06-29-07.816855.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_03_02T22_38_33.484246
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-02T22-38-33.484246.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-02T22-38-33.484246.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-02T22-38-33.484246.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-02T22-38-33.484246.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-02T22-38-33.484246.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-02T22-38-33.484246.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-02T22-38-33.484246.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-02T22-38-33.484246.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-02T22-38-33.484246.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-02T22-38-33.484246.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-02T22-38-33.484246.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-02T22-38-33.484246.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-02T22-38-33.484246.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-02T22-38-33.484246.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-02T22-38-33.484246.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-02T22-38-33.484246.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-02T22-38-33.484246.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-02T22-38-33.484246.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-02T22-38-33.484246.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-02T22-38-33.484246.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-02T22-38-33.484246.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-02T22-38-33.484246.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-02T22-38-33.484246.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-02T22-38-33.484246.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-02T22-38-33.484246.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-02T22-38-33.484246.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-02T22-38-33.484246.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-02T22-38-33.484246.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-02T22-38-33.484246.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-02T22-38-33.484246.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-02T22-38-33.484246.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-02T22-38-33.484246.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-02T22-38-33.484246.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-02T22-38-33.484246.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-02T22-38-33.484246.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-02T22-38-33.484246.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-02T22-38-33.484246.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-02T22-38-33.484246.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-02T22-38-33.484246.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-02T22-38-33.484246.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-02T22-38-33.484246.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-02T22-38-33.484246.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-02T22-38-33.484246.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-02T22-38-33.484246.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-02T22-38-33.484246.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-02T22-38-33.484246.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-02T22-38-33.484246.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-02T22-38-33.484246.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-02T22-38-33.484246.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-02T22-38-33.484246.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-02T22-38-33.484246.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-02T22-38-33.484246.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-02T22-38-33.484246.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-02T22-38-33.484246.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-02T22-38-33.484246.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-02T22-38-33.484246.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-02T22-38-33.484246.parquet'
- split: 2024_03_03T06_29_07.816855
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-03T06-29-07.816855.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-03T06-29-07.816855.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-03T06-29-07.816855.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-03T06-29-07.816855.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-03T06-29-07.816855.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-03T06-29-07.816855.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-03T06-29-07.816855.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-03T06-29-07.816855.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-03T06-29-07.816855.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-03T06-29-07.816855.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-03T06-29-07.816855.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-03T06-29-07.816855.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-03T06-29-07.816855.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-03T06-29-07.816855.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-03T06-29-07.816855.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-03T06-29-07.816855.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-03T06-29-07.816855.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-03T06-29-07.816855.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-03T06-29-07.816855.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-03T06-29-07.816855.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-03T06-29-07.816855.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-03T06-29-07.816855.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-03T06-29-07.816855.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-03T06-29-07.816855.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-03T06-29-07.816855.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-03T06-29-07.816855.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-03T06-29-07.816855.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-03T06-29-07.816855.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-03T06-29-07.816855.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-03T06-29-07.816855.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-03T06-29-07.816855.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-03T06-29-07.816855.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-03T06-29-07.816855.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-03T06-29-07.816855.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-03T06-29-07.816855.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-03T06-29-07.816855.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-03T06-29-07.816855.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-03T06-29-07.816855.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-03T06-29-07.816855.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-03T06-29-07.816855.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-03T06-29-07.816855.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-03T06-29-07.816855.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-03T06-29-07.816855.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-03T06-29-07.816855.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-03T06-29-07.816855.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-03T06-29-07.816855.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-03T06-29-07.816855.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-03T06-29-07.816855.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-03T06-29-07.816855.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-03T06-29-07.816855.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-03T06-29-07.816855.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-03T06-29-07.816855.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-03T06-29-07.816855.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-03T06-29-07.816855.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-03T06-29-07.816855.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-03T06-29-07.816855.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-03T06-29-07.816855.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-03T06-29-07.816855.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-03T06-29-07.816855.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-03T06-29-07.816855.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-03T06-29-07.816855.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-03T06-29-07.816855.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-03T06-29-07.816855.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-03T06-29-07.816855.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-03T06-29-07.816855.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-03T06-29-07.816855.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-03T06-29-07.816855.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-03T06-29-07.816855.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-03T06-29-07.816855.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-03T06-29-07.816855.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-03T06-29-07.816855.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-03T06-29-07.816855.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-03T06-29-07.816855.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-03T06-29-07.816855.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-03T06-29-07.816855.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-03T06-29-07.816855.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-03T06-29-07.816855.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-03T06-29-07.816855.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-03T06-29-07.816855.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-03T06-29-07.816855.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-03T06-29-07.816855.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-03T06-29-07.816855.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-03T06-29-07.816855.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-03T06-29-07.816855.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-03T06-29-07.816855.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-03T06-29-07.816855.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-03T06-29-07.816855.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-03T06-29-07.816855.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-03T06-29-07.816855.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-03T06-29-07.816855.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-03T06-29-07.816855.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-03T06-29-07.816855.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-03T06-29-07.816855.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-03T06-29-07.816855.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-03T06-29-07.816855.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-03T06-29-07.816855.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-03T06-29-07.816855.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-03T06-29-07.816855.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-03T06-29-07.816855.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-03T06-29-07.816855.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-03T06-29-07.816855.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-03T06-29-07.816855.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-03T06-29-07.816855.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-03T06-29-07.816855.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-03T06-29-07.816855.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-03T06-29-07.816855.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-03T06-29-07.816855.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-03T06-29-07.816855.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-03T06-29-07.816855.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-03T06-29-07.816855.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-03T06-29-07.816855.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-03T06-29-07.816855.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-03T06-29-07.816855.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-03T06-29-07.816855.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_03_02T22_38_33.484246
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-02T22-38-33.484246.parquet'
- split: 2024_03_03T06_29_07.816855
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-03T06-29-07.816855.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-03T06-29-07.816855.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_03_02T22_38_33.484246
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-02T22-38-33.484246.parquet'
- split: 2024_03_03T06_29_07.816855
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-03T06-29-07.816855.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-03T06-29-07.816855.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_03_02T22_38_33.484246
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-02T22-38-33.484246.parquet'
- split: 2024_03_03T06_29_07.816855
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-03T06-29-07.816855.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-03T06-29-07.816855.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_03_02T22_38_33.484246
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-02T22-38-33.484246.parquet'
- split: 2024_03_03T06_29_07.816855
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-03T06-29-07.816855.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-03T06-29-07.816855.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_03_02T22_38_33.484246
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-02T22-38-33.484246.parquet'
- split: 2024_03_03T06_29_07.816855
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-03T06-29-07.816855.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-03T06-29-07.816855.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_03_02T22_38_33.484246
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-02T22-38-33.484246.parquet'
- split: 2024_03_03T06_29_07.816855
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-03T06-29-07.816855.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-03T06-29-07.816855.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_03_02T22_38_33.484246
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-02T22-38-33.484246.parquet'
- split: 2024_03_03T06_29_07.816855
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-03T06-29-07.816855.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-03T06-29-07.816855.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_03_02T22_38_33.484246
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-02T22-38-33.484246.parquet'
- split: 2024_03_03T06_29_07.816855
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-03T06-29-07.816855.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-03T06-29-07.816855.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_03_02T22_38_33.484246
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-02T22-38-33.484246.parquet'
- split: 2024_03_03T06_29_07.816855
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-03T06-29-07.816855.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-03T06-29-07.816855.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_03_02T22_38_33.484246
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-02T22-38-33.484246.parquet'
- split: 2024_03_03T06_29_07.816855
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-03T06-29-07.816855.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-03T06-29-07.816855.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_03_02T22_38_33.484246
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-02T22-38-33.484246.parquet'
- split: 2024_03_03T06_29_07.816855
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-03T06-29-07.816855.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-03T06-29-07.816855.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_03_02T22_38_33.484246
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-02T22-38-33.484246.parquet'
- split: 2024_03_03T06_29_07.816855
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-03T06-29-07.816855.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-03T06-29-07.816855.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_03_02T22_38_33.484246
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-02T22-38-33.484246.parquet'
- split: 2024_03_03T06_29_07.816855
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-03T06-29-07.816855.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-03T06-29-07.816855.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_03_02T22_38_33.484246
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-02T22-38-33.484246.parquet'
- split: 2024_03_03T06_29_07.816855
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-03T06-29-07.816855.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-03T06-29-07.816855.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_03_02T22_38_33.484246
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-02T22-38-33.484246.parquet'
- split: 2024_03_03T06_29_07.816855
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-03T06-29-07.816855.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-03T06-29-07.816855.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_03_02T22_38_33.484246
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-02T22-38-33.484246.parquet'
- split: 2024_03_03T06_29_07.816855
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-03T06-29-07.816855.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-03T06-29-07.816855.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_03_02T22_38_33.484246
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-02T22-38-33.484246.parquet'
- split: 2024_03_03T06_29_07.816855
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-03T06-29-07.816855.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-03T06-29-07.816855.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_03_02T22_38_33.484246
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-02T22-38-33.484246.parquet'
- split: 2024_03_03T06_29_07.816855
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-03T06-29-07.816855.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-03T06-29-07.816855.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_03_02T22_38_33.484246
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-02T22-38-33.484246.parquet'
- split: 2024_03_03T06_29_07.816855
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-03T06-29-07.816855.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-03T06-29-07.816855.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_03_02T22_38_33.484246
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-02T22-38-33.484246.parquet'
- split: 2024_03_03T06_29_07.816855
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-03T06-29-07.816855.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-03T06-29-07.816855.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_03_02T22_38_33.484246
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-02T22-38-33.484246.parquet'
- split: 2024_03_03T06_29_07.816855
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-03T06-29-07.816855.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-03T06-29-07.816855.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_03_02T22_38_33.484246
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-02T22-38-33.484246.parquet'
- split: 2024_03_03T06_29_07.816855
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-03T06-29-07.816855.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-03T06-29-07.816855.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_03_02T22_38_33.484246
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-02T22-38-33.484246.parquet'
- split: 2024_03_03T06_29_07.816855
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-03T06-29-07.816855.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-03T06-29-07.816855.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_03_02T22_38_33.484246
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-02T22-38-33.484246.parquet'
- split: 2024_03_03T06_29_07.816855
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-03T06-29-07.816855.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-03T06-29-07.816855.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_03_02T22_38_33.484246
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-02T22-38-33.484246.parquet'
- split: 2024_03_03T06_29_07.816855
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-03T06-29-07.816855.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-03T06-29-07.816855.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_03_02T22_38_33.484246
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-02T22-38-33.484246.parquet'
- split: 2024_03_03T06_29_07.816855
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-03T06-29-07.816855.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-03T06-29-07.816855.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_03_02T22_38_33.484246
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-02T22-38-33.484246.parquet'
- split: 2024_03_03T06_29_07.816855
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-03T06-29-07.816855.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-03T06-29-07.816855.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_03_02T22_38_33.484246
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-02T22-38-33.484246.parquet'
- split: 2024_03_03T06_29_07.816855
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-03T06-29-07.816855.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-03T06-29-07.816855.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_03_02T22_38_33.484246
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-02T22-38-33.484246.parquet'
- split: 2024_03_03T06_29_07.816855
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-03T06-29-07.816855.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-03T06-29-07.816855.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_03_02T22_38_33.484246
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-02T22-38-33.484246.parquet'
- split: 2024_03_03T06_29_07.816855
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-03T06-29-07.816855.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-03T06-29-07.816855.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_03_02T22_38_33.484246
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-02T22-38-33.484246.parquet'
- split: 2024_03_03T06_29_07.816855
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-03T06-29-07.816855.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-03T06-29-07.816855.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_03_02T22_38_33.484246
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-02T22-38-33.484246.parquet'
- split: 2024_03_03T06_29_07.816855
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-03T06-29-07.816855.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-03T06-29-07.816855.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_03_02T22_38_33.484246
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-02T22-38-33.484246.parquet'
- split: 2024_03_03T06_29_07.816855
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-03T06-29-07.816855.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-03T06-29-07.816855.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_03_02T22_38_33.484246
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-02T22-38-33.484246.parquet'
- split: 2024_03_03T06_29_07.816855
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-03T06-29-07.816855.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-03T06-29-07.816855.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_03_02T22_38_33.484246
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-02T22-38-33.484246.parquet'
- split: 2024_03_03T06_29_07.816855
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-03T06-29-07.816855.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-03T06-29-07.816855.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_03_02T22_38_33.484246
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-02T22-38-33.484246.parquet'
- split: 2024_03_03T06_29_07.816855
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-03T06-29-07.816855.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-03T06-29-07.816855.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_03_02T22_38_33.484246
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-02T22-38-33.484246.parquet'
- split: 2024_03_03T06_29_07.816855
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-03T06-29-07.816855.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-03T06-29-07.816855.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_03_02T22_38_33.484246
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-02T22-38-33.484246.parquet'
- split: 2024_03_03T06_29_07.816855
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-03T06-29-07.816855.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-03T06-29-07.816855.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_03_02T22_38_33.484246
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-02T22-38-33.484246.parquet'
- split: 2024_03_03T06_29_07.816855
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-03T06-29-07.816855.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-03T06-29-07.816855.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_03_02T22_38_33.484246
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-02T22-38-33.484246.parquet'
- split: 2024_03_03T06_29_07.816855
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-03T06-29-07.816855.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-03T06-29-07.816855.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_03_02T22_38_33.484246
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-02T22-38-33.484246.parquet'
- split: 2024_03_03T06_29_07.816855
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-03T06-29-07.816855.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-03T06-29-07.816855.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_03_02T22_38_33.484246
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-02T22-38-33.484246.parquet'
- split: 2024_03_03T06_29_07.816855
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-03T06-29-07.816855.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-03T06-29-07.816855.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_03_02T22_38_33.484246
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-02T22-38-33.484246.parquet'
- split: 2024_03_03T06_29_07.816855
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-03T06-29-07.816855.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-03T06-29-07.816855.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_03_02T22_38_33.484246
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-02T22-38-33.484246.parquet'
- split: 2024_03_03T06_29_07.816855
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-03T06-29-07.816855.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-03T06-29-07.816855.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_03_02T22_38_33.484246
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-02T22-38-33.484246.parquet'
- split: 2024_03_03T06_29_07.816855
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-03T06-29-07.816855.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-03T06-29-07.816855.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_03_02T22_38_33.484246
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-02T22-38-33.484246.parquet'
- split: 2024_03_03T06_29_07.816855
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-03T06-29-07.816855.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-03T06-29-07.816855.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_03_02T22_38_33.484246
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-02T22-38-33.484246.parquet'
- split: 2024_03_03T06_29_07.816855
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-03T06-29-07.816855.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-03T06-29-07.816855.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_03_02T22_38_33.484246
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-02T22-38-33.484246.parquet'
- split: 2024_03_03T06_29_07.816855
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-03T06-29-07.816855.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-03T06-29-07.816855.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_03_02T22_38_33.484246
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-02T22-38-33.484246.parquet'
- split: 2024_03_03T06_29_07.816855
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-03T06-29-07.816855.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-03T06-29-07.816855.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_03_02T22_38_33.484246
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-02T22-38-33.484246.parquet'
- split: 2024_03_03T06_29_07.816855
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-03T06-29-07.816855.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-03T06-29-07.816855.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_03_02T22_38_33.484246
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-02T22-38-33.484246.parquet'
- split: 2024_03_03T06_29_07.816855
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-03T06-29-07.816855.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-03T06-29-07.816855.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_03_02T22_38_33.484246
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-02T22-38-33.484246.parquet'
- split: 2024_03_03T06_29_07.816855
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-03T06-29-07.816855.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-03T06-29-07.816855.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_03_02T22_38_33.484246
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-02T22-38-33.484246.parquet'
- split: 2024_03_03T06_29_07.816855
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-03T06-29-07.816855.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-03T06-29-07.816855.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_03_02T22_38_33.484246
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-02T22-38-33.484246.parquet'
- split: 2024_03_03T06_29_07.816855
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-03T06-29-07.816855.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-03T06-29-07.816855.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_03_02T22_38_33.484246
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-02T22-38-33.484246.parquet'
- split: 2024_03_03T06_29_07.816855
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-03T06-29-07.816855.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-03T06-29-07.816855.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_03_02T22_38_33.484246
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-02T22-38-33.484246.parquet'
- split: 2024_03_03T06_29_07.816855
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-03T06-29-07.816855.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-03T06-29-07.816855.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_03_02T22_38_33.484246
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-02T22-38-33.484246.parquet'
- split: 2024_03_03T06_29_07.816855
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-03T06-29-07.816855.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-03T06-29-07.816855.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_03_02T22_38_33.484246
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-02T22-38-33.484246.parquet'
- split: 2024_03_03T06_29_07.816855
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-03T06-29-07.816855.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-03T06-29-07.816855.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_03_02T22_38_33.484246
path:
- '**/details_harness|winogrande|5_2024-03-02T22-38-33.484246.parquet'
- split: 2024_03_03T06_29_07.816855
path:
- '**/details_harness|winogrande|5_2024-03-03T06-29-07.816855.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-03-03T06-29-07.816855.parquet'
- config_name: results
data_files:
- split: 2024_03_02T22_38_33.484246
path:
- results_2024-03-02T22-38-33.484246.parquet
- split: 2024_03_03T06_29_07.816855
path:
- results_2024-03-03T06-29-07.816855.parquet
- split: latest
path:
- results_2024-03-03T06-29-07.816855.parquet
---
# Dataset Card for Evaluation run of sail/Sailor-4B
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [sail/Sailor-4B](https://huggingface.co/sail/Sailor-4B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can, for instance, do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_sail__Sailor-4B",
"harness_winogrande_5",
split="train")
```
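To point at the aggregated scores instead of per-task details, you can load the `results` configuration with its `latest` split (both defined in the YAML above); a minimal sketch along the same lines:
```python
from datasets import load_dataset

# "results" aggregates all benchmark scores for this model;
# the "latest" split always points to the most recent run.
results = load_dataset("open-llm-leaderboard/details_sail__Sailor-4B",
    "results",
    split="latest")
```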
## Latest results
These are the [latest results from run 2024-03-03T06:29:07.816855](https://huggingface.co/datasets/open-llm-leaderboard/details_sail__Sailor-4B/blob/main/results_2024-03-03T06-29-07.816855.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.37703264298169736,
"acc_stderr": 0.03416862166048836,
"acc_norm": 0.38101337565531157,
"acc_norm_stderr": 0.034964297422117964,
"mc1": 0.23133414932680538,
"mc1_stderr": 0.01476194517486267,
"mc2": 0.37017660840801425,
"mc2_stderr": 0.013722897185973262
},
"harness|arc:challenge|25": {
"acc": 0.4061433447098976,
"acc_stderr": 0.014351656690097858,
"acc_norm": 0.43856655290102387,
"acc_norm_stderr": 0.014500682618212864
},
"harness|hellaswag|10": {
"acc": 0.5020912168890659,
"acc_stderr": 0.004989737768749948,
"acc_norm": 0.6950806612228639,
"acc_norm_stderr": 0.004594323838650353
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.26,
"acc_stderr": 0.04408440022768081,
"acc_norm": 0.26,
"acc_norm_stderr": 0.04408440022768081
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.4222222222222222,
"acc_stderr": 0.04266763404099582,
"acc_norm": 0.4222222222222222,
"acc_norm_stderr": 0.04266763404099582
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.375,
"acc_stderr": 0.039397364351956274,
"acc_norm": 0.375,
"acc_norm_stderr": 0.039397364351956274
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.36,
"acc_stderr": 0.048241815132442176,
"acc_norm": 0.36,
"acc_norm_stderr": 0.048241815132442176
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.3660377358490566,
"acc_stderr": 0.02964781353936525,
"acc_norm": 0.3660377358490566,
"acc_norm_stderr": 0.02964781353936525
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.3819444444444444,
"acc_stderr": 0.040629907841466674,
"acc_norm": 0.3819444444444444,
"acc_norm_stderr": 0.040629907841466674
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.32,
"acc_stderr": 0.046882617226215034,
"acc_norm": 0.32,
"acc_norm_stderr": 0.046882617226215034
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.2947976878612717,
"acc_stderr": 0.03476599607516477,
"acc_norm": 0.2947976878612717,
"acc_norm_stderr": 0.03476599607516477
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.2549019607843137,
"acc_stderr": 0.043364327079931785,
"acc_norm": 0.2549019607843137,
"acc_norm_stderr": 0.043364327079931785
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.4,
"acc_stderr": 0.04923659639173309,
"acc_norm": 0.4,
"acc_norm_stderr": 0.04923659639173309
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.4,
"acc_stderr": 0.03202563076101737,
"acc_norm": 0.4,
"acc_norm_stderr": 0.03202563076101737
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.2631578947368421,
"acc_stderr": 0.0414243971948936,
"acc_norm": 0.2631578947368421,
"acc_norm_stderr": 0.0414243971948936
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.3586206896551724,
"acc_stderr": 0.039966295748767186,
"acc_norm": 0.3586206896551724,
"acc_norm_stderr": 0.039966295748767186
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.3148148148148148,
"acc_stderr": 0.023919984164047736,
"acc_norm": 0.3148148148148148,
"acc_norm_stderr": 0.023919984164047736
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.25396825396825395,
"acc_stderr": 0.03893259610604674,
"acc_norm": 0.25396825396825395,
"acc_norm_stderr": 0.03893259610604674
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.33,
"acc_stderr": 0.047258156262526045,
"acc_norm": 0.33,
"acc_norm_stderr": 0.047258156262526045
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.38387096774193546,
"acc_stderr": 0.027666182075539645,
"acc_norm": 0.38387096774193546,
"acc_norm_stderr": 0.027666182075539645
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.2413793103448276,
"acc_stderr": 0.030108330718011625,
"acc_norm": 0.2413793103448276,
"acc_norm_stderr": 0.030108330718011625
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.38,
"acc_stderr": 0.04878317312145632,
"acc_norm": 0.38,
"acc_norm_stderr": 0.04878317312145632
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.3696969696969697,
"acc_stderr": 0.03769430314512568,
"acc_norm": 0.3696969696969697,
"acc_norm_stderr": 0.03769430314512568
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.43434343434343436,
"acc_stderr": 0.03531505879359182,
"acc_norm": 0.43434343434343436,
"acc_norm_stderr": 0.03531505879359182
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.43005181347150256,
"acc_stderr": 0.035729543331448066,
"acc_norm": 0.43005181347150256,
"acc_norm_stderr": 0.035729543331448066
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.33589743589743587,
"acc_stderr": 0.023946724741563976,
"acc_norm": 0.33589743589743587,
"acc_norm_stderr": 0.023946724741563976
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.25925925925925924,
"acc_stderr": 0.026719240783712177,
"acc_norm": 0.25925925925925924,
"acc_norm_stderr": 0.026719240783712177
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.33613445378151263,
"acc_stderr": 0.03068473711513536,
"acc_norm": 0.33613445378151263,
"acc_norm_stderr": 0.03068473711513536
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.32450331125827814,
"acc_stderr": 0.038227469376587525,
"acc_norm": 0.32450331125827814,
"acc_norm_stderr": 0.038227469376587525
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.43669724770642204,
"acc_stderr": 0.021264820158714212,
"acc_norm": 0.43669724770642204,
"acc_norm_stderr": 0.021264820158714212
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.3611111111111111,
"acc_stderr": 0.03275773486100999,
"acc_norm": 0.3611111111111111,
"acc_norm_stderr": 0.03275773486100999
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.4068627450980392,
"acc_stderr": 0.03447891136353382,
"acc_norm": 0.4068627450980392,
"acc_norm_stderr": 0.03447891136353382
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.4092827004219409,
"acc_stderr": 0.032007041833595914,
"acc_norm": 0.4092827004219409,
"acc_norm_stderr": 0.032007041833595914
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.47533632286995514,
"acc_stderr": 0.03351695167652628,
"acc_norm": 0.47533632286995514,
"acc_norm_stderr": 0.03351695167652628
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.37404580152671757,
"acc_stderr": 0.04243869242230524,
"acc_norm": 0.37404580152671757,
"acc_norm_stderr": 0.04243869242230524
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.4793388429752066,
"acc_stderr": 0.04560456086387235,
"acc_norm": 0.4793388429752066,
"acc_norm_stderr": 0.04560456086387235
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.4722222222222222,
"acc_stderr": 0.04826217294139894,
"acc_norm": 0.4722222222222222,
"acc_norm_stderr": 0.04826217294139894
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.3496932515337423,
"acc_stderr": 0.03746668325470021,
"acc_norm": 0.3496932515337423,
"acc_norm_stderr": 0.03746668325470021
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.35714285714285715,
"acc_stderr": 0.04547960999764376,
"acc_norm": 0.35714285714285715,
"acc_norm_stderr": 0.04547960999764376
},
"harness|hendrycksTest-management|5": {
"acc": 0.3786407766990291,
"acc_stderr": 0.04802694698258974,
"acc_norm": 0.3786407766990291,
"acc_norm_stderr": 0.04802694698258974
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.5170940170940171,
"acc_stderr": 0.032736940493481824,
"acc_norm": 0.5170940170940171,
"acc_norm_stderr": 0.032736940493481824
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.42,
"acc_stderr": 0.04960449637488584,
"acc_norm": 0.42,
"acc_norm_stderr": 0.04960449637488584
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.4661558109833972,
"acc_stderr": 0.017838956009136805,
"acc_norm": 0.4661558109833972,
"acc_norm_stderr": 0.017838956009136805
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.37283236994219654,
"acc_stderr": 0.026033890613576277,
"acc_norm": 0.37283236994219654,
"acc_norm_stderr": 0.026033890613576277
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.2737430167597765,
"acc_stderr": 0.014912413096372428,
"acc_norm": 0.2737430167597765,
"acc_norm_stderr": 0.014912413096372428
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.4150326797385621,
"acc_stderr": 0.028213504177824093,
"acc_norm": 0.4150326797385621,
"acc_norm_stderr": 0.028213504177824093
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.4694533762057878,
"acc_stderr": 0.02834504586484068,
"acc_norm": 0.4694533762057878,
"acc_norm_stderr": 0.02834504586484068
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.404320987654321,
"acc_stderr": 0.02730662529732768,
"acc_norm": 0.404320987654321,
"acc_norm_stderr": 0.02730662529732768
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.3723404255319149,
"acc_stderr": 0.028838921471251458,
"acc_norm": 0.3723404255319149,
"acc_norm_stderr": 0.028838921471251458
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.28748370273794005,
"acc_stderr": 0.011559337355708512,
"acc_norm": 0.28748370273794005,
"acc_norm_stderr": 0.011559337355708512
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.3161764705882353,
"acc_stderr": 0.02824568739146292,
"acc_norm": 0.3161764705882353,
"acc_norm_stderr": 0.02824568739146292
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.35947712418300654,
"acc_stderr": 0.01941253924203216,
"acc_norm": 0.35947712418300654,
"acc_norm_stderr": 0.01941253924203216
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.41818181818181815,
"acc_stderr": 0.0472457740573157,
"acc_norm": 0.41818181818181815,
"acc_norm_stderr": 0.0472457740573157
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.46122448979591835,
"acc_stderr": 0.03191282052669278,
"acc_norm": 0.46122448979591835,
"acc_norm_stderr": 0.03191282052669278
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.5024875621890548,
"acc_stderr": 0.03535490150137289,
"acc_norm": 0.5024875621890548,
"acc_norm_stderr": 0.03535490150137289
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.53,
"acc_stderr": 0.05016135580465919,
"acc_norm": 0.53,
"acc_norm_stderr": 0.05016135580465919
},
"harness|hendrycksTest-virology|5": {
"acc": 0.3313253012048193,
"acc_stderr": 0.03664314777288085,
"acc_norm": 0.3313253012048193,
"acc_norm_stderr": 0.03664314777288085
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.4678362573099415,
"acc_stderr": 0.038268824176603676,
"acc_norm": 0.4678362573099415,
"acc_norm_stderr": 0.038268824176603676
},
"harness|truthfulqa:mc|0": {
"mc1": 0.23133414932680538,
"mc1_stderr": 0.01476194517486267,
"mc2": 0.37017660840801425,
"mc2_stderr": 0.013722897185973262
},
"harness|winogrande|5": {
"acc": 0.6566692975532754,
"acc_stderr": 0.013344823185358004
},
"harness|gsm8k|5": {
"acc": 0.08794541319181198,
"acc_stderr": 0.007801162197487713
}
}
```
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
autoevaluate/autoeval-eval-mathemakitten__winobias_antistereotype_test_cot_v2-math-db74ac-2016866702 | ---
type: predictions
tags:
- autotrain
- evaluation
datasets:
- mathemakitten/winobias_antistereotype_test_cot_v2
eval_info:
task: text_zero_shot_classification
model: inverse-scaling/opt-30b_eval
metrics: []
dataset_name: mathemakitten/winobias_antistereotype_test_cot_v2
dataset_config: mathemakitten--winobias_antistereotype_test_cot_v2
dataset_split: test
col_mapping:
text: text
classes: classes
target: target
---
# Dataset Card for AutoTrain Evaluator
This repository contains model predictions generated by [AutoTrain](https://huggingface.co/autotrain) for the following task and dataset:
* Task: Zero-Shot Text Classification
* Model: inverse-scaling/opt-30b_eval
* Dataset: mathemakitten/winobias_antistereotype_test_cot_v2
* Config: mathemakitten--winobias_antistereotype_test_cot_v2
* Split: test
To run new evaluation jobs, visit Hugging Face's [automatic model evaluator](https://huggingface.co/spaces/autoevaluate/model-evaluator).
## Contributions
Thanks to [@mathemakitten](https://huggingface.co/mathemakitten) for evaluating this model. |
wwydmanski/biodataome | ---
license: afl-3.0
task_categories:
- tabular-classification
pretty_name: BioDataome
size_categories:
- n<1k
- 1K<n<10K
tags:
- biology
---
# BioDataome
This is an aggregate dataset which allows you to download any and all data from the [BioDataome project](http://dataome.mensxmachina.org/).
## What is BioDataome?
BioDataome is a collection of uniformly preprocessed and automatically annotated datasets for data-driven biology. The processed data can be accessed in .csv format via the BioDataome website, or programmatically via the BioDataome package on GitHub. The BioDataome package contains all the functions used to download, preprocess, and annotate gene expression and methylation microarray data from Gene Expression Omnibus, as well as RNASeq data from recount.
## Usage
```python
import datasets
import pandas as pd  # needed for pd.DataFrame.from_records below

ds = datasets.load_dataset("wwydmanski/biodataome", "GSE24849")['train']
split_ds = ds.train_test_split(test_size=0.1)
train_ds, test_ds = split_ds['train'], split_ds['test']

# there is probably a better way to do this, but this seems to work the fastest
y_train = train_ds.to_pandas()['metadata'].apply(lambda x: x['class'])
X_train = pd.DataFrame.from_records(train_ds.to_pandas()['data'])
y_test = test_ds.to_pandas()['metadata'].apply(lambda x: x['class'])
X_test = pd.DataFrame.from_records(test_ds.to_pandas()['data'])
```
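Since the card lists tabular classification as the intended task, here is a minimal follow-up sketch of fitting a baseline classifier, assuming scikit-learn is installed and that `X_train`, `y_train`, `X_test`, and `y_test` from the snippet above are in scope:
```python
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import accuracy_score

# Fit a simple baseline on the extracted feature matrix.
clf = RandomForestClassifier(n_estimators=100, random_state=0)
clf.fit(X_train, y_train)

# Evaluate on the held-out 10% split.
print("test accuracy:", accuracy_score(y_test, clf.predict(X_test)))
```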
Please refer to the [original metadata](http://dataome.mensxmachina.org/) for the list of available datasets.
## Disclaimer
BioDataome and its content are provided as is, without any warranty of any kind that BioDataome or any documents available from this server will be error-free. In no event will its members be liable for any damages arising out of, resulting from, or in any way connected with the use of BioDataome or documents available from this server.
BioDataome is restricted to research and educational use. The information you may retrieve and recover from BioDataome is not designed to diagnose, prevent, or treat any condition or disease.
Part of research that led to the development of BioDataome has received funding from the European Research Council under the European Union's Seventh Framework Programme (FP/2007-2013) / ERC Grant Agreement n. 617393.
Part of the analyses results and the implementation of the web interface were funded by the “ELIXIR-GR: Managing and Analysing Life Sciences Data (MIS: 5002780)” Project, co-financed by Greece and the European Union - European Regional Development Fund. |
OdiaGenAI/roleplay_hindi | ---
task_categories:
- question-answering
- conversational
language:
- hi
tags:
- code
- art
- finance
- architecture
- books
- astronomy
- acting
- accounting
size_categories:
- 1K<n<10K
---
This dataset was created using camel-ai by passing various combinations of user and assistant roles. It was then translated to Hindi using the OdiaGenAI English=>Indic translation app.
nohansantos/Nohanvoice | ---
license: openrail
---
|
mask-distilled-one-sec-cv12/chunk_100 | ---
dataset_info:
features:
- name: logits
sequence: float32
- name: mfcc
sequence:
sequence: float64
splits:
- name: train
num_bytes: 1289981820
num_examples: 253335
download_size: 1317632489
dataset_size: 1289981820
---
# Dataset Card for "chunk_100"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
lowres/Mikumo-Guynemer | ---
license: mit
task_categories:
- image-to-image
tags:
- art
---
# RAW DATASET OF MIKUMO GUYNEMER FROM MACROSS DELTA
## TODO: Parse data into a reasonable format (file name extension, dimension, index, etc.)
Kamyar-zeinalipour/Turkish_CW_V4 | ---
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 46327837
num_examples: 182395
- name: test
num_bytes: 1267576
num_examples: 5000
download_size: 11168267
dataset_size: 47595413
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
---
|
fmattera/lack-center-table | ---
license: openrail
---
|
giganticode/java-cmpx-v1 | ---
language:
- java
license:
- mit
multilinguality:
- monolingual
pretty_name:
- java-cmpx
size_categories:
- unknown
source_datasets: []
task_categories:
- text-classification
task_ids:
- multi-class-classification
--- |
AlderleyAI/squad_chat | ---
license: cc-by-sa-4.0
task_categories:
- question-answering
size_categories:
- 100K<n<1M
language:
- en
---
# Dataset Card for Squad_Chat
## Dataset Description
A dataset for training LLMs on in-context or document question-answering.
- Point of Contact: info@alderley.ai
### Dataset Summary
This dataset is an amended version of the SQuAD2.0 dataset, with the question responses rewritten to be more conversational in nature.
The SQuAD2.0 dataset combines the original set of 100,000 questions from SQuAD1.1 with an additional 50,000 unanswerable questions, crafted intentionally by crowdworkers to mimic the format and appearance of the answerable ones. This approach requires systems to not only deliver accurate answers when they exist, but also determine when the provided paragraph does not support an answer and accordingly refrain from responding.
Squad_Chat is unique in that the question responses are in a more conversational chat format. The objective of this transformation is to support the fine-tuning of large language models so that they perform well specifically on in-context question-answer tasks.
### Supported Tasks and Leaderboards
Supported Tasks: In-Context Question-Answering, Document Question-Answering
## Dataset Structure
We provide both csv and jsonl files.
Questions that are unanswerable are NaNs in the csv, and have the string `"<no answer>"` in the jsonl; a filtering sketch follows the field list below.
### Data Fields
The csv dataset has the following attributes (all strings):

- `id`: Matches the original squad id
- `title`: Matches the original squad title
- `context`: Matches the original squad context
- `question`: Matches the original question
- `answer`: Conversational answer to question (evolution of original squad answer)

The jsonl file only has:

- `context`: Matches the original squad context
- `question`: Matches the original question
- `answer`: Conversational answer to question (evolution of original squad answer)
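For example, to keep only the answerable questions from the jsonl file, you could filter on the `"<no answer>"` sentinel; a minimal sketch, assuming the file is named `squad_chat.jsonl` (the card does not specify file names):
```python
import json

answerable = []
with open("squad_chat.jsonl") as f:  # hypothetical file name
    for line in f:
        record = json.loads(line)
        # Unanswerable questions carry the "<no answer>" sentinel answer.
        if record["answer"] != "<no answer>":
            answerable.append(record)

print(f"{len(answerable)} answerable examples kept")
```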
### Data Splits
None; the data is presented as a single file.
Note that the original squad dataset is split into Train and Validation sets. In Squad_Chat, the train and validation data are combined into a single file.
## Dataset Creation
26th June 2023
### Curation Rationale
This dataset exists specifically to support the training of large language models for in-context question-answering or document question-answering. Small instruct- and chat-trained LLMs struggle with this task and tend to ignore the provided context when generating an output. This dataset is designed to support the training of small LLMs that excel at this task.
### Source Data
Squad2
https://huggingface.co/datasets/squad_v2
#### Initial Data Collection and Normalization
This new answer dataset was generated from the original squad_v2 dataset over several days by querying gpt-3.5-turbo with the following prompt...
```
system_intel = """For each of the 60 input data items, rewrite the given answer in a more conversational tone. Do not add additional information, just rephrase it. The input data items consist of an ID, a question, and an answer. Your task is to return a valid JSON object for each item, containing the original ID and your rephrased answer. Remember, the keys "id" and "answer" in your JSON object should be in double quotes (""). If any quotations appear in the answer, use single quotes ('').
For instance:
- If the input is: [0, 'In what country is Normandy located?', ' France'], the output should be: {"id" : 0, "answer": "Normandy is located in the country of France."}.
- If the input is: [2265, 'What is the Rankine cycle sometimes called?', 'a practical Carnot cycle'], the output should be: {"id" :2265, "answer": "The Rankine cycle is also sometimes known as a practical Carnot cycle."}.
- if the input is: [9524 \'What campaign did the Scottish National Party (SNP) run?\'\n \'The SNP ran with the campaign "Its Scotlands Oil".\'], the output should be : {"id": 9524, "answer": "The Scottish National Party (SNP) ran with the campaign \'Its Scotlands Oil\'."}
"""
prompt = f"Here is the list of 60 data items: {item_list}"
```
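The card does not show the surrounding API call; below is a minimal sketch of how such a batched query could have been issued with the pre-1.0 `openai` Python library (the `rephrase_batch` helper, the batching of `item_list`, and the line-by-line JSON parsing are illustrative assumptions, not details from the card):
```python
import json
import openai  # openai<1.0, matching the mid-2023 ChatCompletion API

def rephrase_batch(item_list):
    # system_intel is the system prompt defined in the block above
    prompt = f"Here is the list of 60 data items: {item_list}"
    response = openai.ChatCompletion.create(
        model="gpt-3.5-turbo",
        messages=[
            {"role": "system", "content": system_intel},
            {"role": "user", "content": prompt},
        ],
    )
    # The prompt asks the model to return one JSON object per input item.
    text = response["choices"][0]["message"]["content"]
    return [json.loads(line) for line in text.splitlines() if line.strip()]
```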
## Considerations for Using the Data
### Discussion of Biases
The data is only in English.
There is a 2/3 : 1/3 split of answered to unanswered questions.
### Contributions
Alderley.ai |
bartoszmaj/nouns_three | ---
dataset_info:
features:
- name: nouns
sequence: string
splits:
- name: train
num_bytes: 234004224
num_examples: 1000000
download_size: 67575385
dataset_size: 234004224
---
# Dataset Card for "nouns_three"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
open-llm-leaderboard/details_dotvignesh__perry-7b | ---
pretty_name: Evaluation run of dotvignesh/perry-7b
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [dotvignesh/perry-7b](https://huggingface.co/dotvignesh/perry-7b) on the [Open\
\ LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 64 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 2 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_dotvignesh__perry-7b\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-10-23T10:51:37.935635](https://huggingface.co/datasets/open-llm-leaderboard/details_dotvignesh__perry-7b/blob/main/results_2023-10-23T10-51-37.935635.json)(note\
\ that there might be results for other tasks in the repo if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.0008389261744966443,\n\
\ \"em_stderr\": 0.0002964962989801269,\n \"f1\": 0.05790478187919471,\n\
\ \"f1_stderr\": 0.0013248182101283533,\n \"acc\": 0.41422192675444136,\n\
\ \"acc_stderr\": 0.0104604764963125\n },\n \"harness|drop|3\": {\n\
\ \"em\": 0.0008389261744966443,\n \"em_stderr\": 0.0002964962989801269,\n\
\ \"f1\": 0.05790478187919471,\n \"f1_stderr\": 0.0013248182101283533\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.10310841546626232,\n \
\ \"acc_stderr\": 0.008376436987507811\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7253354380426204,\n \"acc_stderr\": 0.012544516005117188\n\
\ }\n}\n```"
repo_url: https://huggingface.co/dotvignesh/perry-7b
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_10_04T00_15_19.939384
path:
- '**/details_harness|arc:challenge|25_2023-10-04T00-15-19.939384.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-10-04T00-15-19.939384.parquet'
- config_name: harness_drop_3
data_files:
- split: 2023_10_23T10_51_37.935635
path:
- '**/details_harness|drop|3_2023-10-23T10-51-37.935635.parquet'
- split: latest
path:
- '**/details_harness|drop|3_2023-10-23T10-51-37.935635.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_10_23T10_51_37.935635
path:
- '**/details_harness|gsm8k|5_2023-10-23T10-51-37.935635.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-10-23T10-51-37.935635.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_10_04T00_15_19.939384
path:
- '**/details_harness|hellaswag|10_2023-10-04T00-15-19.939384.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-10-04T00-15-19.939384.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_10_04T00_15_19.939384
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-04T00-15-19.939384.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-04T00-15-19.939384.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-04T00-15-19.939384.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-04T00-15-19.939384.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-04T00-15-19.939384.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-04T00-15-19.939384.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-04T00-15-19.939384.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-04T00-15-19.939384.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-04T00-15-19.939384.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-04T00-15-19.939384.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-04T00-15-19.939384.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-04T00-15-19.939384.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-04T00-15-19.939384.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-04T00-15-19.939384.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-04T00-15-19.939384.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-04T00-15-19.939384.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-04T00-15-19.939384.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-04T00-15-19.939384.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-04T00-15-19.939384.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-04T00-15-19.939384.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-04T00-15-19.939384.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-04T00-15-19.939384.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-04T00-15-19.939384.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-04T00-15-19.939384.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-04T00-15-19.939384.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-04T00-15-19.939384.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-04T00-15-19.939384.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-04T00-15-19.939384.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-04T00-15-19.939384.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-04T00-15-19.939384.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-04T00-15-19.939384.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-04T00-15-19.939384.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-04T00-15-19.939384.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-04T00-15-19.939384.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-10-04T00-15-19.939384.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-04T00-15-19.939384.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-04T00-15-19.939384.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-04T00-15-19.939384.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-10-04T00-15-19.939384.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-10-04T00-15-19.939384.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-04T00-15-19.939384.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-04T00-15-19.939384.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-04T00-15-19.939384.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-04T00-15-19.939384.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-04T00-15-19.939384.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-04T00-15-19.939384.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-04T00-15-19.939384.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-04T00-15-19.939384.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-04T00-15-19.939384.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-04T00-15-19.939384.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-04T00-15-19.939384.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-04T00-15-19.939384.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-04T00-15-19.939384.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-10-04T00-15-19.939384.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-04T00-15-19.939384.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-10-04T00-15-19.939384.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-04T00-15-19.939384.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-04T00-15-19.939384.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-04T00-15-19.939384.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-04T00-15-19.939384.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-04T00-15-19.939384.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-04T00-15-19.939384.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-04T00-15-19.939384.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-04T00-15-19.939384.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-04T00-15-19.939384.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-04T00-15-19.939384.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-04T00-15-19.939384.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-04T00-15-19.939384.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-04T00-15-19.939384.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-04T00-15-19.939384.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-04T00-15-19.939384.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-04T00-15-19.939384.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-04T00-15-19.939384.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-04T00-15-19.939384.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-04T00-15-19.939384.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-04T00-15-19.939384.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-04T00-15-19.939384.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-04T00-15-19.939384.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-04T00-15-19.939384.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-04T00-15-19.939384.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-04T00-15-19.939384.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-04T00-15-19.939384.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-04T00-15-19.939384.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-04T00-15-19.939384.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-04T00-15-19.939384.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-04T00-15-19.939384.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-04T00-15-19.939384.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-04T00-15-19.939384.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-04T00-15-19.939384.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-04T00-15-19.939384.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-04T00-15-19.939384.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-10-04T00-15-19.939384.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-04T00-15-19.939384.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-04T00-15-19.939384.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-04T00-15-19.939384.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-10-04T00-15-19.939384.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-10-04T00-15-19.939384.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-04T00-15-19.939384.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-04T00-15-19.939384.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-04T00-15-19.939384.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-04T00-15-19.939384.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-04T00-15-19.939384.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-04T00-15-19.939384.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-04T00-15-19.939384.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-04T00-15-19.939384.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-04T00-15-19.939384.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-04T00-15-19.939384.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-04T00-15-19.939384.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-04T00-15-19.939384.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-04T00-15-19.939384.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-10-04T00-15-19.939384.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-04T00-15-19.939384.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-10-04T00-15-19.939384.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-04T00-15-19.939384.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_10_04T00_15_19.939384
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-04T00-15-19.939384.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-04T00-15-19.939384.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_10_04T00_15_19.939384
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-04T00-15-19.939384.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-04T00-15-19.939384.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_10_04T00_15_19.939384
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-04T00-15-19.939384.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-04T00-15-19.939384.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_10_04T00_15_19.939384
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-04T00-15-19.939384.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-04T00-15-19.939384.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_10_04T00_15_19.939384
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-04T00-15-19.939384.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-04T00-15-19.939384.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_10_04T00_15_19.939384
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-04T00-15-19.939384.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-04T00-15-19.939384.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_10_04T00_15_19.939384
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-04T00-15-19.939384.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-04T00-15-19.939384.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_10_04T00_15_19.939384
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-04T00-15-19.939384.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-04T00-15-19.939384.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_10_04T00_15_19.939384
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-04T00-15-19.939384.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-04T00-15-19.939384.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_10_04T00_15_19.939384
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-04T00-15-19.939384.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-04T00-15-19.939384.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_10_04T00_15_19.939384
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-04T00-15-19.939384.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-04T00-15-19.939384.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_10_04T00_15_19.939384
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-04T00-15-19.939384.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-04T00-15-19.939384.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_10_04T00_15_19.939384
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-04T00-15-19.939384.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-04T00-15-19.939384.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_10_04T00_15_19.939384
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-04T00-15-19.939384.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-04T00-15-19.939384.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_10_04T00_15_19.939384
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-04T00-15-19.939384.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-04T00-15-19.939384.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_10_04T00_15_19.939384
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-04T00-15-19.939384.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-04T00-15-19.939384.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_10_04T00_15_19.939384
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-04T00-15-19.939384.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-04T00-15-19.939384.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_10_04T00_15_19.939384
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-04T00-15-19.939384.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-04T00-15-19.939384.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_10_04T00_15_19.939384
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-04T00-15-19.939384.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-04T00-15-19.939384.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_10_04T00_15_19.939384
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-04T00-15-19.939384.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-04T00-15-19.939384.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_10_04T00_15_19.939384
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-04T00-15-19.939384.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-04T00-15-19.939384.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_10_04T00_15_19.939384
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-04T00-15-19.939384.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-04T00-15-19.939384.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_10_04T00_15_19.939384
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-04T00-15-19.939384.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-04T00-15-19.939384.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_10_04T00_15_19.939384
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-04T00-15-19.939384.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-04T00-15-19.939384.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_10_04T00_15_19.939384
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-04T00-15-19.939384.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-04T00-15-19.939384.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_10_04T00_15_19.939384
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-04T00-15-19.939384.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-04T00-15-19.939384.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_10_04T00_15_19.939384
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-04T00-15-19.939384.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-04T00-15-19.939384.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_10_04T00_15_19.939384
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-04T00-15-19.939384.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-04T00-15-19.939384.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_10_04T00_15_19.939384
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-04T00-15-19.939384.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-04T00-15-19.939384.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_10_04T00_15_19.939384
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-04T00-15-19.939384.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-04T00-15-19.939384.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_10_04T00_15_19.939384
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-04T00-15-19.939384.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-04T00-15-19.939384.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_10_04T00_15_19.939384
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-04T00-15-19.939384.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-04T00-15-19.939384.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_10_04T00_15_19.939384
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-04T00-15-19.939384.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-04T00-15-19.939384.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_10_04T00_15_19.939384
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-04T00-15-19.939384.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-04T00-15-19.939384.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_10_04T00_15_19.939384
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-10-04T00-15-19.939384.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-10-04T00-15-19.939384.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_10_04T00_15_19.939384
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-04T00-15-19.939384.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-04T00-15-19.939384.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_10_04T00_15_19.939384
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-04T00-15-19.939384.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-04T00-15-19.939384.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_10_04T00_15_19.939384
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-04T00-15-19.939384.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-04T00-15-19.939384.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_10_04T00_15_19.939384
path:
- '**/details_harness|hendrycksTest-management|5_2023-10-04T00-15-19.939384.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-10-04T00-15-19.939384.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_10_04T00_15_19.939384
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-10-04T00-15-19.939384.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-10-04T00-15-19.939384.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_10_04T00_15_19.939384
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-04T00-15-19.939384.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-04T00-15-19.939384.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_10_04T00_15_19.939384
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-04T00-15-19.939384.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-04T00-15-19.939384.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_10_04T00_15_19.939384
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-04T00-15-19.939384.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-04T00-15-19.939384.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_10_04T00_15_19.939384
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-04T00-15-19.939384.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-04T00-15-19.939384.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_10_04T00_15_19.939384
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-04T00-15-19.939384.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-04T00-15-19.939384.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_10_04T00_15_19.939384
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-04T00-15-19.939384.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-04T00-15-19.939384.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_10_04T00_15_19.939384
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-04T00-15-19.939384.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-04T00-15-19.939384.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_10_04T00_15_19.939384
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-04T00-15-19.939384.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-04T00-15-19.939384.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_10_04T00_15_19.939384
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-04T00-15-19.939384.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-04T00-15-19.939384.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_10_04T00_15_19.939384
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-04T00-15-19.939384.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-04T00-15-19.939384.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_10_04T00_15_19.939384
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-04T00-15-19.939384.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-04T00-15-19.939384.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_10_04T00_15_19.939384
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-04T00-15-19.939384.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-04T00-15-19.939384.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_10_04T00_15_19.939384
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-04T00-15-19.939384.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-04T00-15-19.939384.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_10_04T00_15_19.939384
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-10-04T00-15-19.939384.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-10-04T00-15-19.939384.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_10_04T00_15_19.939384
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-04T00-15-19.939384.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-04T00-15-19.939384.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_10_04T00_15_19.939384
path:
- '**/details_harness|hendrycksTest-virology|5_2023-10-04T00-15-19.939384.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-10-04T00-15-19.939384.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_10_04T00_15_19.939384
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-04T00-15-19.939384.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-04T00-15-19.939384.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_10_04T00_15_19.939384
path:
- '**/details_harness|truthfulqa:mc|0_2023-10-04T00-15-19.939384.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-10-04T00-15-19.939384.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_10_23T10_51_37.935635
path:
- '**/details_harness|winogrande|5_2023-10-23T10-51-37.935635.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-10-23T10-51-37.935635.parquet'
- config_name: results
data_files:
- split: 2023_10_04T00_15_19.939384
path:
- results_2023-10-04T00-15-19.939384.parquet
- split: 2023_10_23T10_51_37.935635
path:
- results_2023-10-23T10-51-37.935635.parquet
- split: latest
path:
- results_2023-10-23T10-51-37.935635.parquet
---
# Dataset Card for Evaluation run of dotvignesh/perry-7b
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/dotvignesh/perry-7b
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [dotvignesh/perry-7b](https://huggingface.co/dotvignesh/perry-7b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration, "results", stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can, for instance, do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_dotvignesh__perry-7b",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2023-10-23T10:51:37.935635](https://huggingface.co/datasets/open-llm-leaderboard/details_dotvignesh__perry-7b/blob/main/results_2023-10-23T10-51-37.935635.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each one in the results and the "latest" split for each eval):
```python
{
"all": {
"em": 0.0008389261744966443,
"em_stderr": 0.0002964962989801269,
"f1": 0.05790478187919471,
"f1_stderr": 0.0013248182101283533,
"acc": 0.41422192675444136,
"acc_stderr": 0.0104604764963125
},
"harness|drop|3": {
"em": 0.0008389261744966443,
"em_stderr": 0.0002964962989801269,
"f1": 0.05790478187919471,
"f1_stderr": 0.0013248182101283533
},
"harness|gsm8k|5": {
"acc": 0.10310841546626232,
"acc_stderr": 0.008376436987507811
},
"harness|winogrande|5": {
"acc": 0.7253354380426204,
"acc_stderr": 0.012544516005117188
}
}
```
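The aggregated scores above live in the "results" configuration. Here is a minimal sketch for loading them directly (the config name and the "latest" split are taken from the YAML metadata above):
```python
from datasets import load_dataset

# Load the aggregated results; the "latest" split points to the most recent run.
results = load_dataset("open-llm-leaderboard/details_dotvignesh__perry-7b",
                       "results",
                       split="latest")
```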
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
apollo-research/sae-monology-pile-uncopyrighted-tokenizer-gpt2 | ---
dataset_info:
features:
- name: input_ids
sequence: int32
splits:
- name: train
num_bytes: 39285716200.0
num_examples: 9581882
download_size: 16728794109
dataset_size: 39285716200.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
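A minimal sketch for loading the pre-tokenized sequences described in the metadata above (the "train" split and the `input_ids` feature come from the YAML; streaming is used here only as an assumption-free way to avoid the ~16 GB download):
```python
from datasets import load_dataset

# Stream the tokenized corpus; each example carries a sequence of GPT-2 token ids.
ds = load_dataset("apollo-research/sae-monology-pile-uncopyrighted-tokenizer-gpt2",
                  split="train", streaming=True)
first = next(iter(ds))
print(len(first["input_ids"]))  # length of the first token sequence
```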
|
eswardivi/telugu_dataset | ---
dataset_info:
- config_name: telugu_asr
features:
- name: sentence
dtype: string
splits:
- name: train
num_bytes: 47887486
num_examples: 209270
download_size: 20219871
dataset_size: 47887486
- config_name: telugu_nlp
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 387671180
num_examples: 47415
download_size: 150012515
dataset_size: 387671180
- config_name: wikipedia
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 710613522
num_examples: 87854
download_size: 209754217
dataset_size: 710613522
configs:
- config_name: telugu_asr
data_files:
- split: train
path: telugu_asr/train-*
- config_name: telugu_nlp
data_files:
- split: train
path: telugu_nlp/train-*
- config_name: wikipedia
data_files:
- split: train
path: wikipedia/train-*
---
# Dataset
This repository contains the final dataset created from several resources. The primary source datasets are:
- [Telugu NLP Dataset from Kaggle](https://www.kaggle.com/datasets/sudalairajkumar/telugu-nlp)
- [Telugu ASR Corpus from HuggingFace](https://huggingface.co/datasets/parambharat/telugu_asr_corpus)
- [Wikipedia Telugu Dataset from Wikimedia on HuggingFace](https://huggingface.co/datasets/wikimedia/wikipedia)
These datasets have been combined to form a comprehensive resource for Telugu Natural Language Processing (NLP) tasks.
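A minimal sketch for loading one of the three configurations (the config names are taken from the YAML metadata above; "telugu_nlp" is used purely as an example):
```python
from datasets import load_dataset

# Each configuration exposes a single "train" split.
telugu_nlp = load_dataset("eswardivi/telugu_dataset", "telugu_nlp", split="train")
print(telugu_nlp[0]["text"])  # the "text" field, per the dataset's features
```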
|